Dec 6 01:46:54 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Dec 6 01:46:54 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 6 01:46:54 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 6 01:46:54 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 6 01:46:54 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 6 01:46:54 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 6 01:46:54 localhost kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Dec 6 01:46:54 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Dec 6 01:46:54 localhost kernel: signal: max sigframe size: 1776
Dec 6 01:46:54 localhost kernel: BIOS-provided physical RAM map:
Dec 6 01:46:54 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 6 01:46:54 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 6 01:46:54 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 6 01:46:54 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 6 01:46:54 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 6 01:46:54 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 6 01:46:54 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 6 01:46:54 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Dec 6 01:46:54 localhost kernel: NX (Execute Disable) protection: active
Dec 6 01:46:54 localhost kernel: SMBIOS 2.8 present.
Dec 6 01:46:54 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 6 01:46:54 localhost kernel: Hypervisor detected: KVM
Dec 6 01:46:54 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 6 01:46:54 localhost kernel: kvm-clock: using sched offset of 1913577242 cycles
Dec 6 01:46:54 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 6 01:46:54 localhost kernel: tsc: Detected 2799.998 MHz processor
Dec 6 01:46:54 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Dec 6 01:46:54 localhost kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Dec 6 01:46:54 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 6 01:46:54 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 6 01:46:54 localhost kernel: Using GB pages for direct mapping
Dec 6 01:46:54 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Dec 6 01:46:54 localhost kernel: ACPI: Early table checksum verification disabled
Dec 6 01:46:54 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 6 01:46:54 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 6 01:46:54 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 6 01:46:54 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 6 01:46:54 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 6 01:46:54 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 6 01:46:54 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 6 01:46:54 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 6 01:46:54 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 6 01:46:54 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 6 01:46:54 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 6 01:46:54 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 6 01:46:54 localhost kernel: No NUMA configuration found
Dec 6 01:46:54 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Dec 6 01:46:54 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Dec 6 01:46:54 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Dec 6 01:46:54 localhost kernel: Zone ranges:
Dec 6 01:46:54 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 6 01:46:54 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 6 01:46:54 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Dec 6 01:46:54 localhost kernel:   Device   empty
Dec 6 01:46:54 localhost kernel: Movable zone start for each node
Dec 6 01:46:54 localhost kernel: Early memory node ranges
Dec 6 01:46:54 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 6 01:46:54 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 6 01:46:54 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Dec 6 01:46:54 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Dec 6 01:46:54 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 6 01:46:54 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 6 01:46:54 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 6 01:46:54 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 6 01:46:54 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 6 01:46:54 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 6 01:46:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 6 01:46:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 6 01:46:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 6 01:46:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 6 01:46:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 6 01:46:54 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 6 01:46:54 localhost kernel: TSC deadline timer available
Dec 6 01:46:54 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Dec 6 01:46:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 6 01:46:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 6 01:46:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 6 01:46:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 6 01:46:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 6 01:46:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 6 01:46:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 6 01:46:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 6 01:46:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 6 01:46:54 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 6 01:46:54 localhost kernel: Booting paravirtualized kernel on KVM
Dec 6 01:46:54 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 6 01:46:54 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 6 01:46:54 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Dec 6 01:46:54 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 6 01:46:54 localhost kernel: Fallback order for Node 0: 0
Dec 6 01:46:54 localhost kernel: Built 1 zonelists, mobility grouping on. Total pages: 4128475
Dec 6 01:46:54 localhost kernel: Policy zone: Normal
Dec 6 01:46:54 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 6 01:46:54 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Dec 6 01:46:54 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 6 01:46:54 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 6 01:46:54 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 6 01:46:54 localhost kernel: software IO TLB: area num 8.
Dec 6 01:46:54 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Dec 6 01:46:54 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Dec 6 01:46:54 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 6 01:46:54 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Dec 6 01:46:54 localhost kernel: ftrace: allocated 176 pages with 3 groups
Dec 6 01:46:54 localhost kernel: Dynamic Preempt: voluntary
Dec 6 01:46:54 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 6 01:46:54 localhost kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 6 01:46:54 localhost kernel: 	Trampoline variant of Tasks RCU enabled.
Dec 6 01:46:54 localhost kernel: 	Rude variant of Tasks RCU enabled.
Dec 6 01:46:54 localhost kernel: 	Tracing variant of Tasks RCU enabled.
Dec 6 01:46:54 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 6 01:46:54 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 6 01:46:54 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 6 01:46:54 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 6 01:46:54 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 6 01:46:54 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Dec 6 01:46:54 localhost kernel: Console: colour VGA+ 80x25
Dec 6 01:46:54 localhost kernel: printk: console [tty0] enabled
Dec 6 01:46:54 localhost kernel: printk: console [ttyS0] enabled
Dec 6 01:46:54 localhost kernel: ACPI: Core revision 20211217
Dec 6 01:46:54 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 6 01:46:54 localhost kernel: x2apic enabled
Dec 6 01:46:54 localhost kernel: Switched APIC routing to physical x2apic.
Dec 6 01:46:54 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 6 01:46:54 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec 6 01:46:54 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 6 01:46:54 localhost kernel: LSM: Security Framework initializing
Dec 6 01:46:54 localhost kernel: Yama: becoming mindful.
Dec 6 01:46:54 localhost kernel: SELinux: Initializing.
Dec 6 01:46:54 localhost kernel: LSM support for eBPF active
Dec 6 01:46:54 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 6 01:46:54 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 6 01:46:54 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 6 01:46:54 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 6 01:46:54 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 6 01:46:54 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 6 01:46:54 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 6 01:46:54 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Dec 6 01:46:54 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Dec 6 01:46:54 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 6 01:46:54 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 6 01:46:54 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 6 01:46:54 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 6 01:46:54 localhost kernel: Freeing SMP alternatives memory: 36K
Dec 6 01:46:54 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 6 01:46:54 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Dec 6 01:46:54 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 6 01:46:54 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 6 01:46:54 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 6 01:46:54 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 6 01:46:54 localhost kernel: ... version:                0
Dec 6 01:46:54 localhost kernel: ... bit width:              48
Dec 6 01:46:54 localhost kernel: ... generic registers:      6
Dec 6 01:46:54 localhost kernel: ... value mask:             0000ffffffffffff
Dec 6 01:46:54 localhost kernel: ... max period:             00007fffffffffff
Dec 6 01:46:54 localhost kernel: ... fixed-purpose events:   0
Dec 6 01:46:54 localhost kernel: ... event mask:             000000000000003f
Dec 6 01:46:54 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 6 01:46:54 localhost kernel: rcu: 	Max phase no-delay instances is 400.
Dec 6 01:46:54 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 6 01:46:54 localhost kernel: x86: Booting SMP configuration:
Dec 6 01:46:54 localhost kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7
Dec 6 01:46:54 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 6 01:46:54 localhost kernel: smpboot: Max logical packages: 8
Dec 6 01:46:54 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec 6 01:46:54 localhost kernel: node 0 deferred pages initialised in 20ms
Dec 6 01:46:54 localhost kernel: devtmpfs: initialized
Dec 6 01:46:54 localhost kernel: x86/mm: Memory block size: 128MB
Dec 6 01:46:54 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 6 01:46:54 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Dec 6 01:46:54 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 6 01:46:54 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 6 01:46:54 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Dec 6 01:46:54 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 6 01:46:54 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 6 01:46:54 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 6 01:46:54 localhost kernel: audit: type=2000 audit(1765003612.906:1): state=initialized audit_enabled=0 res=1
Dec 6 01:46:54 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 6 01:46:54 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 6 01:46:54 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 6 01:46:54 localhost kernel: cpuidle: using governor menu
Dec 6 01:46:54 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Dec 6 01:46:54 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 6 01:46:54 localhost kernel: PCI: Using configuration type 1 for base access
Dec 6 01:46:54 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 6 01:46:54 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 6 01:46:54 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Dec 6 01:46:54 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Dec 6 01:46:54 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Dec 6 01:46:54 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 6 01:46:54 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 6 01:46:54 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 6 01:46:54 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 6 01:46:54 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 6 01:46:54 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Dec 6 01:46:54 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Dec 6 01:46:54 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Dec 6 01:46:54 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 6 01:46:54 localhost kernel: ACPI: Interpreter enabled
Dec 6 01:46:54 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 6 01:46:54 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 6 01:46:54 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 6 01:46:54 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 6 01:46:54 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 6 01:46:54 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 6 01:46:54 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [3] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [4] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [5] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [6] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [7] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [8] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [9] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [10] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [11] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [12] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [13] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [14] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [15] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [16] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [17] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [18] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [19] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [20] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [21] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [22] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [23] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [24] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [25] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [26] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [27] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [28] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [29] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [30] registered
Dec 6 01:46:54 localhost kernel: acpiphp: Slot [31] registered
Dec 6 01:46:54 localhost kernel: PCI host bridge to bus 0000:00
Dec 6 01:46:54 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Dec 6 01:46:54 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Dec 6 01:46:54 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 6 01:46:54 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 6 01:46:54 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Dec 6 01:46:54 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 6 01:46:54 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.1: reg 0x20: [io 0xc140-0xc14f]
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.2: reg 0x20: [io 0xc100-0xc11f]
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Dec 6 01:46:54 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Dec 6 01:46:54 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Dec 6 01:46:54 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 6 01:46:54 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Dec 6 01:46:54 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Dec 6 01:46:54 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 6 01:46:54 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Dec 6 01:46:54 localhost kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Dec 6 01:46:54 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Dec 6 01:46:54 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 6 01:46:54 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Dec 6 01:46:54 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Dec 6 01:46:54 localhost kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Dec 6 01:46:54 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Dec 6 01:46:54 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 6 01:46:54 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Dec 6 01:46:54 localhost kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Dec 6 01:46:54 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 6 01:46:54 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Dec 6 01:46:54 localhost kernel: pci 0000:00:06.0: reg 0x10: [io 0xc120-0xc13f]
Dec 6 01:46:54 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 6 01:46:54 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 6 01:46:54 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 6 01:46:54 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 6 01:46:54 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 6 01:46:54 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 6 01:46:54 localhost kernel: iommu: Default domain type: Translated
Dec 6 01:46:54 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 6 01:46:54 localhost kernel: SCSI subsystem initialized
Dec 6 01:46:54 localhost kernel: ACPI: bus type USB registered
Dec 6 01:46:54 localhost kernel: usbcore: registered new interface driver usbfs
Dec 6 01:46:54 localhost kernel: usbcore: registered new interface driver hub
Dec 6 01:46:54 localhost kernel: usbcore: registered new device driver usb
Dec 6 01:46:54 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 6 01:46:54 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Dec 6 01:46:54 localhost kernel: PTP clock support registered
Dec 6 01:46:54 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 6 01:46:54 localhost kernel: NetLabel: Initializing
Dec 6 01:46:54 localhost kernel: NetLabel: domain hash size = 128
Dec 6 01:46:54 localhost kernel: NetLabel: protocols = UNLABELED CIPSOv4 CALIPSO
Dec 6 01:46:54 localhost kernel: NetLabel: unlabeled traffic allowed by default
Dec 6 01:46:54 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 6 01:46:54 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 6 01:46:54 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 6 01:46:54 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 6 01:46:54 localhost kernel: vgaarb: loaded
Dec 6 01:46:54 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 6 01:46:54 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 6 01:46:54 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 6 01:46:54 localhost kernel: pnp: PnP ACPI init
Dec 6 01:46:54 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 6 01:46:54 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 6 01:46:54 localhost kernel: NET: Registered PF_INET protocol family
Dec 6 01:46:54 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 6 01:46:54 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Dec 6 01:46:54 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 6 01:46:54 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 6 01:46:54 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 6 01:46:54 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Dec 6 01:46:54 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Dec 6 01:46:54 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 6 01:46:54 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 6 01:46:54 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 6 01:46:54 localhost kernel: NET: Registered PF_XDP protocol family
Dec 6 01:46:54 localhost kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Dec 6 01:46:54 localhost kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Dec 6 01:46:54 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 6 01:46:54 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 6 01:46:54 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 6 01:46:54 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 6 01:46:54 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 6 01:46:54 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 29883 usecs
Dec 6 01:46:54 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 6 01:46:54 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 6 01:46:54 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 6 01:46:54 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 6 01:46:54 localhost kernel: ACPI: bus type thunderbolt registered
Dec 6 01:46:54 localhost kernel: Initialise system trusted keyrings
Dec 6 01:46:54 localhost kernel: Key type blacklist registered
Dec 6 01:46:54 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Dec 6 01:46:54 localhost kernel: zbud: loaded
Dec 6 01:46:54 localhost kernel: integrity: Platform Keyring initialized
Dec 6 01:46:54 localhost kernel: NET: Registered PF_ALG protocol family
Dec 6 01:46:54 localhost kernel: xor: automatically using best checksumming function   avx
Dec 6 01:46:54 localhost kernel: Key type asymmetric registered
Dec 6 01:46:54 localhost kernel: Asymmetric key parser 'x509' registered
Dec 6 01:46:54 localhost kernel: Running certificate verification selftests
Dec 6 01:46:54 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 6 01:46:54 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 6 01:46:54 localhost kernel: io scheduler mq-deadline registered
Dec 6 01:46:54 localhost kernel: io scheduler kyber registered
Dec 6 01:46:54 localhost kernel: io scheduler bfq registered
Dec 6 01:46:54 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 6 01:46:54 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 6 01:46:54 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 6 01:46:54 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 6 01:46:54 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 6 01:46:54 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 6 01:46:54 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 6 01:46:54 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 6 01:46:54 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 6 01:46:54 localhost kernel: Non-volatile memory driver v1.3
Dec 6 01:46:54 localhost kernel: rdac: device handler registered
Dec 6 01:46:54 localhost kernel: hp_sw: device handler registered
Dec 6 01:46:54 localhost kernel: emc: device handler registered
Dec 6 01:46:54 localhost kernel: alua: device handler registered
Dec 6 01:46:54 localhost kernel: libphy: Fixed MDIO Bus: probed
Dec 6 01:46:54 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Dec 6 01:46:54 localhost kernel: ehci-pci: EHCI PCI platform driver
Dec 6 01:46:54 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Dec 6 01:46:54 localhost kernel: ohci-pci: OHCI PCI platform driver
Dec 6 01:46:54 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Dec 6 01:46:54 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 6 01:46:54 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 6 01:46:54 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 6 01:46:54 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 6 01:46:54 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 6 01:46:54 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 6 01:46:54 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 6 01:46:54 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Dec 6 01:46:54 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 6 01:46:54 localhost kernel: hub 1-0:1.0: USB hub found
Dec 6 01:46:54 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 6 01:46:54 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 6 01:46:54 localhost kernel: usbserial: USB Serial support registered for generic
Dec 6 01:46:54 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 6 01:46:54 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 6 01:46:54 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 6 01:46:54 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 6 01:46:54 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 6 01:46:54 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 6 01:46:54 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 6 01:46:54 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-06T06:46:53 UTC (1765003613)
Dec 6 01:46:54 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 6 01:46:54 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 6 01:46:54 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 6 01:46:54 localhost kernel: usbcore: registered new interface driver usbhid
Dec 6 01:46:54 localhost kernel: usbhid: USB HID core driver
Dec 6 01:46:54 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 6 01:46:54 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 6 01:46:54 localhost kernel: Initializing XFRM netlink socket
Dec 6 01:46:54 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 6 01:46:54 localhost kernel: Segment Routing with IPv6
Dec 6 01:46:54 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 6 01:46:54 localhost kernel: mpls_gso: MPLS GSO support
Dec 6 01:46:54 localhost kernel: IPI shorthand broadcast: enabled
Dec 6 01:46:54 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 6 01:46:54 localhost kernel: AES CTR mode by8 optimization enabled
Dec 6 01:46:54 localhost kernel: sched_clock: Marking stable (748224207, 177027073)->(1048165864, -122914584)
Dec 6 01:46:54 localhost kernel: registered taskstats version 1
Dec 6 01:46:54 localhost kernel: Loading compiled-in X.509 certificates
Dec 6 01:46:54 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Dec 6 01:46:54 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 6 01:46:54 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 6 01:46:54 localhost kernel: zswap: loaded using pool lzo/zbud
Dec 6 01:46:54 localhost kernel: page_owner is disabled
Dec 6 01:46:54 localhost kernel: Key type big_key registered
Dec 6 01:46:54 localhost kernel: Freeing initrd memory: 74232K
Dec 6 01:46:54 localhost kernel: Key type encrypted registered
Dec 6 01:46:54 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 6 01:46:54 localhost kernel: Loading compiled-in module X.509 certificates
Dec 6 01:46:54 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Dec 6 01:46:54 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 6 01:46:54 localhost kernel: ima: No architecture policies found
Dec 6 01:46:54 localhost kernel: evm: Initialising EVM extended attributes:
Dec 6 01:46:54 localhost kernel: evm: security.selinux
Dec 6 01:46:54 localhost kernel: evm: security.SMACK64 (disabled)
Dec 6 01:46:54 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 6 01:46:54 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 6 01:46:54 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 6 01:46:54 localhost kernel: evm: security.apparmor (disabled)
Dec 6 01:46:54 localhost kernel: evm: security.ima
Dec 6 01:46:54 localhost kernel: evm: security.capability
Dec 6 01:46:54 localhost kernel: evm: HMAC attrs: 0x1
Dec 6 01:46:54 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 6 01:46:54 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 6 01:46:54 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 6 01:46:54 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 6 01:46:54 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 6 01:46:54 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 6 01:46:54 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 6 01:46:54 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 6 01:46:54 localhost kernel: Freeing unused decrypted memory: 2036K
Dec 6 01:46:54 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Dec 6 01:46:54 localhost kernel: Write protecting the kernel read-only data: 26624k
Dec 6 01:46:54 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Dec 6 01:46:54 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Dec 6 01:46:54 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 6 01:46:54 localhost kernel: Run /init as init process
Dec 6 01:46:54 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 6 01:46:54 localhost systemd[1]: Detected virtualization kvm.
Dec 6 01:46:54 localhost systemd[1]: Detected architecture x86-64.
Dec 6 01:46:54 localhost systemd[1]: Running in initrd.
Dec 6 01:46:54 localhost systemd[1]: No hostname configured, using default hostname.
Dec 6 01:46:54 localhost systemd[1]: Hostname set to .
Dec 6 01:46:54 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 6 01:46:54 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 6 01:46:54 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 6 01:46:54 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 6 01:46:54 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 6 01:46:54 localhost systemd[1]: Reached target Local File Systems.
Dec 6 01:46:54 localhost systemd[1]: Reached target Path Units.
Dec 6 01:46:54 localhost systemd[1]: Reached target Slice Units.
Dec 6 01:46:54 localhost systemd[1]: Reached target Swaps.
Dec 6 01:46:54 localhost systemd[1]: Reached target Timer Units.
Dec 6 01:46:54 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 6 01:46:54 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 6 01:46:54 localhost systemd[1]: Listening on Journal Socket.
Dec 6 01:46:54 localhost systemd[1]: Listening on udev Control Socket.
Dec 6 01:46:54 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 6 01:46:54 localhost systemd[1]: Reached target Socket Units.
Dec 6 01:46:54 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 6 01:46:54 localhost systemd[1]: Starting Journal Service...
Dec 6 01:46:54 localhost systemd[1]: Starting Load Kernel Modules...
Dec 6 01:46:54 localhost systemd[1]: Starting Create System Users...
Dec 6 01:46:54 localhost systemd[1]: Starting Setup Virtual Console...
Dec 6 01:46:54 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 6 01:46:54 localhost systemd[1]: Finished Load Kernel Modules.
Dec 6 01:46:54 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 6 01:46:54 localhost systemd-journald[282]: Journal started
Dec 6 01:46:54 localhost systemd-journald[282]: Runtime Journal (/run/log/journal/3134f11da070482e98997eb324eccfc9) is 8.0M, max 314.7M, 306.7M free.
Dec 6 01:46:54 localhost systemd-modules-load[283]: Module 'msr' is built in
Dec 6 01:46:54 localhost systemd[1]: Started Journal Service.
Dec 6 01:46:54 localhost systemd[1]: Finished Setup Virtual Console.
Dec 6 01:46:54 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 6 01:46:54 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 6 01:46:54 localhost systemd[1]: Starting dracut cmdline hook...
Dec 6 01:46:54 localhost systemd-sysusers[284]: Creating group 'sgx' with GID 997.
Dec 6 01:46:54 localhost systemd-sysusers[284]: Creating group 'users' with GID 100.
Dec 6 01:46:54 localhost systemd-sysusers[284]: Creating group 'dbus' with GID 81.
Dec 6 01:46:54 localhost systemd-sysusers[284]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 6 01:46:54 localhost systemd[1]: Finished Create System Users.
Dec 6 01:46:54 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 6 01:46:54 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 6 01:46:54 localhost dracut-cmdline[290]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Dec 6 01:46:54 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 6 01:46:54 localhost dracut-cmdline[290]: Using kernel command line parameters: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 6 01:46:54 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 6 01:46:54 localhost systemd[1]: Finished dracut cmdline hook.
Dec 6 01:46:54 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 6 01:46:54 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 6 01:46:54 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 6 01:46:54 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Dec 6 01:46:54 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 6 01:46:54 localhost kernel: RPC: Registered udp transport module.
Dec 6 01:46:54 localhost kernel: RPC: Registered tcp transport module.
Dec 6 01:46:54 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 6 01:46:54 localhost rpc.statd[406]: Version 2.5.4 starting
Dec 6 01:46:54 localhost rpc.statd[406]: Initializing NSM state
Dec 6 01:46:54 localhost rpc.idmapd[411]: Setting log level to 0
Dec 6 01:46:54 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 6 01:46:54 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 6 01:46:54 localhost systemd-udevd[424]: Using default interface naming scheme 'rhel-9.0'.
Dec 6 01:46:54 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 6 01:46:54 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 6 01:46:54 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 6 01:46:54 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 6 01:46:54 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 6 01:46:54 localhost systemd[1]: Reached target System Initialization.
Dec 6 01:46:54 localhost systemd[1]: Reached target Basic System.
Dec 6 01:46:54 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 6 01:46:54 localhost systemd[1]: Reached target Network.
Dec 6 01:46:54 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 6 01:46:54 localhost systemd[1]: Starting dracut initqueue hook...
Dec 6 01:46:54 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Dec 6 01:46:54 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 6 01:46:54 localhost kernel: GPT:20971519 != 838860799
Dec 6 01:46:54 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 6 01:46:54 localhost kernel: GPT:20971519 != 838860799
Dec 6 01:46:54 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 6 01:46:54 localhost kernel: vda: vda1 vda2 vda3 vda4
Dec 6 01:46:54 localhost systemd-udevd[448]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 01:46:54 localhost kernel: scsi host0: ata_piix
Dec 6 01:46:54 localhost kernel: scsi host1: ata_piix
Dec 6 01:46:54 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Dec 6 01:46:54 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Dec 6 01:46:54 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Dec 6 01:46:55 localhost systemd[1]: Reached target Initrd Root Device.
Dec 6 01:46:55 localhost kernel: ata1: found unknown device (class 0)
Dec 6 01:46:55 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 6 01:46:55 localhost kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Dec 6 01:46:55 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 6 01:46:55 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 6 01:46:55 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 6 01:46:55 localhost systemd[1]: Finished dracut initqueue hook.
Dec 6 01:46:55 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 6 01:46:55 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 6 01:46:55 localhost systemd[1]: Reached target Remote File Systems.
Dec 6 01:46:55 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 6 01:46:55 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 6 01:46:55 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Dec 6 01:46:55 localhost systemd-fsck[513]: /usr/sbin/fsck.xfs: XFS file system.
Dec 6 01:46:55 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Dec 6 01:46:55 localhost systemd[1]: Mounting /sysroot...
Dec 6 01:46:55 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 6 01:46:55 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Dec 6 01:46:55 localhost kernel: XFS (vda4): Ending clean mount
Dec 6 01:46:55 localhost systemd[1]: Mounted /sysroot.
Dec 6 01:46:55 localhost systemd[1]: Reached target Initrd Root File System.
Dec 6 01:46:55 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 6 01:46:55 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 6 01:46:55 localhost systemd[1]: Reached target Initrd File Systems.
Dec 6 01:46:55 localhost systemd[1]: Reached target Initrd Default Target.
Dec 6 01:46:55 localhost systemd[1]: Starting dracut mount hook...
Dec 6 01:46:55 localhost systemd[1]: Finished dracut mount hook.
Dec 6 01:46:55 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 6 01:46:55 localhost rpc.idmapd[411]: exiting on signal 15
Dec 6 01:46:55 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 6 01:46:55 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 6 01:46:55 localhost systemd[1]: Stopped target Network.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Timer Units.
Dec 6 01:46:55 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 6 01:46:55 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Basic System.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Path Units.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Remote File Systems.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Slice Units.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Socket Units.
Dec 6 01:46:55 localhost systemd[1]: Stopped target System Initialization.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Local File Systems.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Swaps.
Dec 6 01:46:55 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped dracut mount hook.
Dec 6 01:46:55 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 6 01:46:55 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 6 01:46:55 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 6 01:46:55 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 6 01:46:55 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 6 01:46:55 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped Load Kernel Modules.
Dec 6 01:46:55 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 6 01:46:55 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 6 01:46:55 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 6 01:46:55 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 6 01:46:55 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 6 01:46:55 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 6 01:46:55 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 6 01:46:55 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Closed udev Control Socket.
Dec 6 01:46:55 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Closed udev Kernel Socket.
Dec 6 01:46:55 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 6 01:46:55 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 6 01:46:55 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 6 01:46:56 localhost systemd[1]: Starting Cleanup udev Database...
Dec 6 01:46:56 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 6 01:46:56 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 6 01:46:56 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 6 01:46:56 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 6 01:46:56 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 6 01:46:56 localhost systemd[1]: Stopped Create System Users.
Dec 6 01:46:56 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 6 01:46:56 localhost systemd[1]: Finished Cleanup udev Database.
Dec 6 01:46:56 localhost systemd[1]: Reached target Switch Root.
Dec 6 01:46:56 localhost systemd[1]: Starting Switch Root...
Dec 6 01:46:56 localhost systemd[1]: Switching root.
Dec 6 01:46:56 localhost systemd-journald[282]: Journal stopped
Dec 6 01:46:56 localhost systemd-journald[282]: Received SIGTERM from PID 1 (systemd).
Dec 6 01:46:56 localhost kernel: audit: type=1404 audit(1765003616.104:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 6 01:46:56 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 6 01:46:56 localhost kernel: SELinux: policy capability open_perms=1
Dec 6 01:46:56 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 6 01:46:56 localhost kernel: SELinux: policy capability always_check_network=0
Dec 6 01:46:56 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 6 01:46:56 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 6 01:46:56 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 6 01:46:56 localhost kernel: audit: type=1403 audit(1765003616.204:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 6 01:46:56 localhost systemd[1]: Successfully loaded SELinux policy in 102.875ms.
Dec 6 01:46:56 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 22.710ms.
Dec 6 01:46:56 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 6 01:46:56 localhost systemd[1]: Detected virtualization kvm.
Dec 6 01:46:56 localhost systemd[1]: Detected architecture x86-64.
Dec 6 01:46:56 localhost systemd-rc-local-generator[583]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 01:46:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 01:46:56 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 6 01:46:56 localhost systemd[1]: Stopped Switch Root.
Dec 6 01:46:56 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 6 01:46:56 localhost systemd[1]: Created slice Slice /system/getty.
Dec 6 01:46:56 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 6 01:46:56 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 6 01:46:56 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 6 01:46:56 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Dec 6 01:46:56 localhost systemd[1]: Created slice User and Session Slice.
Dec 6 01:46:56 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 6 01:46:56 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 6 01:46:56 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 6 01:46:56 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 6 01:46:56 localhost systemd[1]: Stopped target Switch Root.
Dec 6 01:46:56 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 6 01:46:56 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 6 01:46:56 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 6 01:46:56 localhost systemd[1]: Reached target Path Units.
Dec 6 01:46:56 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 6 01:46:56 localhost systemd[1]: Reached target Slice Units.
Dec 6 01:46:56 localhost systemd[1]: Reached target Swaps.
Dec 6 01:46:56 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 6 01:46:56 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 6 01:46:56 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 6 01:46:56 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 6 01:46:56 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 6 01:46:56 localhost systemd[1]: Listening on udev Control Socket.
Dec 6 01:46:56 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 6 01:46:56 localhost systemd[1]: Mounting Huge Pages File System...
Dec 6 01:46:56 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 6 01:46:56 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 6 01:46:56 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 6 01:46:56 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 6 01:46:56 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 6 01:46:56 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 6 01:46:56 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 6 01:46:56 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 6 01:46:56 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 6 01:46:56 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 6 01:46:56 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 6 01:46:56 localhost systemd[1]: Stopped Journal Service.
Dec 6 01:46:56 localhost systemd[1]: Starting Journal Service...
Dec 6 01:46:56 localhost systemd[1]: Starting Load Kernel Modules...
Dec 6 01:46:56 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 6 01:46:56 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 6 01:46:56 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 6 01:46:56 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 6 01:46:56 localhost kernel: fuse: init (API version 7.36)
Dec 6 01:46:56 localhost systemd[1]: Mounted Huge Pages File System.
Dec 6 01:46:56 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 6 01:46:56 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 6 01:46:56 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 6 01:46:56 localhost systemd-journald[619]: Journal started
Dec 6 01:46:56 localhost systemd-journald[619]: Runtime Journal (/run/log/journal/4b30904fc4748c16d0c72dbebcabab49) is 8.0M, max 314.7M, 306.7M free.
Dec 6 01:46:56 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 6 01:46:56 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 6 01:46:56 localhost systemd-modules-load[620]: Module 'msr' is built in
Dec 6 01:46:56 localhost systemd[1]: Started Journal Service.
Dec 6 01:46:56 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 6 01:46:56 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 6 01:46:56 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 6 01:46:56 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 6 01:46:56 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 6 01:46:56 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 6 01:46:56 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 6 01:46:56 localhost systemd[1]: Finished Load Kernel Modules.
Dec 6 01:46:56 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 6 01:46:56 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 6 01:46:56 localhost systemd[1]: Mounting FUSE Control File System...
Dec 6 01:46:56 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 6 01:46:56 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 6 01:46:56 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 6 01:46:56 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 6 01:46:56 localhost kernel: ACPI: bus type drm_connector registered
Dec 6 01:46:56 localhost systemd[1]: Starting Load/Save Random Seed...
Dec 6 01:46:56 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 6 01:46:56 localhost systemd[1]: Starting Create System Users...
Dec 6 01:46:56 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 6 01:46:56 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 6 01:46:56 localhost systemd[1]: Mounted FUSE Control File System.
Dec 6 01:46:56 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 6 01:46:56 localhost systemd-journald[619]: Runtime Journal (/run/log/journal/4b30904fc4748c16d0c72dbebcabab49) is 8.0M, max 314.7M, 306.7M free.
Dec 6 01:46:56 localhost systemd-journald[619]: Received client request to flush runtime journal.
Dec 6 01:46:56 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 6 01:46:56 localhost systemd[1]: Finished Load/Save Random Seed.
Dec 6 01:46:56 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 6 01:46:56 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 6 01:46:56 localhost systemd-sysusers[632]: Creating group 'sgx' with GID 989.
Dec 6 01:46:56 localhost systemd-sysusers[632]: Creating group 'systemd-oom' with GID 988.
Dec 6 01:46:56 localhost systemd-sysusers[632]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Dec 6 01:46:56 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 6 01:46:56 localhost systemd[1]: Finished Create System Users.
Dec 6 01:46:56 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 6 01:46:56 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 6 01:46:56 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 6 01:46:56 localhost systemd[1]: Set up automount EFI System Partition Automount.
Dec 6 01:46:57 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 6 01:46:57 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 6 01:46:57 localhost systemd-udevd[636]: Using default interface naming scheme 'rhel-9.0'.
Dec 6 01:46:57 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 6 01:46:57 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 6 01:46:57 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 6 01:46:57 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 6 01:46:57 localhost systemd-udevd[642]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 01:46:57 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 6 01:46:57 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Dec 6 01:46:57 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Dec 6 01:46:57 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Dec 6 01:46:57 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 6 01:46:57 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 6 01:46:57 localhost systemd-fsck[679]: fsck.fat 4.2 (2021-01-31)
Dec 6 01:46:57 localhost systemd-fsck[679]: /dev/vda2: 12 files, 1782/51145 clusters
Dec 6 01:46:57 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Dec 6 01:46:57 localhost kernel: SVM: TSC scaling supported
Dec 6 01:46:57 localhost kernel: kvm: Nested Virtualization enabled
Dec 6 01:46:57 localhost kernel: SVM: kvm: Nested Paging enabled
Dec 6 01:46:57 localhost kernel: SVM: LBR virtualization supported
Dec 6 01:46:57 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 6 01:46:57 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 6 01:46:57 localhost kernel: Console: switching to colour dummy device 80x25
Dec 6 01:46:57 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 6 01:46:57 localhost kernel: [drm] features: -context_init
Dec 6 01:46:57 localhost kernel: [drm] number of scanouts: 1
Dec 6 01:46:57 localhost kernel: [drm] number of cap sets: 0
Dec 6 01:46:57 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Dec 6 01:46:57 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Dec 6 01:46:57 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 6 01:46:57 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 6 01:46:57 localhost systemd[1]: Mounting /boot...
Dec 6 01:46:57 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Dec 6 01:46:57 localhost kernel: XFS (vda3): Ending clean mount
Dec 6 01:46:57 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Dec 6 01:46:57 localhost systemd[1]: Mounted /boot.
Dec 6 01:46:57 localhost systemd[1]: Mounting /boot/efi...
Dec 6 01:46:57 localhost systemd[1]: Mounted /boot/efi.
Dec 6 01:46:57 localhost systemd[1]: Reached target Local File Systems.
Dec 6 01:46:57 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 6 01:46:57 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 6 01:46:57 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 6 01:46:57 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 6 01:46:57 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 6 01:46:57 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 6 01:46:57 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 6 01:46:57 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 717 (bootctl)
Dec 6 01:46:57 localhost systemd[1]: Starting File System Check on /dev/vda2...
Dec 6 01:46:57 localhost systemd[1]: Finished File System Check on /dev/vda2.
Dec 6 01:46:57 localhost systemd[1]: Mounting EFI System Partition Automount...
Dec 6 01:46:57 localhost systemd[1]: Mounted EFI System Partition Automount.
Dec 6 01:46:57 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 6 01:46:57 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 6 01:46:57 localhost systemd[1]: Starting Security Auditing Service...
Dec 6 01:46:57 localhost systemd[1]: Starting RPC Bind...
Dec 6 01:46:57 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 6 01:46:57 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 6 01:46:57 localhost auditd[726]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Dec 6 01:46:57 localhost auditd[726]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Dec 6 01:46:57 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 6 01:46:57 localhost systemd[1]: Starting Update is Completed...
Dec 6 01:46:57 localhost systemd[1]: Started RPC Bind.
Dec 6 01:46:57 localhost systemd[1]: Finished Update is Completed.
Dec 6 01:46:57 localhost augenrules[731]: /sbin/augenrules: No change
Dec 6 01:46:57 localhost augenrules[742]: No rules
Dec 6 01:46:57 localhost augenrules[742]: enabled 1
Dec 6 01:46:57 localhost augenrules[742]: failure 1
Dec 6 01:46:57 localhost augenrules[742]: pid 726
Dec 6 01:46:57 localhost augenrules[742]: rate_limit 0
Dec 6 01:46:57 localhost augenrules[742]: backlog_limit 8192
Dec 6 01:46:57 localhost augenrules[742]: lost 0
Dec 6 01:46:57 localhost augenrules[742]: backlog 0
Dec 6 01:46:57 localhost augenrules[742]: backlog_wait_time 60000
Dec 6 01:46:57 localhost augenrules[742]: backlog_wait_time_actual 0
Dec 6 01:46:57 localhost augenrules[742]: enabled 1
Dec 6 01:46:57 localhost augenrules[742]: failure 1
Dec 6 01:46:57 localhost augenrules[742]: pid 726
Dec 6 01:46:57 localhost augenrules[742]: rate_limit 0
Dec 6 01:46:57 localhost augenrules[742]: backlog_limit 8192
Dec 6 01:46:57 localhost augenrules[742]: lost 0
Dec 6 01:46:57 localhost augenrules[742]: backlog 0
Dec 6 01:46:57 localhost augenrules[742]: backlog_wait_time 60000
Dec 6 01:46:57 localhost augenrules[742]: backlog_wait_time_actual 0
Dec 6 01:46:57 localhost augenrules[742]: enabled 1
Dec 6 01:46:57 localhost augenrules[742]: failure 1
Dec 6 01:46:57 localhost augenrules[742]: pid 726
Dec 6 01:46:57 localhost augenrules[742]: rate_limit 0
Dec 6 01:46:57 localhost augenrules[742]: backlog_limit 8192
Dec 6 01:46:57 localhost augenrules[742]: lost 0
Dec 6 01:46:57 localhost augenrules[742]: backlog 0
Dec 6 01:46:57 localhost augenrules[742]: backlog_wait_time 60000
Dec 6 01:46:57 localhost augenrules[742]: backlog_wait_time_actual 0
Dec 6 01:46:57 localhost systemd[1]: Started Security Auditing Service.
Dec 6 01:46:57 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 6 01:46:57 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 6 01:46:57 localhost systemd[1]: Reached target System Initialization.
Dec 6 01:46:57 localhost systemd[1]: Started dnf makecache --timer.
Dec 6 01:46:57 localhost systemd[1]: Started Daily rotation of log files.
Dec 6 01:46:57 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 6 01:46:57 localhost systemd[1]: Reached target Timer Units.
Dec 6 01:46:57 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 6 01:46:57 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 6 01:46:57 localhost systemd[1]: Reached target Socket Units.
Dec 6 01:46:57 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Dec 6 01:46:57 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 6 01:46:57 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 6 01:46:58 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 6 01:46:58 localhost systemd[1]: Reached target Basic System.
Dec 6 01:46:58 localhost journal[751]: Ready
Dec 6 01:46:58 localhost systemd[1]: Starting NTP client/server...
Dec 6 01:46:58 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 6 01:46:58 localhost systemd[1]: Started irqbalance daemon.
Dec 6 01:46:58 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 6 01:46:58 localhost systemd[1]: Starting System Logging Service...
Dec 6 01:46:58 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 01:46:58 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 01:46:58 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 01:46:58 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 6 01:46:58 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 6 01:46:58 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 6 01:46:58 localhost systemd[1]: Starting User Login Management...
Dec 6 01:46:58 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 6 01:46:58 localhost rsyslogd[759]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="759" x-info="https://www.rsyslog.com"] start
Dec 6 01:46:58 localhost rsyslogd[759]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Dec 6 01:46:58 localhost systemd[1]: Started System Logging Service.
Dec 6 01:46:58 localhost chronyd[766]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 6 01:46:58 localhost chronyd[766]: Using right/UTC timezone to obtain leap second data
Dec 6 01:46:58 localhost chronyd[766]: Loaded seccomp filter (level 2)
Dec 6 01:46:58 localhost systemd[1]: Started NTP client/server.
Dec 6 01:46:58 localhost systemd-logind[760]: New seat seat0.
Dec 6 01:46:58 localhost systemd-logind[760]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 6 01:46:58 localhost systemd-logind[760]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 6 01:46:58 localhost systemd[1]: Started User Login Management.
Dec 6 01:46:58 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 6 01:46:58 localhost cloud-init[770]: Cloud-init v. 22.1-9.el9 running 'init-local' at Sat, 06 Dec 2025 06:46:58 +0000. Up 5.55 seconds.
Dec 6 01:46:58 localhost systemd[1]: Starting Hostname Service...
Dec 6 01:46:58 localhost systemd[1]: Started Hostname Service.
Dec 6 01:46:58 localhost systemd-hostnamed[784]: Hostname set to (static)
Dec 6 01:46:58 localhost systemd[1]: Finished Initial cloud-init job (pre-networking).
Dec 6 01:46:58 localhost systemd[1]: Reached target Preparation for Network.
Dec 6 01:46:58 localhost systemd[1]: Starting Network Manager...
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.8990] NetworkManager (version 1.42.2-1.el9) is starting... (boot:4d6ef8f6-4845-4627-91d4-5f87bf4fbd01)
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.8997] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Dec 6 01:46:58 localhost systemd[1]: Started Network Manager.
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9035] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 6 01:46:58 localhost systemd[1]: Reached target Network.
Dec 6 01:46:58 localhost systemd[1]: Starting Network Manager Wait Online...
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9124] manager[0x561bc7286020]: monitoring kernel firmware directory '/lib/firmware'.
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9158] hostname: hostname: using hostnamed
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9159] hostname: static hostname changed from (none) to "np0005548798.novalocal"
Dec 6 01:46:58 localhost systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9173] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 6 01:46:58 localhost systemd[1]: Starting Enable periodic update of entitlement certificates....
Dec 6 01:46:58 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 6 01:46:58 localhost systemd[1]: Started Enable periodic update of entitlement certificates..
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9330] manager[0x561bc7286020]: rfkill: Wi-Fi hardware radio set enabled
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9332] manager[0x561bc7286020]: rfkill: WWAN hardware radio set enabled
Dec 6 01:46:58 localhost systemd[1]: Started GSSAPI Proxy Daemon.
Dec 6 01:46:58 localhost systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 6 01:46:58 localhost systemd[1]: Reached target NFS client services.
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9414] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9416] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9422] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9424] manager: Networking is enabled by state file
Dec 6 01:46:58 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9460] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9462] settings: Loaded settings plugin: keyfile (internal)
Dec 6 01:46:58 localhost systemd[1]: Reached target Remote File Systems.
Dec 6 01:46:58 localhost systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9508] dhcp: init: Using DHCP client 'internal'
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9513] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9537] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9549] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9563] device (lo): Activation: starting connection 'lo' (019c0ea9-a9f4-4035-878e-b6e30e866cf0)
Dec 6 01:46:58 localhost systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9580] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9590] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Dec 6 01:46:58 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9659] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9667] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9673] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9679] device (eth0): carrier: link connected
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9685] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9696] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9706] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9715] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9718] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9724] manager: NetworkManager state is now CONNECTING
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9729] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9762] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9768] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 6 01:46:58 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9814] dhcp4 (eth0): state changed new lease, address=38.129.56.147
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9821] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9856] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9869] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9873] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9881] device (lo): Activation: successful, device activated.
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9923] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9927] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9933] manager: NetworkManager state is now CONNECTED_SITE
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9940] device (eth0): Activation: successful, device activated.
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9948] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 6 01:46:58 localhost NetworkManager[789]: [1765003618.9954] manager: startup complete
Dec 6 01:46:59 localhost systemd[1]: Finished Network Manager Wait Online.
Dec 6 01:46:59 localhost systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Dec 6 01:46:59 localhost cloud-init[963]: Cloud-init v. 22.1-9.el9 running 'init' at Sat, 06 Dec 2025 06:46:59 +0000. Up 6.43 seconds.
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: |  eth0  | True |        38.129.56.147         | 255.255.255.0 | global | fa:16:3e:df:39:8d |
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: |  eth0  | True | fe80::f816:3eff:fedf:398d/64 |       .       |  link  | fa:16:3e:df:39:8d |
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info++++++++++++++++++++++++++++++++
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: | Route |   Destination   |   Gateway   |     Genmask     | Interface | Flags |
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: |   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: |   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: |   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 6 01:46:59 localhost cloud-init[963]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 6 01:46:59 localhost systemd[1]: Starting Authorization Manager...
Dec 6 01:46:59 localhost polkitd[1036]: Started polkitd version 0.117
Dec 6 01:46:59 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Dec 6 01:46:59 localhost systemd[1]: Started Authorization Manager.
Dec 6 01:47:02 localhost cloud-init[963]: Generating public/private rsa key pair.
Dec 6 01:47:02 localhost cloud-init[963]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 6 01:47:02 localhost cloud-init[963]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 6 01:47:02 localhost cloud-init[963]: The key fingerprint is:
Dec 6 01:47:02 localhost cloud-init[963]: SHA256:oclgwq2l2+N7y+87bb5uVynxr4u8Zbm00DvqTy58aDI root@np0005548798.novalocal
Dec 6 01:47:02 localhost cloud-init[963]: The key's randomart image is:
Dec 6 01:47:02 localhost cloud-init[963]: +---[RSA 3072]----+
Dec 6 01:47:02 localhost cloud-init[963]: | |
Dec 6 01:47:02 localhost cloud-init[963]: | . . |
Dec 6 01:47:02 localhost cloud-init[963]: | o = . |
Dec 6 01:47:02 localhost cloud-init[963]: | * o o . . |
Dec 6 01:47:02 localhost cloud-init[963]: | o + S o . |
Dec 6 01:47:02 localhost cloud-init[963]: | o ..+. |
Dec 6 01:47:02 localhost cloud-init[963]: | . o . ..+B. |
Dec 6 01:47:02 localhost cloud-init[963]: | . o. . E.*Oo+.|
Dec 6 01:47:02 localhost cloud-init[963]: | ooo++B+B**O+ |
Dec 6 01:47:02 localhost cloud-init[963]: +----[SHA256]-----+
Dec 6 01:47:02 localhost cloud-init[963]: Generating public/private ecdsa key pair.
Dec 6 01:47:02 localhost cloud-init[963]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 6 01:47:02 localhost cloud-init[963]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 6 01:47:02 localhost cloud-init[963]: The key fingerprint is:
Dec 6 01:47:02 localhost cloud-init[963]: SHA256:nafSfRQ4rqFJhE9wrP2GANNVYbZ3csncX9l9ox+lL1o root@np0005548798.novalocal
Dec 6 01:47:02 localhost cloud-init[963]: The key's randomart image is:
Dec 6 01:47:02 localhost cloud-init[963]: +---[ECDSA 256]---+
Dec 6 01:47:02 localhost cloud-init[963]: | ..oo.=. |
Dec 6 01:47:02 localhost cloud-init[963]: | o .+.o . + o+|
Dec 6 01:47:02 localhost cloud-init[963]: | o.oo . = BoB|
Dec 6 01:47:02 localhost cloud-init[963]: | o+.. + =.o*|
Dec 6 01:47:02 localhost cloud-init[963]: | .So+ o.o..|
Dec 6 01:47:02 localhost cloud-init[963]: | ..+o* ....|
Dec 6 01:47:02 localhost cloud-init[963]: | +.+ . E..|
Dec 6 01:47:02 localhost cloud-init[963]: | . + . |
Dec 6 01:47:02 localhost cloud-init[963]: | . |
Dec 6 01:47:02 localhost cloud-init[963]: +----[SHA256]-----+
Dec 6 01:47:02 localhost cloud-init[963]: Generating public/private ed25519 key pair.
Dec 6 01:47:02 localhost cloud-init[963]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 6 01:47:02 localhost cloud-init[963]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 6 01:47:02 localhost cloud-init[963]: The key fingerprint is:
Dec 6 01:47:02 localhost cloud-init[963]: SHA256:0YhdTw1W8pcG94oRrckNNJ1g4vDelqTpYxnz12+oS3I root@np0005548798.novalocal
Dec 6 01:47:02 localhost cloud-init[963]: The key's randomart image is:
Dec 6 01:47:02 localhost cloud-init[963]: +--[ED25519 256]--+
Dec 6 01:47:02 localhost cloud-init[963]: | . o.@O.o |
Dec 6 01:47:02 localhost cloud-init[963]: | o B *.=B o|
Dec 6 01:47:02 localhost cloud-init[963]: | . + +.+=.oo|
Dec 6 01:47:02 localhost cloud-init[963]: | o =++oo |
Dec 6 01:47:02 localhost cloud-init[963]: | S * = . |
Dec 6 01:47:02 localhost cloud-init[963]: | . * . |
Dec 6 01:47:02 localhost cloud-init[963]: | * E ...|
Dec 6 01:47:02 localhost cloud-init[963]: | . = .. o|
Dec 6 01:47:02 localhost cloud-init[963]: | oo ..|
Dec 6 01:47:02 localhost cloud-init[963]: +----[SHA256]-----+
Dec 6 01:47:02 localhost sm-notify[1128]: Version 2.5.4 starting
Dec 6 01:47:02 localhost systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Dec 6 01:47:02 localhost systemd[1]: Reached target Cloud-config availability.
Dec 6 01:47:02 localhost systemd[1]: Reached target Network is Online.
Dec 6 01:47:02 localhost systemd[1]: Starting Apply the settings specified in cloud-config...
Dec 6 01:47:02 localhost sshd[1129]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:47:02 localhost systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Dec 6 01:47:02 localhost systemd[1]: Starting Crash recovery kernel arming...
Dec 6 01:47:02 localhost systemd[1]: Starting Notify NFS peers of a restart...
Dec 6 01:47:02 localhost systemd[1]: Starting OpenSSH server daemon...
Dec 6 01:47:02 localhost systemd[1]: Starting Permit User Sessions...
Dec 6 01:47:02 localhost systemd[1]: Started Notify NFS peers of a restart.
Dec 6 01:47:02 localhost systemd[1]: Finished Permit User Sessions.
Dec 6 01:47:02 localhost systemd[1]: Started Command Scheduler.
Dec 6 01:47:02 localhost systemd[1]: Started Getty on tty1.
Dec 6 01:47:02 localhost systemd[1]: Started Serial Getty on ttyS0.
Dec 6 01:47:02 localhost systemd[1]: Reached target Login Prompts.
Dec 6 01:47:02 localhost systemd[1]: Started OpenSSH server daemon.
Dec 6 01:47:02 localhost systemd[1]: Reached target Multi-User System.
Dec 6 01:47:02 localhost systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 6 01:47:02 localhost systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 6 01:47:02 localhost systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 6 01:47:02 localhost kdumpctl[1132]: kdump: No kdump initial ramdisk found.
Dec 6 01:47:02 localhost kdumpctl[1132]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Dec 6 01:47:02 localhost cloud-init[1254]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Sat, 06 Dec 2025 06:47:02 +0000. Up 9.56 seconds.
Dec 6 01:47:02 localhost systemd[1]: Finished Apply the settings specified in cloud-config.
Dec 6 01:47:02 localhost systemd[1]: Starting Execute cloud user/final scripts...
Dec 6 01:47:02 localhost dracut[1414]: dracut-057-21.git20230214.el9
Dec 6 01:47:02 localhost cloud-init[1432]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Sat, 06 Dec 2025 06:47:02 +0000. Up 9.93 seconds.
Dec 6 01:47:02 localhost dracut[1416]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Dec 6 01:47:02 localhost cloud-init[1446]: #############################################################
Dec 6 01:47:02 localhost cloud-init[1449]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 6 01:47:02 localhost cloud-init[1457]: 256 SHA256:nafSfRQ4rqFJhE9wrP2GANNVYbZ3csncX9l9ox+lL1o root@np0005548798.novalocal (ECDSA)
Dec 6 01:47:02 localhost cloud-init[1462]: 256 SHA256:0YhdTw1W8pcG94oRrckNNJ1g4vDelqTpYxnz12+oS3I root@np0005548798.novalocal (ED25519)
Dec 6 01:47:02 localhost cloud-init[1471]: 3072 SHA256:oclgwq2l2+N7y+87bb5uVynxr4u8Zbm00DvqTy58aDI root@np0005548798.novalocal (RSA)
Dec 6 01:47:02 localhost cloud-init[1474]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 6 01:47:02 localhost cloud-init[1476]: #############################################################
Dec 6 01:47:02 localhost cloud-init[1432]: Cloud-init v. 22.1-9.el9 finished at Sat, 06 Dec 2025 06:47:02 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0]. Up 10.18 seconds
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 6 01:47:03 localhost systemd[1]: Reloading Network Manager...
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 6 01:47:03 localhost NetworkManager[789]: [1765003623.1067] audit: op="reload" arg="0" pid=1575 uid=0 result="success"
Dec 6 01:47:03 localhost NetworkManager[789]: [1765003623.1076] config: signal: SIGHUP (no changes from disk)
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 6 01:47:03 localhost systemd[1]: Reloaded Network Manager.
Dec 6 01:47:03 localhost systemd[1]: Finished Execute cloud user/final scripts.
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 6 01:47:03 localhost systemd[1]: Reached target Cloud-init target.
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: memstrack is not available
Dec 6 01:47:03 localhost dracut[1416]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 6 01:47:03 localhost dracut[1416]: memstrack is not available
Dec 6 01:47:03 localhost dracut[1416]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 6 01:47:03 localhost chronyd[766]: Selected source 23.159.16.194 (2.rhel.pool.ntp.org)
Dec 6 01:47:03 localhost chronyd[766]: System clock TAI offset set to 37 seconds
Dec 6 01:47:04 localhost dracut[1416]: *** Including module: systemd ***
Dec 6 01:47:04 localhost dracut[1416]: *** Including module: systemd-initrd ***
Dec 6 01:47:04 localhost dracut[1416]: *** Including module: i18n ***
Dec 6 01:47:04 localhost dracut[1416]: No KEYMAP configured.
Dec 6 01:47:04 localhost dracut[1416]: *** Including module: drm ***
Dec 6 01:47:05 localhost dracut[1416]: *** Including module: prefixdevname ***
Dec 6 01:47:05 localhost dracut[1416]: *** Including module: kernel-modules ***
Dec 6 01:47:05 localhost dracut[1416]: *** Including module: kernel-modules-extra ***
Dec 6 01:47:05 localhost dracut[1416]: *** Including module: qemu ***
Dec 6 01:47:05 localhost dracut[1416]: *** Including module: fstab-sys ***
Dec 6 01:47:05 localhost dracut[1416]: *** Including module: rootfs-block ***
Dec 6 01:47:05 localhost dracut[1416]: *** Including module: terminfo ***
Dec 6 01:47:05 localhost dracut[1416]: *** Including module: udev-rules ***
Dec 6 01:47:06 localhost dracut[1416]: Skipping udev rule: 91-permissions.rules
Dec 6 01:47:06 localhost dracut[1416]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 6 01:47:06 localhost dracut[1416]: *** Including module: virtiofs ***
Dec 6 01:47:06 localhost dracut[1416]: *** Including module: dracut-systemd ***
Dec 6 01:47:06 localhost dracut[1416]: *** Including module: usrmount ***
Dec 6 01:47:06 localhost dracut[1416]: *** Including module: base ***
Dec 6 01:47:06 localhost dracut[1416]: *** Including module: fs-lib ***
Dec 6 01:47:06 localhost dracut[1416]: *** Including module: kdumpbase ***
Dec 6 01:47:06 localhost dracut[1416]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 6 01:47:06 localhost dracut[1416]: microcode_ctl module: mangling fw_dir
Dec 6 01:47:06 localhost dracut[1416]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 6 01:47:06 localhost dracut[1416]: microcode_ctl: configuration "intel" is ignored
Dec 6 01:47:06 localhost dracut[1416]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 6 01:47:06 localhost dracut[1416]: microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 6 01:47:06 localhost dracut[1416]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 6 01:47:06 localhost dracut[1416]: microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 6 01:47:06 localhost dracut[1416]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 6 01:47:06 localhost dracut[1416]: microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 6 01:47:06 localhost dracut[1416]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 6 01:47:07 localhost dracut[1416]: microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 6 01:47:07 localhost dracut[1416]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 6 01:47:07 localhost dracut[1416]: microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 6 01:47:07 localhost dracut[1416]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 6 01:47:07 localhost dracut[1416]: microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 6 01:47:07 localhost dracut[1416]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 6 01:47:07 localhost dracut[1416]: microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 6 01:47:07 localhost dracut[1416]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 6 01:47:07 localhost dracut[1416]: microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 6 01:47:07 localhost dracut[1416]: microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Dec 6 01:47:07 localhost dracut[1416]: *** Including module: shutdown ***
Dec 6 01:47:07 localhost dracut[1416]: *** Including module: squash ***
Dec 6 01:47:07 localhost dracut[1416]: *** Including modules done ***
Dec 6 01:47:07 localhost dracut[1416]: *** Installing kernel module dependencies ***
Dec 6 01:47:07 localhost sshd[3042]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:47:07 localhost sshd[3044]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:47:07 localhost sshd[3046]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:47:07 localhost sshd[3048]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:47:07 localhost sshd[3050]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:47:07 localhost sshd[3052]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:47:07 localhost sshd[3054]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:47:07 localhost sshd[3056]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:47:07 localhost dracut[1416]: *** Installing kernel module dependencies done ***
Dec 6 01:47:07 localhost dracut[1416]: *** Resolving executable dependencies ***
Dec 6 01:47:07 localhost sshd[3062]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:47:09 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 6 01:47:09 localhost dracut[1416]: *** Resolving executable dependencies done ***
Dec 6 01:47:09 localhost dracut[1416]: *** Hardlinking files ***
Dec 6 01:47:09 localhost dracut[1416]: Mode: real
Dec 6 01:47:09 localhost dracut[1416]: Files: 1099
Dec 6 01:47:09 localhost dracut[1416]: Linked: 3 files
Dec 6 01:47:09 localhost dracut[1416]: Compared: 0 xattrs
Dec 6 01:47:09 localhost dracut[1416]: Compared: 373 files
Dec 6 01:47:09 localhost dracut[1416]: Saved: 61.04 KiB
Dec 6 01:47:09 localhost dracut[1416]: Duration: 0.024538 seconds
Dec 6 01:47:09 localhost dracut[1416]: *** Hardlinking files done ***
Dec 6 01:47:09 localhost dracut[1416]: Could not find 'strip'. Not stripping the initramfs.
Dec 6 01:47:09 localhost dracut[1416]: *** Generating early-microcode cpio image ***
Dec 6 01:47:09 localhost dracut[1416]: *** Constructing AuthenticAMD.bin ***
Dec 6 01:47:09 localhost dracut[1416]: *** Store current command line parameters ***
Dec 6 01:47:09 localhost dracut[1416]: Stored kernel commandline:
Dec 6 01:47:09 localhost dracut[1416]: No dracut internal kernel commandline stored in the initramfs
Dec 6 01:47:09 localhost dracut[1416]: *** Install squash loader ***
Dec 6 01:47:10 localhost dracut[1416]: *** Squashing the files inside the initramfs ***
Dec 6 01:47:11 localhost dracut[1416]: *** Squashing the files inside the initramfs done ***
Dec 6 01:47:11 localhost dracut[1416]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Dec 6 01:47:11 localhost dracut[1416]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Dec 6 01:47:11 localhost kdumpctl[1132]: kdump: kexec: loaded kdump kernel
Dec 6 01:47:11 localhost kdumpctl[1132]: kdump: Starting kdump: [OK]
Dec 6 01:47:11 localhost systemd[1]: Finished Crash recovery kernel arming.
Dec 6 01:47:11 localhost systemd[1]: Startup finished in 1.228s (kernel) + 2.085s (initrd) + 15.658s (userspace) = 18.972s.
Dec 6 01:47:28 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 6 01:48:10 localhost chronyd[766]: Selected source 174.142.148.226 (2.rhel.pool.ntp.org)
Dec 6 01:48:17 localhost sshd[4174]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:48:17 localhost systemd[1]: Created slice User Slice of UID 1000.
Dec 6 01:48:17 localhost systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 6 01:48:17 localhost systemd-logind[760]: New session 1 of user zuul.
Dec 6 01:48:17 localhost systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 6 01:48:17 localhost systemd[1]: Starting User Manager for UID 1000...
Dec 6 01:48:17 localhost systemd[4178]: Queued start job for default target Main User Target.
Dec 6 01:48:17 localhost systemd[4178]: Created slice User Application Slice.
Dec 6 01:48:17 localhost systemd[4178]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 6 01:48:17 localhost systemd[4178]: Started Daily Cleanup of User's Temporary Directories.
Dec 6 01:48:17 localhost systemd[4178]: Reached target Paths.
Dec 6 01:48:17 localhost systemd[4178]: Reached target Timers.
Dec 6 01:48:17 localhost systemd[4178]: Starting D-Bus User Message Bus Socket...
Dec 6 01:48:17 localhost systemd[4178]: Starting Create User's Volatile Files and Directories...
Dec 6 01:48:17 localhost systemd[4178]: Finished Create User's Volatile Files and Directories.
Dec 6 01:48:17 localhost systemd[4178]: Listening on D-Bus User Message Bus Socket.
Dec 6 01:48:17 localhost systemd[4178]: Reached target Sockets.
Dec 6 01:48:17 localhost systemd[4178]: Reached target Basic System.
Dec 6 01:48:17 localhost systemd[4178]: Reached target Main User Target.
Dec 6 01:48:17 localhost systemd[4178]: Startup finished in 109ms.
Dec 6 01:48:17 localhost systemd[1]: Started User Manager for UID 1000.
Dec 6 01:48:17 localhost systemd[1]: Started Session 1 of User zuul.
Dec 6 01:48:18 localhost python3[4231]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 01:48:26 localhost python3[4249]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 01:48:34 localhost python3[4302]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 01:48:35 localhost python3[4332]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 6 01:48:38 localhost python3[4348]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvWmMz5w6QtZ34EwGejcZ8GA5D9qCDnrR9SN9NoDrchlEWc8l+jVWPmOw3cMQtAEUjYup19NA50t+nf7/BZclio4f2BWDnEZCtg6L+G7p0gN+kFt53NX/JM495f2BE9WTFg+Tti5UXaDpxPu9rJ0+MOl34hDM+GWIahiZbHyI4eaP4ionlBXJHg35XRAYG5gL9aBZy0OxIhecrDjU0zQoZVs7s2TlRr4q+ZnbL8L2yXdNXyEkCDTXB32zGOfVy5uT0zpIsT5FcmTvV3pWj5I1XZowQzhImQdXBvWu652LBAxZwewm9caDh0RtZnfC4yhfFhg3FiENWqnZ7pYX8GU/GZAKhrc7uLLcDyhL8LlYPjpZdiL+F0l/8zbzcD1AcVU2nKMl5ld6GEo8lxnb/nkJtT5dqX6l/C/kxSZfAl8cuc5IrmE4Sl52bhqf5rgn8/g83JyK6k4mtTVrrMMiF/hYmIqgjEAXu3frZHLPMWqYZH5Oujlrl1nEhjnMwSdCPtgs= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:48:39 localhost python3[4362]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:48:40 localhost python3[4421]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 01:48:41 localhost python3[4462]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765003720.5056114-387-26148128854688/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=2b5c0700ed3142a5a7a0fdfa0b31b16d_id_rsa follow=False checksum=b1020d9647c4828469ac2d677aaf7236891e262d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:48:42 localhost python3[4535]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 01:48:42 localhost python3[4576]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765003722.1559832-487-61992483607485/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=2b5c0700ed3142a5a7a0fdfa0b31b16d_id_rsa.pub follow=False checksum=da77701aa35ae4889a68f92ec70a04195da6965c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:48:44 localhost python3[4604]: ansible-ping Invoked with data=pong
Dec 6 01:48:46 localhost python3[4618]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 01:48:52 localhost python3[4671]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 6 01:48:56 localhost python3[4693]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:48:56 localhost python3[4707]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:48:56 localhost python3[4721]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:48:57 localhost python3[4735]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:48:58 localhost python3[4749]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:48:58 localhost python3[4763]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:49:01 localhost python3[4780]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:49:03 localhost python3[4828]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 01:49:03 localhost python3[4871]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765003742.7627428-99-177372889630704/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:49:10 localhost python3[4899]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:11 localhost python3[4913]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:11 localhost python3[4927]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:11 localhost python3[4941]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:11 localhost python3[4955]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:12 localhost python3[4969]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:12 localhost python3[4983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:12 localhost python3[4997]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:12 localhost python3[5011]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:13 localhost python3[5025]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:13 localhost python3[5039]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:13 localhost python3[5053]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:14 localhost python3[5067]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:14 localhost python3[5081]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:14 localhost python3[5095]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:14 localhost python3[5109]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:15 localhost python3[5123]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:15 localhost python3[5137]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:15 localhost python3[5151]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:15 localhost python3[5165]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:16 localhost python3[5179]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:16 localhost python3[5193]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:16 localhost python3[5207]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:16 localhost python3[5221]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:17 localhost python3[5235]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:17 localhost python3[5249]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 01:49:19 localhost python3[5265]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 6 01:49:19 localhost systemd[1]: Starting Time & Date Service...
Dec 6 01:49:19 localhost systemd[1]: Started Time & Date Service.
Dec 6 01:49:19 localhost systemd-timedated[5267]: Changed time zone to 'UTC' (UTC).
Dec 6 01:49:21 localhost python3[5286]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:49:22 localhost python3[5332]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 01:49:23 localhost python3[5373]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1765003762.6343188-491-203800401241807/source _original_basename=tmp9oaly7_w follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:49:24 localhost python3[5433]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 01:49:24 localhost python3[5474]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765003764.2712057-584-254560535456687/source _original_basename=tmp8obwvgwp follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:49:26 localhost python3[5536]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 01:49:27 localhost python3[5579]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765003766.470293-729-156286814825995/source _original_basename=tmp06124sbg follow=False checksum=6f93589aa3ed32f31402be6d6bf5fb8cacfc9af1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:49:28 localhost python3[5607]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 01:49:28 localhost python3[5623]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 01:49:29 localhost python3[5673]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 01:49:30 localhost python3[5716]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1765003769.3818867-852-206888271383686/source _original_basename=tmpw5p3rje5 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:49:31 localhost python3[5747]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-1b88-6924-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 01:49:42 localhost python3[5766]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-1b88-6924-000000000024-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 6 01:49:49 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 6 01:49:54 localhost python3[5786]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:50:15 localhost python3[5802]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:51:08 localhost systemd[4178]: Starting Mark boot as successful...
Dec 6 01:51:08 localhost systemd[4178]: Finished Mark boot as successful.
Dec 6 01:51:15 localhost systemd-logind[760]: Session 1 logged out. Waiting for processes to exit.
Dec 6 01:51:39 localhost systemd[1]: Unmounting EFI System Partition Automount...
Dec 6 01:51:39 localhost systemd[1]: efi.mount: Deactivated successfully.
Dec 6 01:51:39 localhost systemd[1]: Unmounted EFI System Partition Automount.
Dec 6 01:53:33 localhost kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Dec 6 01:53:33 localhost kernel: pci 0000:00:07.0: reg 0x10: [io 0x0000-0x003f]
Dec 6 01:53:33 localhost kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Dec 6 01:53:33 localhost kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Dec 6 01:53:33 localhost kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Dec 6 01:53:34 localhost kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Dec 6 01:53:34 localhost kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Dec 6 01:53:34 localhost kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Dec 6 01:53:34 localhost kernel: pci 0000:00:07.0: BAR 0: assigned [io 0x1000-0x103f]
Dec 6 01:53:34 localhost kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 6 01:53:34 localhost NetworkManager[789]: [1765004014.0472] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 6 01:53:34 localhost systemd-udevd[5810]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 01:53:34 localhost NetworkManager[789]: [1765004014.0608] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Dec 6 01:53:34 localhost systemd[4178]: Created slice User Background Tasks Slice.
Dec 6 01:53:34 localhost NetworkManager[789]: [1765004014.0647] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 6 01:53:34 localhost systemd[4178]: Starting Cleanup of User's Temporary Files and Directories...
Dec 6 01:53:34 localhost NetworkManager[789]: [1765004014.0656] device (eth1): carrier: link connected
Dec 6 01:53:34 localhost NetworkManager[789]: [1765004014.0662] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Dec 6 01:53:34 localhost NetworkManager[789]: [1765004014.0673] policy: auto-activating connection 'Wired connection 1' (307510cc-f65b-3ef8-94bd-84af2d7fec31)
Dec 6 01:53:34 localhost NetworkManager[789]: [1765004014.0681] device (eth1): Activation: starting connection 'Wired connection 1' (307510cc-f65b-3ef8-94bd-84af2d7fec31)
Dec 6 01:53:34 localhost NetworkManager[789]: [1765004014.0684] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Dec 6 01:53:34 localhost NetworkManager[789]: [1765004014.0694] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Dec 6 01:53:34 localhost NetworkManager[789]: [1765004014.0705] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Dec 6 01:53:34 localhost NetworkManager[789]: [1765004014.0712] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 6 01:53:34 localhost systemd[4178]: Finished Cleanup of User's Temporary Files and Directories.
Dec 6 01:53:35 localhost sshd[5813]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:53:35 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Dec 6 01:53:35 localhost systemd-logind[760]: New session 3 of user zuul.
Dec 6 01:53:35 localhost systemd[1]: Started Session 3 of User zuul.
Dec 6 01:53:35 localhost python3[5830]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-3690-7837-00000000039b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 01:53:48 localhost python3[5880]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 01:53:49 localhost python3[5923]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765004028.5851362-435-105218758081350/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=cd5c733886ebfd173ce3ec0c84532073824695a8 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:53:49 localhost python3[5953]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 01:53:49 localhost systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 6 01:53:49 localhost systemd[1]: Stopped Network Manager Wait Online.
Dec 6 01:53:49 localhost systemd[1]: Stopping Network Manager Wait Online...
Dec 6 01:53:49 localhost systemd[1]: Stopping Network Manager...
Dec 6 01:53:49 localhost NetworkManager[789]: [1765004029.9049] caught SIGTERM, shutting down normally.
Dec 6 01:53:49 localhost NetworkManager[789]: [1765004029.9139] dhcp4 (eth0): canceled DHCP transaction
Dec 6 01:53:49 localhost NetworkManager[789]: [1765004029.9139] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 6 01:53:49 localhost NetworkManager[789]: [1765004029.9140] dhcp4 (eth0): state changed no lease
Dec 6 01:53:49 localhost NetworkManager[789]: [1765004029.9144] manager: NetworkManager state is now CONNECTING
Dec 6 01:53:49 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 6 01:53:49 localhost NetworkManager[789]: [1765004029.9279] dhcp4 (eth1): canceled DHCP transaction
Dec 6 01:53:49 localhost NetworkManager[789]: [1765004029.9279] dhcp4 (eth1): state changed no lease
Dec 6 01:53:49 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 6 01:53:49 localhost NetworkManager[789]: [1765004029.9362] exiting (success)
Dec 6 01:53:49 localhost systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 6 01:53:49 localhost systemd[1]: Stopped Network Manager.
Dec 6 01:53:49 localhost systemd[1]: NetworkManager.service: Consumed 2.048s CPU time.
Dec 6 01:53:49 localhost systemd[1]: Starting Network Manager...
Dec 6 01:53:49 localhost NetworkManager[5965]: [1765004029.9837] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:4d6ef8f6-4845-4627-91d4-5f87bf4fbd01)
Dec 6 01:53:49 localhost NetworkManager[5965]: [1765004029.9840] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Dec 6 01:53:49 localhost systemd[1]: Started Network Manager.
Dec 6 01:53:49 localhost NetworkManager[5965]: [1765004029.9865] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 6 01:53:49 localhost systemd[1]: Starting Network Manager Wait Online...
Dec 6 01:53:49 localhost NetworkManager[5965]: [1765004029.9920] manager[0x56434f9ec090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 6 01:53:50 localhost systemd[1]: Starting Hostname Service...
Dec 6 01:53:50 localhost systemd[1]: Started Hostname Service.
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0671] hostname: hostname: using hostnamed
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0672] hostname: static hostname changed from (none) to "np0005548798.novalocal"
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0678] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0685] manager[0x56434f9ec090]: rfkill: Wi-Fi hardware radio set enabled
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0685] manager[0x56434f9ec090]: rfkill: WWAN hardware radio set enabled
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0725] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0725] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0726] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0727] manager: Networking is enabled by state file
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0735] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0735] settings: Loaded settings plugin: keyfile (internal)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0778] dhcp: init: Using DHCP client 'internal'
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0782] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0790] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0797] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0809] device (lo): Activation: starting connection 'lo' (019c0ea9-a9f4-4035-878e-b6e30e866cf0)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0817] device (eth0): carrier: link connected
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0823] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0830] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0830] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0838] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0854] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0861] device (eth1): carrier: link connected
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0866] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0875] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (307510cc-f65b-3ef8-94bd-84af2d7fec31) (indicated)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0875] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0882] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0891] device (eth1): Activation: starting connection 'Wired connection 1' (307510cc-f65b-3ef8-94bd-84af2d7fec31)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0917] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0921] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0923] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0927] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0932] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0934] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0936] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0939] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0947] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0951] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0966] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.0969] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.1015] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.1021] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.1028] device (lo): Activation: successful, device activated.
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.1037] dhcp4 (eth0): state changed new lease, address=38.129.56.147
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.1044] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.1141] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.1183] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.1185] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.1191] manager: NetworkManager state is now CONNECTED_SITE
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.1195] device (eth0): Activation: successful, device activated.
Dec 6 01:53:50 localhost NetworkManager[5965]: [1765004030.1201] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 6 01:53:50 localhost python3[6035]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-3690-7837-000000000120-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 01:54:00 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 6 01:54:20 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 6 01:54:35 localhost NetworkManager[5965]: [1765004075.7908] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Dec 6 01:54:35 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 6 01:54:35 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 6 01:54:35 localhost NetworkManager[5965]: [1765004075.8138] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Dec 6 01:54:35 localhost NetworkManager[5965]: [1765004075.8143] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Dec 6 01:54:35 localhost NetworkManager[5965]: [1765004075.8156] device (eth1): Activation: successful, device activated.
Dec 6 01:54:35 localhost NetworkManager[5965]: [1765004075.8164] manager: startup complete
Dec 6 01:54:35 localhost systemd[1]: Finished Network Manager Wait Online.
Dec 6 01:54:45 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 6 01:54:50 localhost systemd[1]: session-3.scope: Deactivated successfully.
Dec 6 01:54:50 localhost systemd[1]: session-3.scope: Consumed 1.480s CPU time.
Dec 6 01:54:50 localhost systemd-logind[760]: Session 3 logged out. Waiting for processes to exit.
Dec 6 01:54:50 localhost systemd-logind[760]: Removed session 3.
Dec 6 01:56:01 localhost sshd[6054]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 01:56:01 localhost systemd-logind[760]: New session 4 of user zuul.
Dec 6 01:56:01 localhost systemd[1]: Started Session 4 of User zuul.
Dec 6 01:56:01 localhost python3[6105]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 01:56:02 localhost python3[6148]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765004161.668301-628-65301494023683/source _original_basename=tmph5_hgkg0 follow=False checksum=e422545e3c18550d25d235a51d3c182ca5ac2620 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 01:56:07 localhost systemd[1]: session-4.scope: Deactivated successfully.
Dec 6 01:56:07 localhost systemd-logind[760]: Session 4 logged out. Waiting for processes to exit.
Dec 6 01:56:07 localhost systemd-logind[760]: Removed session 4.
Dec 6 01:57:53 localhost chronyd[766]: Selected source 23.159.16.194 (2.rhel.pool.ntp.org)
Dec 6 02:01:16 localhost sshd[6180]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:01:16 localhost systemd-logind[760]: New session 5 of user zuul.
Dec 6 02:01:16 localhost systemd[1]: Started Session 5 of User zuul.
Dec 6 02:01:16 localhost python3[6199]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-1e44-976c-000000001cf6-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:01:28 localhost python3[6219]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:01:28 localhost python3[6235]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:01:29 localhost python3[6251]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:01:29 localhost python3[6267]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:01:30 localhost python3[6283]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:01:31 localhost python3[6331]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 02:01:31 localhost python3[6374]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765004490.905727-631-43746393139219/source _original_basename=tmpa6i0uugh follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:01:33 localhost python3[6405]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 6 02:01:33 localhost systemd[1]: Reloading.
Dec 6 02:01:33 localhost systemd-rc-local-generator[6422]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 02:01:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 02:01:34 localhost python3[6451]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 6 02:01:36 localhost python3[6467]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:01:36 localhost python3[6485]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:01:36 localhost python3[6503]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:01:36 localhost python3[6521]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:01:37 localhost python3[6538]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init"; cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system"; cat /sys/fs/cgroup/system.slice/io.max; echo "user"; cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-1e44-976c-000000001cfd-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:01:48 localhost python3[6558]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 6 02:01:51 localhost systemd[1]: session-5.scope: Deactivated successfully.
Dec 6 02:01:51 localhost systemd[1]: session-5.scope: Consumed 4.154s CPU time.
Dec 6 02:01:51 localhost systemd-logind[760]: Session 5 logged out. Waiting for processes to exit.
Dec 6 02:01:51 localhost systemd-logind[760]: Removed session 5.
Dec 6 02:02:08 localhost systemd[1]: Starting Cleanup of Temporary Directories...
Dec 6 02:02:08 localhost systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 6 02:02:08 localhost systemd[1]: Finished Cleanup of Temporary Directories.
Dec 6 02:02:08 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 6 02:03:23 localhost sshd[6567]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:03:23 localhost systemd-logind[760]: New session 6 of user zuul.
Dec 6 02:03:23 localhost systemd[1]: Started Session 6 of User zuul.
Dec 6 02:03:24 localhost systemd[1]: Starting RHSM dbus service...
Dec 6 02:03:24 localhost systemd[1]: Started RHSM dbus service.
Dec 6 02:03:24 localhost rhsm-service[6591]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 6 02:03:24 localhost rhsm-service[6591]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 6 02:03:24 localhost rhsm-service[6591]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 6 02:03:24 localhost rhsm-service[6591]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 6 02:03:25 localhost rhsm-service[6591]: INFO [subscription_manager.managerlib:90] Consumer created: np0005548798.novalocal (13ad661d-6c74-404f-ae81-2b24cdeb8ca4)
Dec 6 02:03:25 localhost subscription-manager[6591]: Registered system with identity: 13ad661d-6c74-404f-ae81-2b24cdeb8ca4
Dec 6 02:03:26 localhost rhsm-service[6591]: INFO [subscription_manager.entcertlib:131] certs updated:
Dec 6 02:03:26 localhost rhsm-service[6591]: Total updates: 1
Dec 6 02:03:26 localhost rhsm-service[6591]: Found (local) serial# []
Dec 6 02:03:26 localhost rhsm-service[6591]: Expected (UEP) serial# [5721368062254683186]
Dec 6 02:03:26 localhost rhsm-service[6591]: Added (new)
Dec 6 02:03:26 localhost rhsm-service[6591]: [sn:5721368062254683186 ( Content Access,) @ /etc/pki/entitlement/5721368062254683186.pem]
Dec 6 02:03:26 localhost rhsm-service[6591]: Deleted (rogue):
Dec 6 02:03:26 localhost rhsm-service[6591]:
Dec 6 02:03:26 localhost subscription-manager[6591]: Added subscription for 'Content Access' contract 'None'
Dec 6 02:03:26 localhost subscription-manager[6591]: Added subscription for product ' Content Access'
Dec 6 02:03:27 localhost rhsm-service[6591]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 6 02:03:27 localhost rhsm-service[6591]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 6 02:03:27 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:03:27 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:03:27 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:03:28 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:03:28 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:03:35 localhost python3[6682]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163efc-24cc-2d47-6d4b-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:03:46 localhost python3[6701]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 02:04:18 localhost setsebool[6776]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 6 02:04:18 localhost setsebool[6776]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 6 02:04:26 localhost kernel: SELinux: Converting 407 SID table entries...
Dec 6 02:04:26 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 6 02:04:26 localhost kernel: SELinux: policy capability open_perms=1
Dec 6 02:04:26 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 6 02:04:26 localhost kernel: SELinux: policy capability always_check_network=0
Dec 6 02:04:26 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 6 02:04:26 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 6 02:04:26 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 6 02:04:39 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=3 res=1
Dec 6 02:04:39 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 6 02:04:39 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 6 02:04:39 localhost systemd[1]: Reloading.
Dec 6 02:04:39 localhost systemd-rc-local-generator[7624]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 02:04:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 02:04:40 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 6 02:04:41 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:04:47 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 6 02:04:47 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 6 02:04:47 localhost systemd[1]: man-db-cache-update.service: Consumed 9.938s CPU time.
Dec 6 02:04:47 localhost systemd[1]: run-r8deda8c374004099b924223f3bbe4c4f.service: Deactivated successfully.
Dec 6 02:05:32 localhost podman[18368]: 2025-12-06 07:05:32.952625947 +0000 UTC m=+0.128266473 system refresh
Dec 6 02:05:33 localhost systemd[4178]: Starting D-Bus User Message Bus...
Dec 6 02:05:33 localhost dbus-broker-launch[18427]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 6 02:05:33 localhost dbus-broker-launch[18427]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 6 02:05:33 localhost systemd[4178]: Started D-Bus User Message Bus.
Dec 6 02:05:33 localhost journal[18427]: Ready
Dec 6 02:05:33 localhost systemd[4178]: selinux: avc: op=load_policy lsm=selinux seqno=3 res=1
Dec 6 02:05:33 localhost systemd[4178]: Created slice Slice /user.
Dec 6 02:05:33 localhost systemd[4178]: podman-18410.scope: unit configures an IP firewall, but not running as root.
Dec 6 02:05:33 localhost systemd[4178]: (This warning is only shown for the first unit using IP firewalling.)
Dec 6 02:05:33 localhost systemd[4178]: Started podman-18410.scope.
Dec 6 02:05:33 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 02:05:34 localhost systemd[4178]: Started podman-pause-d4cd7890.scope.
Dec 6 02:05:35 localhost systemd[1]: session-6.scope: Deactivated successfully.
Dec 6 02:05:35 localhost systemd[1]: session-6.scope: Consumed 50.246s CPU time.
Dec 6 02:05:35 localhost systemd-logind[760]: Session 6 logged out. Waiting for processes to exit.
Dec 6 02:05:35 localhost systemd-logind[760]: Removed session 6.
Dec 6 02:05:51 localhost sshd[18432]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:05:51 localhost sshd[18430]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:05:51 localhost sshd[18431]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:05:51 localhost sshd[18434]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:05:51 localhost sshd[18433]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:05:57 localhost sshd[18440]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:05:57 localhost systemd-logind[760]: New session 7 of user zuul.
Dec 6 02:05:57 localhost systemd[1]: Started Session 7 of User zuul.
Dec 6 02:05:57 localhost python3[18457]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG6AXPPkdcAE02Rc9uHK/pek2FGedaMHvZIwDGa3LQH2K3PragrqxagLYn8fKLuDDD6UPAK4T5Oby5dg/Uecjgo= zuul@np0005548791.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 02:05:58 localhost python3[18473]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG6AXPPkdcAE02Rc9uHK/pek2FGedaMHvZIwDGa3LQH2K3PragrqxagLYn8fKLuDDD6UPAK4T5Oby5dg/Uecjgo= zuul@np0005548791.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 02:06:00 localhost systemd[1]: session-7.scope: Deactivated successfully.
Dec 6 02:06:00 localhost systemd-logind[760]: Session 7 logged out. Waiting for processes to exit.
Dec 6 02:06:00 localhost systemd-logind[760]: Removed session 7.
Dec 6 02:07:55 localhost sshd[18476]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:07:55 localhost systemd-logind[760]: New session 8 of user zuul.
Dec 6 02:07:55 localhost systemd[1]: Started Session 8 of User zuul.
Dec 6 02:07:56 localhost python3[18495]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvWmMz5w6QtZ34EwGejcZ8GA5D9qCDnrR9SN9NoDrchlEWc8l+jVWPmOw3cMQtAEUjYup19NA50t+nf7/BZclio4f2BWDnEZCtg6L+G7p0gN+kFt53NX/JM495f2BE9WTFg+Tti5UXaDpxPu9rJ0+MOl34hDM+GWIahiZbHyI4eaP4ionlBXJHg35XRAYG5gL9aBZy0OxIhecrDjU0zQoZVs7s2TlRr4q+ZnbL8L2yXdNXyEkCDTXB32zGOfVy5uT0zpIsT5FcmTvV3pWj5I1XZowQzhImQdXBvWu652LBAxZwewm9caDh0RtZnfC4yhfFhg3FiENWqnZ7pYX8GU/GZAKhrc7uLLcDyhL8LlYPjpZdiL+F0l/8zbzcD1AcVU2nKMl5ld6GEo8lxnb/nkJtT5dqX6l/C/kxSZfAl8cuc5IrmE4Sl52bhqf5rgn8/g83JyK6k4mtTVrrMMiF/hYmIqgjEAXu3frZHLPMWqYZH5Oujlrl1nEhjnMwSdCPtgs= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 6 02:07:56 localhost python3[18511]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548798.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 6 02:07:58 localhost python3[18561]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 02:07:58 localhost python3[18604]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765004878.1903267-135-229465344606650/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=2b5c0700ed3142a5a7a0fdfa0b31b16d_id_rsa follow=False checksum=b1020d9647c4828469ac2d677aaf7236891e262d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:08:00 localhost python3[18666]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 02:08:00 localhost python3[18709]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765004879.83609-221-185626212315219/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=2b5c0700ed3142a5a7a0fdfa0b31b16d_id_rsa.pub follow=False checksum=da77701aa35ae4889a68f92ec70a04195da6965c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:08:02 localhost python3[18739]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:08:03 localhost python3[18785]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 02:08:03 localhost python3[18801]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpz_npmpgd recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:08:04 localhost python3[18861]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 02:08:05 localhost python3[18877]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpn574qpcp recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:08:06 localhost python3[18937]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 02:08:06 localhost python3[18953]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmp30q6emsu recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:08:07 localhost systemd[1]: session-8.scope: Deactivated successfully.
Dec 6 02:08:07 localhost systemd[1]: session-8.scope: Consumed 3.805s CPU time.
Dec 6 02:08:07 localhost systemd-logind[760]: Session 8 logged out. Waiting for processes to exit.
Dec 6 02:08:07 localhost systemd-logind[760]: Removed session 8.
Dec 6 02:10:34 localhost sshd[18970]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:10:35 localhost systemd-logind[760]: New session 9 of user zuul.
Dec 6 02:10:35 localhost systemd[1]: Started Session 9 of User zuul.
Dec 6 02:10:35 localhost python3[19016]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:15:34 localhost systemd[1]: session-9.scope: Deactivated successfully.
Dec 6 02:15:34 localhost systemd-logind[760]: Session 9 logged out. Waiting for processes to exit.
Dec 6 02:15:34 localhost systemd-logind[760]: Removed session 9.
Dec 6 02:22:43 localhost sshd[19024]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:22:44 localhost systemd-logind[760]: New session 10 of user zuul.
Dec 6 02:22:44 localhost systemd[1]: Started Session 10 of User zuul.
Dec 6 02:22:44 localhost python3[19041]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163efc-24cc-293f-caca-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:22:45 localhost python3[19061]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163efc-24cc-293f-caca-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:22:50 localhost python3[19081]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Dec 6 02:22:53 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:22:54 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:23:48 localhost python3[19239]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Dec 6 02:23:51 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:23:51 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:24:00 localhost python3[19439]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Dec 6 02:24:02 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:24:02 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:24:08 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:24:08 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:24:31 localhost python3[19774]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Dec 6 02:24:34 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:24:34 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:24:40 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:24:40 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:25:03 localhost python3[20169]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Dec 6 02:25:06 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:25:06 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:25:12 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:25:12 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:25:37 localhost python3[20509]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled#012 _uses_shell=True zuul_log_id=fa163efc-24cc-293f-caca-000000000013-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:25:42 localhost python3[20528]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 02:25:55 localhost systemd[1]: Starting dnf makecache...
Dec 6 02:25:55 localhost dnf[20610]: Updating Subscription Management repositories.
Dec 6 02:25:57 localhost dnf[20610]: Failed determining last makecache time.
Dec 6 02:25:57 localhost dnf[20610]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 28 kB/s | 4.1 kB 00:00
Dec 6 02:25:58 localhost dnf[20610]: Red Hat Enterprise Linux 9 for x86_64 - High Av 6.0 kB/s | 4.0 kB 00:00
Dec 6 02:25:58 localhost dnf[20610]: Red Hat OpenStack Platform 17.1 for RHEL 9 x86_ 17 kB/s | 4.0 kB 00:00
Dec 6 02:25:58 localhost dnf[20610]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 33 kB/s | 4.5 kB 00:00
Dec 6 02:25:59 localhost dnf[20610]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 29 kB/s | 4.5 kB 00:00
Dec 6 02:25:59 localhost dnf[20610]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 30 kB/s | 4.1 kB 00:00
Dec 6 02:25:59 localhost dnf[20610]: Fast Datapath for RHEL 9 x86_64 (RPMs) 24 kB/s | 4.0 kB 00:00
Dec 6 02:25:59 localhost dnf[20610]: Metadata cache created.
Dec 6 02:25:59 localhost systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 6 02:25:59 localhost systemd[1]: Finished dnf makecache.
Dec 6 02:25:59 localhost systemd[1]: dnf-makecache.service: Consumed 2.701s CPU time.
Dec 6 02:26:04 localhost kernel: SELinux: Converting 488 SID table entries...
Dec 6 02:26:04 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 6 02:26:04 localhost kernel: SELinux: policy capability open_perms=1
Dec 6 02:26:04 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 6 02:26:04 localhost kernel: SELinux: policy capability always_check_network=0
Dec 6 02:26:04 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 6 02:26:04 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 6 02:26:04 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 6 02:26:05 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=4 res=1
Dec 6 02:26:05 localhost systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 6 02:26:09 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 6 02:26:09 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 6 02:26:09 localhost systemd[1]: Reloading.
Dec 6 02:26:09 localhost systemd-sysv-generator[21187]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 02:26:09 localhost systemd-rc-local-generator[21180]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 02:26:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 02:26:09 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 6 02:26:10 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 6 02:26:10 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 6 02:26:10 localhost systemd[1]: run-r6375a4e0958941c1a3cea160297c5f1f.service: Deactivated successfully.
Dec 6 02:26:11 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:26:11 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 6 02:26:38 localhost python3[21745]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix#012 _uses_shell=True zuul_log_id=fa163efc-24cc-293f-caca-000000000015-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:27:16 localhost python3[21765]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:27:17 localhost python3[21813]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 02:27:17 localhost python3[21856]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765006036.7518-291-58434481930345/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=3358dfc6c6ce646155135d0cad900026cb34ba08 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:27:18 localhost python3[21886]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 6 02:27:18 localhost systemd-journald[619]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Dec 6 02:27:18 localhost systemd-journald[619]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 6 02:27:18 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 6 02:27:18 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 6 02:27:19 localhost python3[21907]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 6 02:27:19 localhost python3[21927]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 6 02:27:19 localhost python3[21947]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 6 02:27:20 localhost python3[21967]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 6 02:27:22 localhost python3[21987]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 02:27:22 localhost systemd[1]: Starting LSB: Bring up/down
networking...
Dec 6 02:27:22 localhost network[21990]: WARN : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:22 localhost network[22001]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:22 localhost network[21990]: WARN : [network] 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:22 localhost network[22002]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:22 localhost network[21990]: WARN : [network] It is advised to switch to 'NetworkManager' instead for network management.
Dec 6 02:27:22 localhost network[22003]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 6 02:27:22 localhost NetworkManager[5965]: [1765006042.5750] audit: op="connections-reload" pid=22031 uid=0 result="success"
Dec 6 02:27:22 localhost network[21990]: Bringing up loopback interface: [ OK ]
Dec 6 02:27:22 localhost NetworkManager[5965]: [1765006042.7987] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22119 uid=0 result="success"
Dec 6 02:27:22 localhost network[21990]: Bringing up interface eth0: [ OK ]
Dec 6 02:27:22 localhost systemd[1]: Started LSB: Bring up/down networking.
Dec 6 02:27:23 localhost python3[22160]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 02:27:23 localhost systemd[1]: Starting Open vSwitch Database Unit...
Dec 6 02:27:23 localhost chown[22164]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 6 02:27:23 localhost ovs-ctl[22169]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 6 02:27:23 localhost ovs-ctl[22169]: Creating empty database /etc/openvswitch/conf.db [ OK ]
Dec 6 02:27:23 localhost ovs-ctl[22169]: Starting ovsdb-server [ OK ]
Dec 6 02:27:23 localhost ovs-vsctl[22218]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 6 02:27:23 localhost ovs-vsctl[22238]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"a1cf5a35-de45-4f36-ac91-02296203a661\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Dec 6 02:27:23 localhost ovs-ctl[22169]: Configuring Open vSwitch system IDs [ OK ]
Dec 6 02:27:23 localhost ovs-ctl[22169]: Enabling remote OVSDB managers [ OK ]
Dec 6 02:27:23 localhost systemd[1]: Started Open vSwitch Database Unit.
Dec 6 02:27:23 localhost ovs-vsctl[22244]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005548798.novalocal
Dec 6 02:27:23 localhost systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 6 02:27:23 localhost systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 6 02:27:23 localhost systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 6 02:27:23 localhost kernel: openvswitch: Open vSwitch switching datapath
Dec 6 02:27:23 localhost ovs-ctl[22288]: Inserting openvswitch module [ OK ]
Dec 6 02:27:23 localhost ovs-ctl[22257]: Starting ovs-vswitchd [ OK ]
Dec 6 02:27:23 localhost ovs-vsctl[22307]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005548798.novalocal
Dec 6 02:27:23 localhost ovs-ctl[22257]: Enabling remote OVSDB managers [ OK ]
Dec 6 02:27:23 localhost systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 6 02:27:23 localhost systemd[1]: Starting Open vSwitch...
Dec 6 02:27:23 localhost systemd[1]: Finished Open vSwitch.
Dec 6 02:27:27 localhost python3[22325]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml#012 _uses_shell=True zuul_log_id=fa163efc-24cc-293f-caca-00000000001a-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:27:28 localhost NetworkManager[5965]: [1765006048.4162] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22483 uid=0 result="success"
Dec 6 02:27:28 localhost ifup[22484]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:28 localhost ifup[22485]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:28 localhost ifup[22486]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:28 localhost NetworkManager[5965]: [1765006048.4492] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22492 uid=0 result="success"
Dec 6 02:27:28 localhost ovs-vsctl[22494]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:33:96:90 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Dec 6 02:27:28 localhost kernel: device ovs-system entered promiscuous mode
Dec 6 02:27:28 localhost kernel: Timeout policy base is empty
Dec 6 02:27:28 localhost kernel: Failed to associated timeout policy `ovs_test_tp'
Dec 6 02:27:28 localhost NetworkManager[5965]: [1765006048.4783] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Dec 6 02:27:28 localhost systemd-udevd[22496]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 02:27:28 localhost kernel: device br-ex entered promiscuous mode
Dec 6 02:27:28 localhost NetworkManager[5965]: [1765006048.5252] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Dec 6 02:27:28 localhost NetworkManager[5965]: [1765006048.5535] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22522 uid=0 result="success"
Dec 6 02:27:28 localhost NetworkManager[5965]: [1765006048.5756] device (br-ex): carrier: link connected
Dec 6 02:27:31 localhost NetworkManager[5965]: [1765006051.6292] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22551 uid=0 result="success"
Dec 6 02:27:31 localhost NetworkManager[5965]: [1765006051.6770] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22566 uid=0 result="success"
Dec 6 02:27:31 localhost NET[22591]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Dec 6 02:27:31 localhost NetworkManager[5965]: [1765006051.7694] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Dec 6 02:27:31 localhost NetworkManager[5965]: [1765006051.7898] dhcp4 (eth1): canceled DHCP transaction
Dec 6 02:27:31 localhost NetworkManager[5965]: [1765006051.7899] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 6 02:27:31 localhost NetworkManager[5965]: [1765006051.7899] dhcp4 (eth1): state changed no lease
Dec 6 02:27:31 localhost NetworkManager[5965]: [1765006051.7945] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22600 uid=0 result="success"
Dec 6 02:27:31 localhost ifup[22601]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:31 localhost ifup[22602]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:31 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 6 02:27:31 localhost ifup[22603]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:31 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 6 02:27:31 localhost NetworkManager[5965]: [1765006051.8317] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22617 uid=0 result="success"
Dec 6 02:27:31 localhost NetworkManager[5965]: [1765006051.9240] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22628 uid=0 result="success"
Dec 6 02:27:31 localhost NetworkManager[5965]: [1765006051.9300] device (eth1): carrier: link connected
Dec 6 02:27:31 localhost NetworkManager[5965]: [1765006051.9517] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22637 uid=0 result="success"
Dec 6 02:27:31 localhost ipv6_wait_tentative[22649]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Dec 6 02:27:32 localhost ipv6_wait_tentative[22654]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Dec 6 02:27:34 localhost NetworkManager[5965]: [1765006054.0280] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22664 uid=0 result="success"
Dec 6 02:27:34 localhost ovs-vsctl[22679]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Dec 6 02:27:34 localhost kernel: device eth1 entered promiscuous mode
Dec 6 02:27:34 localhost NetworkManager[5965]: [1765006054.1139] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22686 uid=0 result="success"
Dec 6 02:27:34 localhost ifup[22687]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:34 localhost ifup[22688]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:34 localhost ifup[22689]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:34 localhost NetworkManager[5965]: [1765006054.1490] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22695 uid=0 result="success"
Dec 6 02:27:34 localhost NetworkManager[5965]: [1765006054.1975] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22705 uid=0 result="success"
Dec 6 02:27:34 localhost ifup[22706]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:34 localhost ifup[22707]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:34 localhost ifup[22708]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:34 localhost NetworkManager[5965]: [1765006054.2308] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22714 uid=0 result="success"
Dec 6 02:27:34 localhost ovs-vsctl[22717]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Dec 6 02:27:34 localhost kernel: device vlan21 entered promiscuous mode
Dec 6 02:27:34 localhost NetworkManager[5965]: [1765006054.2743] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Dec 6 02:27:34 localhost systemd-udevd[22719]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 02:27:34 localhost NetworkManager[5965]: [1765006054.3044] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22728 uid=0 result="success"
Dec 6 02:27:34 localhost NetworkManager[5965]: [1765006054.3280] device (vlan21): carrier: link connected
Dec 6 02:27:37 localhost NetworkManager[5965]: [1765006057.3849] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22757 uid=0 result="success"
Dec 6 02:27:37 localhost NetworkManager[5965]: [1765006057.4388] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22772 uid=0 result="success"
Dec 6 02:27:37 localhost NetworkManager[5965]: [1765006057.5067] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22793 uid=0 result="success"
Dec 6 02:27:37 localhost ifup[22794]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:37 localhost ifup[22795]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:37 localhost ifup[22796]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:37 localhost NetworkManager[5965]: [1765006057.5414] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22802 uid=0 result="success"
Dec 6 02:27:37 localhost ovs-vsctl[22805]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Dec 6 02:27:37 localhost kernel: device vlan23 entered promiscuous mode
Dec 6 02:27:37 localhost NetworkManager[5965]: [1765006057.5807] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Dec 6 02:27:37 localhost systemd-udevd[22807]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 02:27:37 localhost NetworkManager[5965]: [1765006057.6101] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22817 uid=0 result="success"
Dec 6 02:27:37 localhost NetworkManager[5965]: [1765006057.6288] device (vlan23): carrier: link connected
Dec 6 02:27:40 localhost NetworkManager[5965]: [1765006060.6777] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22847 uid=0 result="success"
Dec 6 02:27:40 localhost NetworkManager[5965]: [1765006060.7272] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22862 uid=0 result="success"
Dec 6 02:27:40 localhost NetworkManager[5965]: [1765006060.7906] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22883 uid=0 result="success"
Dec 6 02:27:40 localhost ifup[22884]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:40 localhost ifup[22885]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:40 localhost ifup[22886]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:40 localhost NetworkManager[5965]: [1765006060.8234] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22892 uid=0 result="success"
Dec 6 02:27:40 localhost ovs-vsctl[22895]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Dec 6 02:27:40 localhost systemd-udevd[22897]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 02:27:40 localhost NetworkManager[5965]: [1765006060.8586] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Dec 6 02:27:40 localhost kernel: device vlan20 entered promiscuous mode
Dec 6 02:27:40 localhost NetworkManager[5965]: [1765006060.8864] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22907 uid=0 result="success"
Dec 6 02:27:40 localhost NetworkManager[5965]: [1765006060.9084] device (vlan20): carrier: link connected
Dec 6 02:27:41 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 6 02:27:43 localhost NetworkManager[5965]: [1765006063.9655] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22937 uid=0 result="success"
Dec 6 02:27:44 localhost NetworkManager[5965]: [1765006064.0106] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22952 uid=0 result="success"
Dec 6 02:27:44 localhost NetworkManager[5965]: [1765006064.0589] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22973 uid=0 result="success"
Dec 6 02:27:44 localhost ifup[22974]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:44 localhost ifup[22975]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:44 localhost ifup[22976]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:44 localhost NetworkManager[5965]: [1765006064.0854] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22982 uid=0 result="success"
Dec 6 02:27:44 localhost ovs-vsctl[22985]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Dec 6 02:27:44 localhost kernel: device vlan22 entered promiscuous mode
Dec 6 02:27:44 localhost systemd-udevd[22987]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 02:27:44 localhost NetworkManager[5965]: [1765006064.1218] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Dec 6 02:27:44 localhost NetworkManager[5965]: [1765006064.1509] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22997 uid=0 result="success"
Dec 6 02:27:44 localhost NetworkManager[5965]: [1765006064.1740] device (vlan22): carrier: link connected
Dec 6 02:27:47 localhost NetworkManager[5965]: [1765006067.2167] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23027 uid=0 result="success"
Dec 6 02:27:47 localhost NetworkManager[5965]: [1765006067.2583] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23042 uid=0 result="success"
Dec 6 02:27:47 localhost NetworkManager[5965]: [1765006067.3096] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23063 uid=0 result="success"
Dec 6 02:27:47 localhost ifup[23064]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:47 localhost ifup[23065]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:47 localhost ifup[23066]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:47 localhost NetworkManager[5965]: [1765006067.3371] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23072 uid=0 result="success"
Dec 6 02:27:47 localhost ovs-vsctl[23075]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Dec 6 02:27:47 localhost systemd-udevd[23077]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 02:27:47 localhost kernel: device vlan44 entered promiscuous mode
Dec 6 02:27:47 localhost NetworkManager[5965]: [1765006067.3728] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Dec 6 02:27:47 localhost NetworkManager[5965]: [1765006067.3963] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23087 uid=0 result="success"
Dec 6 02:27:47 localhost NetworkManager[5965]: [1765006067.4169] device (vlan44): carrier: link connected
Dec 6 02:27:50 localhost NetworkManager[5965]: [1765006070.4713] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23117 uid=0 result="success"
Dec 6 02:27:50 localhost NetworkManager[5965]: [1765006070.5120] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23132 uid=0 result="success"
Dec 6 02:27:50 localhost NetworkManager[5965]: [1765006070.5726] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23153 uid=0 result="success"
Dec 6 02:27:50 localhost ifup[23154]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:50 localhost ifup[23155]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:50 localhost ifup[23156]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:50 localhost NetworkManager[5965]: [1765006070.6067] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23162 uid=0 result="success"
Dec 6 02:27:50 localhost ovs-vsctl[23165]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Dec 6 02:27:50 localhost NetworkManager[5965]: [1765006070.6902] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23172 uid=0 result="success"
Dec 6 02:27:51 localhost NetworkManager[5965]: [1765006071.7516] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23199 uid=0 result="success"
Dec 6 02:27:51 localhost NetworkManager[5965]: [1765006071.8032] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23214 uid=0 result="success"
Dec 6 02:27:51 localhost NetworkManager[5965]: [1765006071.8681] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23235 uid=0 result="success"
Dec 6 02:27:51 localhost ifup[23236]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:51 localhost ifup[23237]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:51 localhost ifup[23238]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:51 localhost NetworkManager[5965]: [1765006071.9011] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23244 uid=0 result="success"
Dec 6 02:27:51 localhost ovs-vsctl[23247]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Dec 6 02:27:51 localhost NetworkManager[5965]: [1765006071.9974] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23254 uid=0 result="success"
Dec 6 02:27:53 localhost NetworkManager[5965]: [1765006073.0601] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23282 uid=0 result="success"
Dec 6 02:27:53 localhost NetworkManager[5965]: [1765006073.1094] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23297 uid=0 result="success"
Dec 6 02:27:53 localhost NetworkManager[5965]: [1765006073.1637] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23318 uid=0 result="success"
Dec 6 02:27:53 localhost ifup[23319]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:53 localhost ifup[23320]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:53 localhost ifup[23321]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:53 localhost NetworkManager[5965]: [1765006073.1872] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23327 uid=0 result="success"
Dec 6 02:27:53 localhost ovs-vsctl[23330]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Dec 6 02:27:53 localhost NetworkManager[5965]: [1765006073.2360] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23337 uid=0 result="success"
Dec 6 02:27:54 localhost NetworkManager[5965]: [1765006074.2931] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23365 uid=0 result="success"
Dec 6 02:27:54 localhost NetworkManager[5965]: [1765006074.3373] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23380 uid=0 result="success"
Dec 6 02:27:54 localhost NetworkManager[5965]: [1765006074.3964] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23401 uid=0 result="success"
Dec 6 02:27:54 localhost ifup[23402]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:54 localhost ifup[23403]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:54 localhost ifup[23404]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:54 localhost NetworkManager[5965]: [1765006074.4263] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23410 uid=0 result="success"
Dec 6 02:27:54 localhost ovs-vsctl[23413]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Dec 6 02:27:54 localhost NetworkManager[5965]: [1765006074.5072] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23420 uid=0 result="success"
Dec 6 02:27:55 localhost NetworkManager[5965]: [1765006075.5673] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23448 uid=0 result="success"
Dec 6 02:27:55 localhost NetworkManager[5965]: [1765006075.6156] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23463 uid=0 result="success"
Dec 6 02:27:55 localhost NetworkManager[5965]: [1765006075.6814] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23484 uid=0 result="success"
Dec 6 02:27:55 localhost ifup[23485]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 6 02:27:55 localhost ifup[23486]: 'network-scripts' will be removed from distribution in near future.
Dec 6 02:27:55 localhost ifup[23487]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 6 02:27:55 localhost NetworkManager[5965]: [1765006075.7137] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23493 uid=0 result="success"
Dec 6 02:27:55 localhost ovs-vsctl[23496]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Dec 6 02:27:55 localhost NetworkManager[5965]: [1765006075.8051] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23503 uid=0 result="success"
Dec 6 02:27:56 localhost NetworkManager[5965]: [1765006076.8649] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23531 uid=0 result="success"
Dec 6 02:27:56 localhost NetworkManager[5965]: [1765006076.9051] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23546 uid=0 result="success"
Dec 6 02:28:49 localhost python3[23578]: ansible-ansible.legacy.command Invoked with _raw_params=ip a#012ping -c 2 -W 2 192.168.122.10#012ping -c 2 -W 2 192.168.122.11#012 _uses_shell=True zuul_log_id=fa163efc-24cc-293f-caca-00000000001b-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:28:55 localhost python3[23597]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvWmMz5w6QtZ34EwGejcZ8GA5D9qCDnrR9SN9NoDrchlEWc8l+jVWPmOw3cMQtAEUjYup19NA50t+nf7/BZclio4f2BWDnEZCtg6L+G7p0gN+kFt53NX/JM495f2BE9WTFg+Tti5UXaDpxPu9rJ0+MOl34hDM+GWIahiZbHyI4eaP4ionlBXJHg35XRAYG5gL9aBZy0OxIhecrDjU0zQoZVs7s2TlRr4q+ZnbL8L2yXdNXyEkCDTXB32zGOfVy5uT0zpIsT5FcmTvV3pWj5I1XZowQzhImQdXBvWu652LBAxZwewm9caDh0RtZnfC4yhfFhg3FiENWqnZ7pYX8GU/GZAKhrc7uLLcDyhL8LlYPjpZdiL+F0l/8zbzcD1AcVU2nKMl5ld6GEo8lxnb/nkJtT5dqX6l/C/kxSZfAl8cuc5IrmE4Sl52bhqf5rgn8/g83JyK6k4mtTVrrMMiF/hYmIqgjEAXu3frZHLPMWqYZH5Oujlrl1nEhjnMwSdCPtgs=
zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Dec 6 02:28:56 localhost python3[23613]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvWmMz5w6QtZ34EwGejcZ8GA5D9qCDnrR9SN9NoDrchlEWc8l+jVWPmOw3cMQtAEUjYup19NA50t+nf7/BZclio4f2BWDnEZCtg6L+G7p0gN+kFt53NX/JM495f2BE9WTFg+Tti5UXaDpxPu9rJ0+MOl34hDM+GWIahiZbHyI4eaP4ionlBXJHg35XRAYG5gL9aBZy0OxIhecrDjU0zQoZVs7s2TlRr4q+ZnbL8L2yXdNXyEkCDTXB32zGOfVy5uT0zpIsT5FcmTvV3pWj5I1XZowQzhImQdXBvWu652LBAxZwewm9caDh0RtZnfC4yhfFhg3FiENWqnZ7pYX8GU/GZAKhrc7uLLcDyhL8LlYPjpZdiL+F0l/8zbzcD1AcVU2nKMl5ld6GEo8lxnb/nkJtT5dqX6l/C/kxSZfAl8cuc5IrmE4Sl52bhqf5rgn8/g83JyK6k4mtTVrrMMiF/hYmIqgjEAXu3frZHLPMWqYZH5Oujlrl1nEhjnMwSdCPtgs= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Dec 6 02:28:57 localhost python3[23627]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvWmMz5w6QtZ34EwGejcZ8GA5D9qCDnrR9SN9NoDrchlEWc8l+jVWPmOw3cMQtAEUjYup19NA50t+nf7/BZclio4f2BWDnEZCtg6L+G7p0gN+kFt53NX/JM495f2BE9WTFg+Tti5UXaDpxPu9rJ0+MOl34hDM+GWIahiZbHyI4eaP4ionlBXJHg35XRAYG5gL9aBZy0OxIhecrDjU0zQoZVs7s2TlRr4q+ZnbL8L2yXdNXyEkCDTXB32zGOfVy5uT0zpIsT5FcmTvV3pWj5I1XZowQzhImQdXBvWu652LBAxZwewm9caDh0RtZnfC4yhfFhg3FiENWqnZ7pYX8GU/GZAKhrc7uLLcDyhL8LlYPjpZdiL+F0l/8zbzcD1AcVU2nKMl5ld6GEo8lxnb/nkJtT5dqX6l/C/kxSZfAl8cuc5IrmE4Sl52bhqf5rgn8/g83JyK6k4mtTVrrMMiF/hYmIqgjEAXu3frZHLPMWqYZH5Oujlrl1nEhjnMwSdCPtgs= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Dec 6 02:28:58 localhost python3[23643]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCvWmMz5w6QtZ34EwGejcZ8GA5D9qCDnrR9SN9NoDrchlEWc8l+jVWPmOw3cMQtAEUjYup19NA50t+nf7/BZclio4f2BWDnEZCtg6L+G7p0gN+kFt53NX/JM495f2BE9WTFg+Tti5UXaDpxPu9rJ0+MOl34hDM+GWIahiZbHyI4eaP4ionlBXJHg35XRAYG5gL9aBZy0OxIhecrDjU0zQoZVs7s2TlRr4q+ZnbL8L2yXdNXyEkCDTXB32zGOfVy5uT0zpIsT5FcmTvV3pWj5I1XZowQzhImQdXBvWu652LBAxZwewm9caDh0RtZnfC4yhfFhg3FiENWqnZ7pYX8GU/GZAKhrc7uLLcDyhL8LlYPjpZdiL+F0l/8zbzcD1AcVU2nKMl5ld6GEo8lxnb/nkJtT5dqX6l/C/kxSZfAl8cuc5IrmE4Sl52bhqf5rgn8/g83JyK6k4mtTVrrMMiF/hYmIqgjEAXu3frZHLPMWqYZH5Oujlrl1nEhjnMwSdCPtgs= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Dec 6 02:28:59 localhost python3[23657]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname Dec 6 02:29:00 localhost python3[23672]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005548798.novalocal"#012hostname_str_array=(${hostname//./ })#012echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname#012 _uses_shell=True zuul_log_id=fa163efc-24cc-293f-caca-000000000022-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 02:29:00 localhost python3[23692]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)#012hostnamectl hostname "$hostname.ooo.test"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-293f-caca-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 02:29:00 localhost systemd[1]: Starting Hostname Service... Dec 6 02:29:00 localhost systemd[1]: Started Hostname Service. 
Dec 6 02:29:00 localhost systemd-hostnamed[23696]: Hostname set to (static)
Dec 6 02:29:00 localhost NetworkManager[5965]: [1765006140.9605] hostname: static hostname changed from "np0005548798.novalocal" to "np0005548798.ooo.test"
Dec 6 02:29:00 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 6 02:29:00 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 6 02:29:02 localhost systemd[1]: session-10.scope: Deactivated successfully.
Dec 6 02:29:02 localhost systemd[1]: session-10.scope: Consumed 1min 44.704s CPU time.
Dec 6 02:29:02 localhost systemd-logind[760]: Session 10 logged out. Waiting for processes to exit.
Dec 6 02:29:02 localhost systemd-logind[760]: Removed session 10.
Dec 6 02:29:05 localhost sshd[23707]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:29:05 localhost systemd[1]: Started Session 11 of User zuul.
Dec 6 02:29:05 localhost systemd-logind[760]: New session 11 of user zuul.
Dec 6 02:29:05 localhost python3[23724]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Dec 6 02:29:07 localhost systemd[1]: session-11.scope: Deactivated successfully.
Dec 6 02:29:07 localhost systemd-logind[760]: Session 11 logged out. Waiting for processes to exit.
Dec 6 02:29:07 localhost systemd-logind[760]: Removed session 11.
Dec 6 02:29:11 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 6 02:29:30 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 6 02:30:14 localhost sshd[23729]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:30:14 localhost systemd-logind[760]: New session 12 of user zuul.
Dec 6 02:30:14 localhost systemd[1]: Started Session 12 of User zuul.
Dec 6 02:30:15 localhost python3[23748]: ansible-ansible.legacy.dnf Invoked with name=['ipa-client'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 02:30:23 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Dec 6 02:30:23 localhost dbus-broker-launch[18427]: Noticed file-system modification, trigger reload.
Dec 6 02:30:23 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Dec 6 02:30:23 localhost dbus-broker-launch[18427]: Service file '/usr/share/dbus-1/services/certmonger.service' is not named after the D-Bus name 'org.fedorahosted.certmonger'.
Dec 6 02:30:23 localhost dbus-broker-launch[18427]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 6 02:30:23 localhost dbus-broker-launch[18427]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 6 02:30:23 localhost systemd[1]: Reloading.
Dec 6 02:30:23 localhost systemd-sysv-generator[23852]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 02:30:23 localhost systemd-rc-local-generator[23848]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 02:30:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 02:30:32 localhost kernel: SELinux: Converting 538 SID table entries...
Dec 6 02:30:32 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 6 02:30:32 localhost kernel: SELinux: policy capability open_perms=1
Dec 6 02:30:32 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 6 02:30:32 localhost kernel: SELinux: policy capability always_check_network=0
Dec 6 02:30:32 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 6 02:30:32 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 6 02:30:32 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 6 02:30:33 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Dec 6 02:30:33 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=5 res=1
Dec 6 02:30:33 localhost systemd[1]: Stopping OpenSSH server daemon...
Dec 6 02:30:33 localhost systemd[1]: sshd.service: Deactivated successfully.
Dec 6 02:30:33 localhost systemd[1]: Stopped OpenSSH server daemon.
Dec 6 02:30:33 localhost systemd[1]: sshd.service: Consumed 1.046s CPU time.
Dec 6 02:30:33 localhost systemd[1]: Stopped target sshd-keygen.target.
Dec 6 02:30:33 localhost systemd[1]: Stopping sshd-keygen.target...
Dec 6 02:30:33 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 02:30:33 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 02:30:33 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 02:30:33 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 6 02:30:33 localhost systemd[1]: Starting OpenSSH server daemon...
Dec 6 02:30:33 localhost sshd[23884]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:30:33 localhost systemd[1]: Started OpenSSH server daemon.
Dec 6 02:30:34 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 6 02:30:34 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 6 02:30:34 localhost systemd[1]: Reloading.
Dec 6 02:30:34 localhost systemd-rc-local-generator[24416]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 02:30:34 localhost systemd-sysv-generator[24422]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 02:30:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 02:30:35 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 6 02:30:35 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 6 02:30:36 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 6 02:30:36 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 6 02:30:36 localhost systemd[1]: man-db-cache-update.service: Consumed 1.716s CPU time.
Dec 6 02:30:36 localhost systemd[1]: run-r5f065adebba64b499596fd25ffc56a3a.service: Deactivated successfully.
Dec 6 02:30:36 localhost systemd[1]: run-r133525337a7549478df33bd3e216a052.service: Deactivated successfully.
Dec 6 02:31:36 localhost systemd[1]: session-12.scope: Deactivated successfully.
Dec 6 02:31:36 localhost systemd[1]: session-12.scope: Consumed 17.684s CPU time.
Dec 6 02:31:36 localhost systemd-logind[760]: Session 12 logged out. Waiting for processes to exit.
Dec 6 02:31:36 localhost systemd-logind[760]: Removed session 12.
Dec 6 02:31:40 localhost sshd[25850]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:31:40 localhost systemd-logind[760]: New session 13 of user zuul.
Dec 6 02:31:40 localhost systemd[1]: Started Session 13 of User zuul.
Dec 6 02:31:41 localhost python3[25901]: ansible-ansible.legacy.stat Invoked with path=/etc/resolv.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 02:31:41 localhost python3[25946]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765006300.9430037-62-5305630837865/source dest=/etc/resolv.conf owner=root group=root mode=420 follow=False _original_basename=ipa_resolv_conf.j2 checksum=edbe9a45130a7ebff5948216d04bc5bb05808e49 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:31:44 localhost python3[25976]: ansible-ansible.legacy.command Invoked with _raw_params=ip route add 10.255.255.25 via 192.168.122.100 dev br-ex _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 02:31:46 localhost python3[25994]: ansible-ansible.builtin.file Invoked with path=/etc/pki/CA state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 02:32:46 localhost systemd[1]: session-13.scope: Deactivated successfully.
Dec 6 02:32:46 localhost systemd[1]: session-13.scope: Consumed 1.147s CPU time.
Dec 6 02:32:46 localhost systemd-logind[760]: Session 13 logged out. Waiting for processes to exit.
Dec 6 02:32:46 localhost systemd-logind[760]: Removed session 13.
Dec 6 02:59:39 localhost sshd[26002]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 02:59:39 localhost systemd-logind[760]: New session 14 of user zuul.
Dec 6 02:59:39 localhost systemd[1]: Started Session 14 of User zuul.
Dec 6 02:59:40 localhost python3[26050]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 6 02:59:41 localhost python3[26095]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 02:59:41 localhost python3[26115]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548798.ooo.test update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 6 02:59:41 localhost systemd-journald[619]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 76.6 (255 of 333 items), suggesting rotation.
Dec 6 02:59:41 localhost systemd-journald[619]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 6 02:59:41 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 02:59:41 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 02:59:42 localhost python3[26173]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 02:59:42 localhost python3[26216]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765007981.983428-81928-199636763732954/source _original_basename=tmp4bcrv_y5 follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 02:59:43 localhost python3[26246]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 02:59:43 localhost python3[26262]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 02:59:44 localhost python3[26278]: ansible-file Invoked with 
path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 02:59:45 localhost python3[26294]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCvWmMz5w6QtZ34EwGejcZ8GA5D9qCDnrR9SN9NoDrchlEWc8l+jVWPmOw3cMQtAEUjYup19NA50t+nf7/BZclio4f2BWDnEZCtg6L+G7p0gN+kFt53NX/JM495f2BE9WTFg+Tti5UXaDpxPu9rJ0+MOl34hDM+GWIahiZbHyI4eaP4ionlBXJHg35XRAYG5gL9aBZy0OxIhecrDjU0zQoZVs7s2TlRr4q+ZnbL8L2yXdNXyEkCDTXB32zGOfVy5uT0zpIsT5FcmTvV3pWj5I1XZowQzhImQdXBvWu652LBAxZwewm9caDh0RtZnfC4yhfFhg3FiENWqnZ7pYX8GU/GZAKhrc7uLLcDyhL8LlYPjpZdiL+F0l/8zbzcD1AcVU2nKMl5ld6GEo8lxnb/nkJtT5dqX6l/C/kxSZfAl8cuc5IrmE4Sl52bhqf5rgn8/g83JyK6k4mtTVrrMMiF/hYmIqgjEAXu3frZHLPMWqYZH5Oujlrl1nEhjnMwSdCPtgs= zuul-build-sshkey#012 regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 02:59:46 localhost python3[26308]: ansible-ping Invoked with data=pong Dec 6 02:59:57 localhost sshd[26309]: main: sshd: ssh-rsa algorithm is disabled Dec 6 02:59:57 localhost systemd[1]: Created slice User Slice of UID 1002. Dec 6 02:59:57 localhost systemd[1]: Starting User Runtime Directory /run/user/1002... Dec 6 02:59:57 localhost systemd-logind[760]: New session 15 of user tripleo-admin. Dec 6 02:59:57 localhost systemd[1]: Finished User Runtime Directory /run/user/1002. Dec 6 02:59:57 localhost systemd[1]: Starting User Manager for UID 1002... 
Dec 6 02:59:57 localhost systemd[26313]: Queued start job for default target Main User Target. Dec 6 02:59:57 localhost systemd[26313]: Created slice User Application Slice. Dec 6 02:59:57 localhost systemd[26313]: Started Mark boot as successful after the user session has run 2 minutes. Dec 6 02:59:57 localhost systemd[26313]: Started Daily Cleanup of User's Temporary Directories. Dec 6 02:59:57 localhost systemd[26313]: Reached target Paths. Dec 6 02:59:57 localhost systemd[26313]: Reached target Timers. Dec 6 02:59:57 localhost systemd[26313]: Starting D-Bus User Message Bus Socket... Dec 6 02:59:57 localhost systemd[26313]: Starting Create User's Volatile Files and Directories... Dec 6 02:59:57 localhost systemd[26313]: Listening on D-Bus User Message Bus Socket. Dec 6 02:59:57 localhost systemd[26313]: Reached target Sockets. Dec 6 02:59:57 localhost systemd[26313]: Finished Create User's Volatile Files and Directories. Dec 6 02:59:57 localhost systemd[26313]: Reached target Basic System. Dec 6 02:59:57 localhost systemd[26313]: Reached target Main User Target. Dec 6 02:59:57 localhost systemd[26313]: Startup finished in 97ms. Dec 6 02:59:57 localhost systemd[1]: Started User Manager for UID 1002. Dec 6 02:59:57 localhost systemd[1]: Started Session 15 of User tripleo-admin. Dec 6 02:59:57 localhost python3[26374]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d Dec 6 03:00:02 localhost python3[26394]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config Dec 6 03:00:03 localhost python3[26410]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. 
path=None Dec 6 03:00:03 localhost python3[26458]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.jfyt6o6itmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:00:04 localhost python3[26488]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.jfyt6o6itmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:00:05 localhost python3[26504]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.jfyt6o6itmphosts insertbefore=BOF block=172.17.0.106 np0005548798.ooo.test np0005548798#012172.18.0.106 np0005548798.storage.ooo.test np0005548798.storage#012172.17.0.106 np0005548798.internalapi.ooo.test np0005548798.internalapi#012172.19.0.106 np0005548798.tenant.ooo.test np0005548798.tenant#012192.168.122.106 np0005548798.ctlplane.ooo.test np0005548798.ctlplane#012172.17.0.107 np0005548799.ooo.test np0005548799#012172.18.0.107 np0005548799.storage.ooo.test np0005548799.storage#012172.17.0.107 np0005548799.internalapi.ooo.test np0005548799.internalapi#012172.19.0.107 np0005548799.tenant.ooo.test np0005548799.tenant#012192.168.122.107 np0005548799.ctlplane.ooo.test np0005548799.ctlplane#012172.17.0.108 np0005548801.ooo.test np0005548801#012172.18.0.108 np0005548801.storage.ooo.test np0005548801.storage#012172.17.0.108 np0005548801.internalapi.ooo.test np0005548801.internalapi#012172.19.0.108 np0005548801.tenant.ooo.test np0005548801.tenant#012192.168.122.108 np0005548801.ctlplane.ooo.test 
np0005548801.ctlplane#012172.17.0.103 np0005548795.ooo.test np0005548795#012172.18.0.103 np0005548795.storage.ooo.test np0005548795.storage#012172.20.0.103 np0005548795.storagemgmt.ooo.test np0005548795.storagemgmt#012172.17.0.103 np0005548795.internalapi.ooo.test np0005548795.internalapi#012172.19.0.103 np0005548795.tenant.ooo.test np0005548795.tenant#012172.21.0.103 np0005548795.external.ooo.test np0005548795.external#012192.168.122.103 np0005548795.ctlplane.ooo.test np0005548795.ctlplane#012172.17.0.104 np0005548796.ooo.test np0005548796#012172.18.0.104 np0005548796.storage.ooo.test np0005548796.storage#012172.20.0.104 np0005548796.storagemgmt.ooo.test np0005548796.storagemgmt#012172.17.0.104 np0005548796.internalapi.ooo.test np0005548796.internalapi#012172.19.0.104 np0005548796.tenant.ooo.test np0005548796.tenant#012172.21.0.104 np0005548796.external.ooo.test np0005548796.external#012192.168.122.104 np0005548796.ctlplane.ooo.test np0005548796.ctlplane#012172.17.0.105 np0005548797.ooo.test np0005548797#012172.18.0.105 np0005548797.storage.ooo.test np0005548797.storage#012172.20.0.105 np0005548797.storagemgmt.ooo.test np0005548797.storagemgmt#012172.17.0.105 np0005548797.internalapi.ooo.test np0005548797.internalapi#012172.19.0.105 np0005548797.tenant.ooo.test np0005548797.tenant#012172.21.0.105 np0005548797.external.ooo.test np0005548797.external#012192.168.122.105 np0005548797.ctlplane.ooo.test np0005548797.ctlplane#012#012192.168.122.100 undercloud.ctlplane.ooo.test undercloud.ctlplane#012192.168.122.99 overcloud.ctlplane.ooo.test#012172.18.0.122 overcloud.storage.ooo.test#012172.20.0.172 overcloud.storagemgmt.ooo.test#012172.17.0.121 overcloud.internalapi.ooo.test#012172.21.0.213 overcloud.ooo.test#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Dec 6 03:00:06 localhost python3[26520]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.jfyt6o6itmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:00:06 localhost python3[26537]: ansible-file Invoked with path=/tmp/ansible.jfyt6o6itmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:00:07 localhost python3[26553]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:00:08 localhost python3[26570]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 6 03:00:41 localhost python3[26590]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None 
chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:00:41 localhost python3[26607]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 6 03:00:59 localhost systemd[1]: Reloading. Dec 6 03:00:59 localhost systemd-sysv-generator[26805]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:00:59 localhost systemd-rc-local-generator[26801]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:00:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:00:59 localhost systemd[1]: Listening on Device-mapper event daemon FIFOs. Dec 6 03:01:07 localhost systemd[1]: Reloading. Dec 6 03:01:07 localhost systemd-rc-local-generator[26881]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:01:07 localhost systemd-sysv-generator[26888]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:01:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:01:07 localhost systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling... Dec 6 03:01:07 localhost systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling. Dec 6 03:01:07 localhost systemd[1]: Reloading. Dec 6 03:01:07 localhost systemd-sysv-generator[26929]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:01:07 localhost systemd-rc-local-generator[26925]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:01:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:01:07 localhost systemd[1]: Listening on LVM2 poll daemon socket. Dec 6 03:01:56 localhost kernel: SELinux: Converting 2713 SID table entries... 
Dec 6 03:01:56 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 6 03:01:56 localhost kernel: SELinux: policy capability open_perms=1
Dec 6 03:01:56 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 6 03:01:56 localhost kernel: SELinux: policy capability always_check_network=0
Dec 6 03:01:56 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 6 03:01:56 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 6 03:01:56 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 6 03:01:57 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=7 res=1
Dec 6 03:01:57 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 6 03:01:57 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 6 03:01:57 localhost systemd[1]: Reloading.
Dec 6 03:01:57 localhost systemd-rc-local-generator[27512]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:01:57 localhost systemd-sysv-generator[27518]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:01:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:01:57 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 6 03:01:57 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 6 03:01:58 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 6 03:01:58 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 6 03:01:58 localhost systemd[1]: run-r91938b6e61654d79a8b3718ff359b300.service: Deactivated successfully.
Dec 6 03:01:58 localhost systemd[1]: run-r7cd8eaab1d814384919eb05d90e7478f.service: Deactivated successfully.
Dec 6 03:02:05 localhost python3[28380]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:02:06 localhost python3[28519]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 03:02:06 localhost systemd[1]: Reloading.
Dec 6 03:02:06 localhost systemd-rc-local-generator[28547]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:02:06 localhost systemd-sysv-generator[28550]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:02:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:02:08 localhost python3[28573]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:02:08 localhost systemd[26313]: Starting Mark boot as successful...
Dec 6 03:02:08 localhost systemd[26313]: Finished Mark boot as successful.
Dec 6 03:02:08 localhost python3[28590]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:02:10 localhost python3[28607]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 6 03:02:11 localhost python3[28625]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:02:11 localhost python3[28643]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:02:12 localhost python3[28661]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 03:02:12 localhost systemd[1]: Reloading Network Manager...
Dec 6 03:02:12 localhost NetworkManager[5965]: [1765008132.0611] audit: op="reload" arg="0" pid=28664 uid=0 result="success" Dec 6 03:02:12 localhost NetworkManager[5965]: [1765008132.0619] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf)) Dec 6 03:02:12 localhost NetworkManager[5965]: [1765008132.0619] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged Dec 6 03:02:12 localhost systemd[1]: Reloaded Network Manager. Dec 6 03:02:12 localhost python3[28680]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:02:13 localhost python3[28697]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:02:13 localhost python3[28715]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:02:13 localhost python3[28731]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:14 localhost python3[28747]: ansible-tempfile Invoked with state=file prefix=ansible. 
suffix= path=None Dec 6 03:02:15 localhost python3[28763]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:02:15 localhost python3[28779]: ansible-blockinfile Invoked with path=/tmp/ansible.b641koci block=[192.168.122.106]*,[np0005548798.ctlplane.ooo.test]*,[172.17.0.106]*,[np0005548798.internalapi.ooo.test]*,[172.18.0.106]*,[np0005548798.storage.ooo.test]*,[172.19.0.106]*,[np0005548798.tenant.ooo.test]*,[np0005548798.ooo.test]*,[np0005548798]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUJonDoETSicDbuGTq9YYOlM9+JMNMlppTltMCPPFSDxsaxfHMWFXbVc3BguCERMxx4AM5SbDvo6UsBdnLmfZN7cTtEzeMReJ2/+qBvX57cAqBKD4L/03GSJnTS10UNl3oJJAXN8T2uYrA6PZiUIBctay1auslD+tEj8tx8v4GMF0ZGtRug2hdcFvvV3n3uLoejSA0/wcBeDkDL3ZMfkdKMtC6jZOYuv9O+4tgWcNqb6+NnKvrL4qmDjTff8s+PhsoMwLqAhLIMwZ0/Xe6k5VMIAormSUzoZ+oiV6+Z9G56Ju9sDty+beWeTwRlxg7ZVCfN5OQTd9vK55AfNHGE336g0cvzjVgrPy3JICFQZcuHJxzJMdfVFzaP5nT1aAia0JqFs+rwoXlEH+P7d/n+LotsslEhTUG6gDudBH7qTDhlwU1jKVPrmVYwb+qJ0VipSspEAfOQKO6fGheajO8C+I8lQtvXxUtY308i6Yvwhu+p2S8q6qjeIUaKyDi6JdnK5s=#012[192.168.122.107]*,[np0005548799.ctlplane.ooo.test]*,[172.17.0.107]*,[np0005548799.internalapi.ooo.test]*,[172.18.0.107]*,[np0005548799.storage.ooo.test]*,[172.19.0.107]*,[np0005548799.tenant.ooo.test]*,[np0005548799.ooo.test]*,[np0005548799]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCwYtRu2haMMHw4cIuwtRJ6v37XYMvc+5E3altSnM1h1Q+53FIKrKDa9BIODMtjhbtMyN+HN3eWqL60ChMP51Da9JhBJeM9sVVIPXGh97hLpLgZso6OOY5J8o2EHhuzRlJH446hE5M91YUyeDxYmDK7MKIB0stq2wQmtk/ZvwKOtPB+bY465wsL/PWdMdF5Gi6mfVI7/mCQg6Z2eSHSv69B2m4bVTj+BfnWPkLCLLKqRk+gWcOx8JWguNpe6suMytF5nxsUDwNvIhf12owZtdf/Sz3NwCtlMAu1am9ovlHa3kVfcI23+BG0yIKUSCGRrcAZiiuWp2+9lZMy4alxCfIlMzmSlXGvU6l+qXXiw529U13b2jVnawPFfL7ckykHtomrp5aHN5oG57HvXNvG9OT51YgpNoFPymmV25vdPvbiK4M8GID0wN098W4I1wu3gjdYyM0DkpADEsHBwkGSGT2opv7a2WgIbbrFfSrYe1Eld2FNBDuVo458/BosW/JGhCM=#012[192.168.122.108]*,[np0005548801.ctlplane.ooo.test]*,[172.17.0.108]*,[np0005548801.internalapi.ooo.test]*,[172.18.0.108]*,[np0005548801.storage.ooo.test]*,[172.19.0.108]*,[np0005548801.tenant.ooo.test]*,[np0005548801.ooo.test]*,[np0005548801]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDd+aRSgSo36R7PqjzbUFG6PyAV9yxnChvJCvJybGQ0fSqYn6bKlFCiIHXzHKD8SdMAkgHq/v0qRoKF+rPm6k1lIEhMqRCIbIKr7X1ExClL3LRXDU6nZNh1OPKSeyyXdQ4+dbtPwOE9fRbFMmlhp22/WNvpqnFBAHy7ytb5ie8BEEeVsrh0JHPQfx/59QNWSiJTOIi9yb9XyVG8f8C6AfXKSfGVbltlGVWboeIefKKq1fUdRTuQ8CVeyF76G9zniI+2HG6xoark7XcV5VjS3PyloP9UXYrrU57aBpOoM4AmFgMuEnk1x+B1BLODxV6ZAh4/NpO3XTjqnLFCYZxljuPGAB3TPO3/E93qFOYGadTSDlHpaP/eYtknlUJuDK4iGwRTz36NcmRLROhSUSrGb/QI61dDtTwHHk8RyLKqgMZhWVN7CXYYb38/HnMHnWgMiHXSc/xaVuYgAhrBn5cO5losUc3ZZhmgvF1hN4idTmgQ57+wm2VQzLKjyrQmOAOZ+iU=#012[192.168.122.103]*,[np0005548795.ctlplane.ooo.test]*,[172.21.0.103]*,[np0005548795.external.ooo.test]*,[172.17.0.103]*,[np0005548795.internalapi.ooo.test]*,[172.18.0.103]*,[np0005548795.storage.ooo.test]*,[172.20.0.103]*,[np0005548795.storagemgmt.ooo.test]*,[172.19.0.103]*,[np0005548795.tenant.ooo.test]*,[np0005548795.ooo.test]*,[np0005548795]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQD3KU9mb6p28DebxEf8mP7XCcYgi43vzXLS1Xubej5l9h+8Rxn/ktXBwRzx/PBl3Jx6i4DJdKsaVn95cgc5pCWvoFdb4KYzCYevvoKyb09GDHVu2Lg7emEiU2vGo5+l6iIq21eQCqydMat04GLM4HxW9iQNb6Wsyosx4ETls3AJT5Xyor+Mb2RO98BphGiFrYdKSkDNqr5O4WxFJZKCLOb7jGtCyU7+ufIWlf6Ek+voyFieFqaT+IQatNZR2Ca6amVR2z+HsXgI3jihiv8hN2JloMneS/xVFbQhPDg0Gz463t4cyOJ1STI/QN6swtr75kFwQxQLatvjFGzMKoRWbfwM3qSzjWfDsHcftGqzMWAoZykr/2DtUm4P3B5tE8tSvNduQeq/QmxlPSaPlYTfFDN08wUl0NL8wp7NfcpndUpbGNOR/U6r+K5Df3OrKuI5rfaBBEd1YpLS4W/ichiMBUBbuR6YEgsMN/CwIDPlMEl2/VJJEK5CRK4vrQEGx7ac1S0=#012[192.168.122.104]*,[np0005548796.ctlplane.ooo.test]*,[172.21.0.104]*,[np0005548796.external.ooo.test]*,[172.17.0.104]*,[np0005548796.internalapi.ooo.test]*,[172.18.0.104]*,[np0005548796.storage.ooo.test]*,[172.20.0.104]*,[np0005548796.storagemgmt.ooo.test]*,[172.19.0.104]*,[np0005548796.tenant.ooo.test]*,[np0005548796.ooo.test]*,[np0005548796]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDP/HZnhHGeahVyDtfOIsLMqJDkZJEo1/EuutMSa4wznu9eEe/zWgWZLNcawK29WJO7GczQI0pDvsi/VrFfOXwuhnY3nwqoWaOnABNpYRc2NsAZ234FpJWCUOI05Jds6Z4v3086SYzyhXQIfayAiVujKg9WBsUNql9QfBX7XUtdG8LUz0u2z2cHpyK1Md4YA69eTdiUb8zqGDM0vpah67KtFr6AT0ZLGCkED32lXMIlLCFdYQs0Rx+76zigjilN0qiJTBu+7uiDEpX0Oux9lXXKbIBA4NHxT2tj9NPSemTYb0yftsEnJvpHL6k9T/Ss7b3pM8khdKuXz/mUGTy8jhBrmNBMmoFYQjPdIAMr8sxNRXFklju3b1s9OXjmPkPII4kTj0vVbGnEsbfuu6K0K34ytAgwK3w9PH1ByrUyMMmRiFK1NtmJoh33TMtNZ5FJfe2bXq8VyRgA91P1DwrO42ycJFYXD1+YLEtNebeCWtxdVNqQ8FdUvccAAUKU848DbmM=#012[192.168.122.105]*,[np0005548797.ctlplane.ooo.test]*,[172.21.0.105]*,[np0005548797.external.ooo.test]*,[172.17.0.105]*,[np0005548797.internalapi.ooo.test]*,[172.18.0.105]*,[np0005548797.storage.ooo.test]*,[172.20.0.105]*,[np0005548797.storagemgmt.ooo.test]*,[172.19.0.105]*,[np0005548797.tenant.ooo.test]*,[np0005548797.ooo.test]*,[np0005548797]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDBZTxICKoVZWZG3oNtrLw3Aw+m5nWTJUqSEpXuFquaYOpqPDtsfIQY7Z9Apj4A6n8BJn0joYg2zQhNqs/+O7MNR1mf/ywXJzvZ0BoDWFISNWvWOom/sJKWs6UyxSo9rlAeWH4bAy12WAgF+xmvBaRj74ZTk6AmIw8loYWgRUC8K7r5uVWZ+FWPMqGAeKTjGhFUuWhU4zwB4pLwferi73BmQ32IDnSEOcwMUWbkXoN99JVByb0GXPZlk+wRMc1CrTMzS2rWsDFNqmKAhsL0eKVqNz7sXRA5djfpsSob4SqC96gQpX5lIhfc3CYcFc6HA7SLrGaky/wmmP949K02dqviQeUOqpM4pllYBCJKLZky/vWiGaUqg6aBZ+lSfWxBXz+5HeymsvnJs+UUaYYNF7WoLTAzxKoegITIKgYmip37nNxWApeDVYOQEdGRIlF4Ge7q4ZteT1rk2lWeqUpNMXpeKijqhmAefCfsf4Hpc3t6dPKFvSuHrKv/MzYO1+Zn4Ic=#012 create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:16 localhost python3[28795]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.b641koci' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:02:16 localhost python3[28813]: ansible-file Invoked with path=/tmp/ansible.b641koci state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:17 localhost python3[28829]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 03:02:17 localhost 
python3[28845]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:02:18 localhost python3[28863]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:02:18 localhost python3[28882]: ansible-community.general.cloud_init_data_facts Invoked with filter=status Dec 6 03:02:21 localhost python3[29019]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:02:21 localhost python3[29036]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 6 03:02:27 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload. Dec 6 03:02:27 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload. Dec 6 03:02:27 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Dec 6 03:02:27 localhost systemd[1]: Starting man-db-cache-update.service... 
Dec 6 03:02:27 localhost systemd[1]: Reloading.
Dec 6 03:02:28 localhost systemd-sysv-generator[29109]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:02:28 localhost systemd-rc-local-generator[29106]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:02:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:02:28 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 6 03:02:28 localhost systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 6 03:02:28 localhost systemd[1]: tuned.service: Deactivated successfully.
Dec 6 03:02:28 localhost systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 6 03:02:28 localhost systemd[1]: tuned.service: Consumed 1.857s CPU time.
Dec 6 03:02:28 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 6 03:02:28 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 6 03:02:28 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 6 03:02:28 localhost systemd[1]: run-r1ad43d6f23ee46f2b0116432a86a2c5f.service: Deactivated successfully.
Dec 6 03:02:29 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Dec 6 03:02:29 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 6 03:02:29 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 6 03:02:29 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 6 03:02:29 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 6 03:02:29 localhost systemd[1]: run-rb7115f6267e24f148fa0b1e4794e6dd5.service: Deactivated successfully.
Dec 6 03:02:30 localhost python3[29482]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:02:31 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Dec 6 03:02:31 localhost systemd[1]: tuned.service: Deactivated successfully. Dec 6 03:02:31 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Dec 6 03:02:31 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Dec 6 03:02:33 localhost systemd[1]: Started Dynamic System Tuning Daemon. Dec 6 03:02:33 localhost python3[29678]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:02:34 localhost python3[29695]: ansible-slurp Invoked with src=/etc/tuned/active_profile Dec 6 03:02:34 localhost python3[29711]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:02:34 localhost python3[29727]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:02:36 localhost python3[29747]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:02:37 localhost python3[29764]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:02:38 localhost python3[29780]: ansible-replace Invoked with 
regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:44 localhost python3[29796]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:44 localhost python3[29844]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:45 localhost python3[29889]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008164.6036155-88030-265260668942806/source _original_basename=tmpvib2z9w2 follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:45 localhost python3[29919]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:46 localhost python3[29967]: ansible-ansible.legacy.stat Invoked with 
path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:46 localhost python3[30010]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008166.069037-88192-149257016444293/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=ec7581dd2ae19d4cc037afd566ffa45293194640 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:47 localhost python3[30072]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:47 localhost python3[30115]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008167.077182-88328-146324382685824/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=1d434314b963a3c4658e7e68f8a31feba59fe93e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:48 localhost python3[30177]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:48 localhost python3[30220]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008168.0557668-88328-154955436084126/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 
checksum=18be652dd73ffe55dcc07c59d1dcfa9590fa06de backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:49 localhost python3[30282]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:49 localhost python3[30325]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008169.0270202-88328-14219463707105/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=391244ef26331d1d45bf2341b8b863c959170426 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:50 localhost python3[30387]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:50 localhost python3[30430]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008170.0882516-88328-142047876930036/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=64f5f309f5137b9e0913cbf22857157ecfa0f1f1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:51 localhost python3[30492]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False 
get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:51 localhost python3[30535]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008171.0547526-88328-172373709404559/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=fd427a9b493bfb7192f8e0fa219339ecfc6b8378 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:52 localhost python3[30597]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:52 localhost python3[30640]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008172.0374057-88328-108043066676693/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=a7eaf9162b693cd112ee5892a2ae9b334859fc85 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:53 localhost python3[30702]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:53 localhost python3[30745]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008172.9724922-88328-152364021887050/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=8811230569f933cf7983f79c3dae88a45151945c 
backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:54 localhost python3[30807]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:54 localhost python3[30850]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008173.9428856-88328-102015278893223/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:55 localhost python3[30912]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:55 localhost python3[30955]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008174.8903236-88328-149124653761263/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=917f40b103323a95b937170e6d6c53e5ae5aafec backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:56 localhost python3[31017]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True 
checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:56 localhost python3[31060]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008175.84142-88328-56121414464264/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=e66ad10e709f5b01a7e3bb010dc127c2658ac9aa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:02:57 localhost python3[31090]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:02:57 localhost python3[31138]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:02:58 localhost python3[31181]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008177.6195502-89999-273444973931782/source _original_basename=tmp2zr09a6q follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:03:03 localhost python3[31211]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 6 03:03:04 localhost python3[31272]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.129.56.1 _uses_shell=False 
stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:03:08 localhost python3[31289]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:03:13 localhost python3[31306]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:03:14 localhost python3[31329]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:03:14 localhost python3[31352]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:03:15 localhost python3[31375]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:04:07 localhost python3[31398]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:07 localhost python3[31446]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:07 localhost python3[31464]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmpfoz5wt3f recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:08 localhost python3[31494]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:09 localhost python3[31542]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:09 localhost python3[31560]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:10 localhost python3[31622]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:10 localhost python3[31640]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:11 localhost python3[31702]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:11 localhost python3[31720]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:12 localhost python3[31782]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:12 localhost python3[31800]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:12 localhost python3[31862]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:13 localhost python3[31880]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:13 localhost python3[31942]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:14 localhost python3[31960]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:14 localhost python3[32022]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:14 localhost python3[32040]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:15 localhost python3[32102]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:15 localhost python3[32120]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:16 localhost python3[32182]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:16 localhost python3[32200]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:17 localhost python3[32262]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:17 localhost python3[32280]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:17 localhost python3[32342]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:18 localhost python3[32360]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:18 localhost python3[32390]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 6 03:04:19 localhost python3[32438]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:19 localhost python3[32456]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpr7nb8ic7 recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:22 localhost python3[32486]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 03:04:26 localhost python3[32503]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 03:04:26 localhost python3[32521]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 03:04:27 localhost python3[32539]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 03:04:27 localhost systemd[1]: Reloading.
Dec 6 03:04:27 localhost systemd-sysv-generator[32567]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:04:27 localhost systemd-rc-local-generator[32563]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:04:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:04:27 localhost systemd[1]: Starting Netfilter Tables...
Dec 6 03:04:27 localhost systemd[1]: Finished Netfilter Tables.
Dec 6 03:04:28 localhost python3[32629]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:28 localhost python3[32672]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008267.8479006-94136-268201612359221/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:29 localhost python3[32702]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:04:29 localhost python3[32720]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:04:30 localhost python3[32769]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:30 localhost python3[32812]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008269.8427002-94327-155453960871429/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:31 localhost python3[32874]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:31 localhost python3[32917]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008270.9124277-94446-163117730839660/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:32 localhost python3[32979]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:32 localhost python3[33022]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008272.2037284-94608-138744928217013/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:33 localhost python3[33084]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:33 localhost python3[33127]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008273.2090604-94792-278953880284911/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:34 localhost python3[33189]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:35 localhost python3[33232]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008274.202735-94924-144370127292859/source mode=None follow=False _original_basename=ruleset.j2 checksum=3777114572ae49edb19bdbce5ae072966e4d15fd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:35 localhost python3[33262]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:04:36 localhost python3[33327]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/tripleo-chains.nft"#012include "/etc/nftables/tripleo-rules.nft"#012include "/etc/nftables/tripleo-jumps.nft"#012 state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:04:36 localhost python3[33344]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:04:37 localhost python3[33361]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:04:37 localhost python3[33380]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:04:37 localhost python3[33396]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:04:38 localhost python3[33412]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:04:38 localhost python3[33428]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 6 03:04:39 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=8 res=1
Dec 6 03:04:39 localhost python3[33448]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 6 03:04:40 localhost kernel: SELinux: Converting 2717 SID table entries...
Dec 6 03:04:40 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 6 03:04:40 localhost kernel: SELinux: policy capability open_perms=1
Dec 6 03:04:40 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 6 03:04:40 localhost kernel: SELinux: policy capability always_check_network=0
Dec 6 03:04:40 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 6 03:04:40 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 6 03:04:40 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 6 03:04:40 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=9 res=1
Dec 6 03:04:41 localhost python3[33469]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 6 03:04:41 localhost kernel: SELinux: Converting 2717 SID table entries...
Dec 6 03:04:41 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 6 03:04:41 localhost kernel: SELinux: policy capability open_perms=1
Dec 6 03:04:41 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 6 03:04:41 localhost kernel: SELinux: policy capability always_check_network=0
Dec 6 03:04:41 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 6 03:04:41 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 6 03:04:41 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 6 03:04:42 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=10 res=1
Dec 6 03:04:42 localhost python3[33490]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 6 03:04:43 localhost kernel: SELinux: Converting 2717 SID table entries...
Dec 6 03:04:43 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 6 03:04:43 localhost kernel: SELinux: policy capability open_perms=1
Dec 6 03:04:43 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 6 03:04:43 localhost kernel: SELinux: policy capability always_check_network=0
Dec 6 03:04:43 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 6 03:04:43 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 6 03:04:43 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 6 03:04:43 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=11 res=1
Dec 6 03:04:43 localhost python3[33511]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:04:43 localhost python3[33527]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:04:44 localhost python3[33543]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:04:44 localhost python3[33559]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 6 03:04:44 localhost python3[33575]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:04:45 localhost python3[33592]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 03:04:49 localhost python3[33609]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:04:49 localhost python3[33657]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:04:50 localhost python3[33700]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008289.387894-95911-273684532560036/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:04:50 localhost python3[33730]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 03:04:51 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 6 03:04:51 localhost systemd[1]: Stopped Load Kernel Modules.
Dec 6 03:04:51 localhost systemd[1]: Stopping Load Kernel Modules...
Dec 6 03:04:51 localhost systemd[1]: Starting Load Kernel Modules...
Dec 6 03:04:51 localhost kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 6 03:04:51 localhost kernel: Bridge firewalling registered
Dec 6 03:04:51 localhost systemd-modules-load[33733]: Inserted module 'br_netfilter'
Dec 6 03:04:51 localhost systemd-modules-load[33733]: Module 'msr' is built in
Dec 6 03:04:51 localhost systemd[1]: Finished Load Kernel Modules.
Dec 6 03:04:52 localhost python3[33784]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:04:52 localhost python3[33827]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008291.9561117-96097-105826468466137/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 6 03:04:53 localhost python3[33857]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:53 localhost python3[33875]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:53 localhost python3[33893]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:54 localhost python3[33911]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:54 localhost python3[33928]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:54 localhost python3[33945]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True 
state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:55 localhost python3[33962]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:55 localhost python3[33980]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:55 localhost python3[33998]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:56 localhost python3[34016]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:56 localhost python3[34034]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:56 localhost python3[34052]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:56 localhost python3[34070]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:57 localhost python3[34088]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:57 localhost python3[34105]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present 
sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:57 localhost python3[34122]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:58 localhost python3[34139]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:04:59 localhost python3[34156]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Dec 6 03:05:00 localhost python3[34174]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 6 03:05:00 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 6 03:05:00 localhost systemd[1]: Stopped Apply Kernel Variables. Dec 6 03:05:00 localhost systemd[1]: Stopping Apply Kernel Variables... Dec 6 03:05:00 localhost systemd[26313]: Created slice User Background Tasks Slice. Dec 6 03:05:00 localhost systemd[1]: Starting Apply Kernel Variables... Dec 6 03:05:00 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 6 03:05:00 localhost systemd[26313]: Starting Cleanup of User's Temporary Files and Directories... Dec 6 03:05:00 localhost systemd[26313]: Finished Cleanup of User's Temporary Files and Directories. Dec 6 03:05:00 localhost systemd[1]: Finished Apply Kernel Variables. 
Dec 6 03:05:00 localhost python3[34195]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:00 localhost python3[34211]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:01 localhost python3[34227]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:01 localhost python3[34243]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 6 03:05:01 localhost python3[34259]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:02 localhost python3[34275]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:02 localhost python3[34291]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:02 localhost python3[34307]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:03 localhost python3[34323]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:03 localhost python3[34371]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:05:04 localhost python3[34414]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008303.3882163-97059-264036163925847/source _original_basename=tmpe2j_etr8 follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:04 localhost python3[34444]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:05 localhost python3[34461]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:06 localhost python3[34477]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:06 localhost python3[34493]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:06 localhost python3[34509]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:07 localhost python3[34525]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:07 localhost python3[34541]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:07 localhost python3[34557]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:08 localhost python3[34573]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:08 localhost python3[34589]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:08 localhost python3[34605]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:09 localhost python3[34621]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Dec 6 03:05:09 localhost python3[34643]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548798.ooo.test update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 6 03:05:10 localhost python3[34667]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Dec 6 03:05:10 localhost python3[34683]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:10 localhost python3[34732]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:05:11 localhost python3[34775]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008310.592541-97604-255889133413901/source _original_basename=tmpeko1j1ia follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:11 localhost python3[34805]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 6 03:05:12 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=12 res=1
Dec 6 03:05:12 localhost python3[34825]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:13 localhost python3[34841]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:13 localhost python3[34857]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Dec 6 03:05:15 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=13 res=1
Dec 6 03:05:15 localhost python3[34877]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 03:05:18 localhost python3[34894]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 6 03:05:19 localhost python3[34955]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:19 localhost python3[34971]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:19 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:05:20 localhost python3[35030]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:05:20 localhost python3[35073]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008319.7305286-98209-226487711578972/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=e475f6434e1c413b9337c0e9fa995ef8099cd779 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:21 localhost python3[35135]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:05:21 localhost python3[35180]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008320.8005638-98299-53571288671836/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:22 localhost python3[35210]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:22 localhost python3[35226]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:22 localhost python3[35242]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:22 localhost python3[35258]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:23 localhost python3[35306]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:05:24 localhost python3[35349]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008323.248766-98503-162028245333504/source _original_basename=tmp294vo1yr follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:24 localhost python3[35379]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:24 localhost python3[35395]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:25 localhost python3[35411]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 03:05:28 localhost python3[35460]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:05:29 localhost python3[35505]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008328.3399174-99110-269446966788195/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=92fbff64c26224c12f8a576f9ce1a758767bd467 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:29 localhost python3[35536]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 03:05:29 localhost systemd[1]: Stopping OpenSSH server daemon...
Dec 6 03:05:29 localhost systemd[1]: sshd.service: Deactivated successfully.
Dec 6 03:05:29 localhost systemd[1]: Stopped OpenSSH server daemon.
Dec 6 03:05:29 localhost systemd[1]: Stopped target sshd-keygen.target.
Dec 6 03:05:29 localhost systemd[1]: Stopping sshd-keygen.target...
Dec 6 03:05:29 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 03:05:29 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 03:05:29 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 03:05:29 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 6 03:05:29 localhost systemd[1]: Starting OpenSSH server daemon...
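The sshd_config deployment at 03:05:29 passes validate=/usr/sbin/sshd -T -f %s to the copy module, so a config that sshd itself cannot parse never reaches /etc/ssh/sshd_config. A minimal sketch of that write-validate-replace pattern (copy_with_validate is a hypothetical helper, not Ansible's actual implementation):

```python
import os
import subprocess
import tempfile

def copy_with_validate(content, dest, validate_argv):
    """Write content to a temp file, run the validator with %s replaced
    by the temp path, and move the file into place only on success."""
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(dest) or ".")
    try:
        with os.fdopen(fd, "w") as tmp_file:
            tmp_file.write(content)
        # E.g. validate=/usr/sbin/sshd -T -f %s, as in the log above.
        argv = [arg.replace("%s", tmp_path) for arg in validate_argv]
        subprocess.run(argv, check=True)
        os.replace(tmp_path, dest)  # atomic on the same filesystem
    except Exception:
        os.unlink(tmp_path)
        raise
```

A validator exit code other than 0 raises CalledProcessError and the destination file is left untouched, which is why the subsequent sshd restart is safe to run unconditionally.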
Dec 6 03:05:29 localhost sshd[35540]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 03:05:29 localhost systemd[1]: Started OpenSSH server daemon.
Dec 6 03:05:30 localhost python3[35556]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:31 localhost python3[35574]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:31 localhost python3[35592]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 03:05:35 localhost python3[35641]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:05:35 localhost python3[35686]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008334.705169-99575-76064595117292/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:36 localhost python3[35716]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 03:05:36 localhost python3[35734]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 03:05:36 localhost chronyd[766]: chronyd exiting
Dec 6 03:05:36 localhost systemd[1]: Stopping NTP client/server...
Dec 6 03:05:36 localhost systemd[1]: chronyd.service: Deactivated successfully.
Dec 6 03:05:36 localhost systemd[1]: Stopped NTP client/server.
Dec 6 03:05:36 localhost systemd[1]: chronyd.service: Consumed 122ms CPU time, read 1.9M from disk, written 4.0K to disk.
Dec 6 03:05:36 localhost systemd[1]: Starting NTP client/server...
Dec 6 03:05:36 localhost chronyd[35741]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 6 03:05:36 localhost chronyd[35741]: Frequency -30.649 +/- 0.057 ppm read from /var/lib/chrony/drift
Dec 6 03:05:36 localhost chronyd[35741]: Loaded seccomp filter (level 2)
Dec 6 03:05:36 localhost systemd[1]: Started NTP client/server.
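The restarted chronyd reads the /etc/chrony.conf deployed just before it; the log lines "Frequency ... read from /var/lib/chrony/drift" and the later "Selected source 216.128.178.20 (pool.ntp.org)" are consistent with a minimal configuration along these lines (a sketch only; the rendered chrony.conf.j2 contents are not shown in the log):

```
pool pool.ntp.org iburst
driftfile /var/lib/chrony/drift
makestep 1.0 3
```

The drift file is what lets chronyd start with the -30.649 ppm frequency correction already applied instead of re-measuring it from scratch.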
Dec 6 03:05:37 localhost python3[35790]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:05:37 localhost python3[35833]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008337.2147477-99859-27905632435344/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:38 localhost python3[35863]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 03:05:38 localhost systemd[1]: Reloading.
Dec 6 03:05:38 localhost systemd-sysv-generator[35889]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:05:38 localhost systemd-rc-local-generator[35884]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:05:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:05:38 localhost systemd[1]: Reloading.
Dec 6 03:05:38 localhost systemd-rc-local-generator[35928]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:05:38 localhost systemd-sysv-generator[35932]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:05:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:05:39 localhost systemd[1]: Starting chronyd online sources service...
Dec 6 03:05:39 localhost chronyc[35939]: 200 OK
Dec 6 03:05:39 localhost systemd[1]: chrony-online.service: Deactivated successfully.
Dec 6 03:05:39 localhost systemd[1]: Finished chronyd online sources service.
Dec 6 03:05:39 localhost python3[35955]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:39 localhost chronyd[35741]: System clock was stepped by 0.000000 seconds
Dec 6 03:05:39 localhost python3[35972]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:41 localhost chronyd[35741]: Selected source 216.128.178.20 (pool.ntp.org)
Dec 6 03:05:50 localhost python3[35989]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:50 localhost chronyd[35741]: System clock was stepped by 0.000000 seconds
Dec 6 03:05:50 localhost python3[36006]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:51 localhost python3[36023]: ansible-timezone Invoked with name=UTC hwclock=None
Dec 6 03:05:51 localhost systemd[1]: Starting Time & Date Service...
Dec 6 03:05:51 localhost systemd[1]: Started Time & Date Service.
Dec 6 03:05:52 localhost python3[36043]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:52 localhost python3[36060]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:53 localhost python3[36077]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Dec 6 03:05:53 localhost python3[36093]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 6 03:05:54 localhost python3[36109]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:54 localhost python3[36125]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:55 localhost python3[36173]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:05:55 localhost python3[36216]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008354.702819-101004-232029693030996/source _original_basename=tmpwu_inct4 follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:56 localhost python3[36278]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:05:56 localhost python3[36321]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008355.7103965-101078-7839395985493/source _original_basename=tmppdmdk9qt follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:05:57 localhost python3[36351]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 6 03:05:57 localhost systemd[1]: Reloading.
Dec 6 03:05:57 localhost systemd-rc-local-generator[36377]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:05:57 localhost systemd-sysv-generator[36380]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:05:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:05:57 localhost python3[36405]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 03:05:57 localhost python3[36421]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:58 localhost python3[36438]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:05:58 localhost systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Dec 6 03:05:58 localhost python3[36455]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 03:05:59 localhost python3[36471]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:05:59 localhost python3[36519]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:05:59 localhost python3[36562]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008359.2084856-101615-201909764546823/source _original_basename=tmpb1thompe follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:06:21 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. 
Dec 6 03:06:28 localhost python3[36595]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Dec 6 03:06:28 localhost python3[36611]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None Dec 6 03:06:28 localhost python3[36627]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 6 03:06:29 localhost python3[36643]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:06:29 localhost python3[36659]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:06:29 localhost python3[36675]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Dec 6 03:06:30 localhost kernel: SELinux: Converting 2722 SID table entries... Dec 6 03:06:30 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 6 03:06:30 localhost kernel: SELinux: policy capability open_perms=1 Dec 6 03:06:30 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 6 03:06:30 localhost kernel: SELinux: policy capability always_check_network=0 Dec 6 03:06:30 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 6 03:06:30 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 6 03:06:30 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 6 03:06:31 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=14 res=1 Dec 6 03:06:31 localhost python3[36713]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 03:06:33 localhost python3[36850]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 
'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z Dec 6 03:06:33 localhost rsyslogd[759]: message too long (32242) with configured size 8096, begin of message is: 
ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ] Dec 6 03:06:33 localhost python3[36866]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 6 03:06:34 localhost python3[36882]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Dec 6 03:06:34 localhost python3[36898]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovn_controller.key -c /etc/pki/tls/certs/ovn_controller.crt -C /etc/ipa/ca.crt', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': 
'/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': Tru Dec 6 03:06:34 localhost rsyslogd[759]: message too long (8101) with configured size 8096, begin of message is: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/conf [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ] Dec 6 03:06:40 localhost python3[36914]: ansible-stat Invoked with path=/etc/ipa/default.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:06:42 localhost python3[36930]: ansible-ipaclient_test Invoked with domain=ooo.test servers=['ipa.ooo.test'] hostname=np0005548798.ooo.test no_ntp=False force_ntpd=False no_nisdomain=False kinit_attempts=5 configure_firefox=False all_ip_addresses=False on_master=False enable_dns_updates=False realm=None ntp_servers=None ntp_pool=None nisdomain=None 
ca_cert_files=None firefox_dir=None ip_addresses=None Dec 6 03:06:43 localhost python3[36952]: ansible-ansible.builtin.file Invoked with path=/etc/ipa/.dns_ccache state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:06:43 localhost python3[36968]: ansible-ipaclient_setup_ntp Invoked with ntp_servers=[''] ntp_pool= no_ntp=False on_master=False servers=['ipa.ooo.test'] domain=ooo.test Dec 6 03:06:43 localhost systemd[1]: Reloading. Dec 6 03:06:44 localhost systemd-rc-local-generator[36998]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:06:44 localhost systemd-sysv-generator[37002]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:06:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:06:44 localhost systemd[1]: Stopping NTP client/server... Dec 6 03:06:44 localhost chronyd[35741]: chronyd exiting Dec 6 03:06:44 localhost systemd[1]: chronyd.service: Deactivated successfully. Dec 6 03:06:44 localhost systemd[1]: Stopped NTP client/server. Dec 6 03:06:44 localhost systemd[1]: Starting NTP client/server... 
Dec 6 03:06:44 localhost chronyd[37018]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Dec 6 03:06:44 localhost chronyd[37018]: Frequency -30.649 +/- 0.057 ppm read from /var/lib/chrony/drift Dec 6 03:06:44 localhost chronyd[37018]: Loaded seccomp filter (level 2) Dec 6 03:06:44 localhost systemd[1]: Started NTP client/server. Dec 6 03:06:48 localhost chronyd[37018]: Selected source 23.133.168.246 (pool.ntp.org) Dec 6 03:06:51 localhost python3[37037]: ansible-ipaclient_test_keytab Invoked with servers=['ipa.ooo.test'] domain=ooo.test realm=OOO.TEST hostname=np0005548798.ooo.test kdc=ipa.ooo.test kinit_attempts=5 Dec 6 03:06:52 localhost python3[37058]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/ipa-rmkeytab -k /etc/krb5.keytab -r "OOO.TEST"#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:06:53 localhost python3[37075]: ansible-ipaclient_set_hostname Invoked with hostname=np0005548798.ooo.test Dec 6 03:06:53 localhost systemd[1]: Starting Hostname Service... Dec 6 03:06:53 localhost systemd[1]: Started Hostname Service. 
Dec 6 03:06:53 localhost systemd-hostnamed[37078]: Hostname set to (static) Dec 6 03:06:54 localhost python3[37094]: ansible-ipaclient_temp_krb5 Invoked with servers=['ipa.ooo.test'] domain=ooo.test realm=OOO.TEST hostname=np0005548798.ooo.test kdc=ipa.ooo.test on_master=False Dec 6 03:06:55 localhost python3[37115]: ansible-ipaclient_join Invoked with servers=['ipa.ooo.test'] realm=OOO.TEST basedn=dc=ooo,dc=test hostname=np0005548798.ooo.test force_join=False password=NOT_LOGGING_PARAMETER kinit_attempts=5 krb_name=/tmp/tmphj5qf9pv principal=None keytab=None admin_keytab=None ca_cert_file=None debug=None Dec 6 03:06:55 localhost python3[37115]: ansible-ipaclient_join Enrolled in IPA realm OOO.TEST Dec 6 03:06:56 localhost python3[37136]: ansible-ipaclient_ipa_conf Invoked with servers=['ipa.ooo.test'] domain=ooo.test realm=OOO.TEST hostname=np0005548798.ooo.test basedn=dc=ooo,dc=test Dec 6 03:06:57 localhost python3[37153]: ansible-ipaclient_setup_sssd Invoked with servers=['ipa.ooo.test'] domain=ooo.test realm=OOO.TEST hostname=np0005548798.ooo.test on_master=False no_ssh=False no_sshd=False no_sudo=False all_ip_addresses=False fixed_primary=False permit=False enable_dns_updates=False preserve_sssd=False no_krb5_offline_passwords=False Dec 6 03:06:58 localhost python3[37170]: ansible-ipaclient_api Invoked with servers=['ipa.ooo.test'] realm=OOO.TEST hostname=np0005548798.ooo.test krb_name=/tmp/tmphj5qf9pv debug=False Dec 6 03:07:01 localhost python3[37199]: ansible-ipaclient_setup_nss Invoked with servers=['ipa.ooo.test'] domain=ooo.test realm=OOO.TEST basedn=dc=ooo,dc=test hostname=np0005548798.ooo.test subject_base=O=OOO.TEST principal=admin mkhomedir=False ca_enabled=True on_master=False dnsok=False enable_dns_updates=False all_ip_addresses=False request_cert=False preserve_sssd=False no_ssh=False no_sshd=False no_sudo=False fixed_primary=False permit=False no_krb5_offline_passwords=False no_dns_sshfp=False nosssd_files={} krb_name=/tmp/tmphj5qf9pv 
ip_addresses=None Dec 6 03:07:02 localhost systemd-journald[619]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 76.0 (253 of 333 items), suggesting rotation. Dec 6 03:07:02 localhost systemd-journald[619]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 6 03:07:02 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 03:07:02 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 03:07:04 localhost systemd[1]: Starting System Security Services Daemon... Dec 6 03:07:04 localhost sssd[37235]: Starting up Dec 6 03:07:04 localhost sssd_be[37236]: Starting up Dec 6 03:07:04 localhost sssd_pam[37238]: Starting up Dec 6 03:07:04 localhost sssd_sudo[37240]: Starting up Dec 6 03:07:04 localhost sssd_nss[37237]: Starting up Dec 6 03:07:04 localhost sssd_ssh[37239]: Starting up Dec 6 03:07:04 localhost sssd_pac[37241]: Starting up Dec 6 03:07:05 localhost systemd[1]: Started System Security Services Daemon. Dec 6 03:07:05 localhost systemd[1]: Reloading. Dec 6 03:07:05 localhost systemd-rc-local-generator[37269]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:07:05 localhost systemd-sysv-generator[37274]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:07:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 03:07:06 localhost python3[37297]: ansible-ipaclient_setup_ssh Invoked with servers=['ipa.ooo.test'] sssd=True no_ssh=False ssh_trust_dns=False no_sshd=False
Dec 6 03:07:06 localhost systemd[1]: Stopping OpenSSH server daemon...
Dec 6 03:07:06 localhost systemd[1]: sshd.service: Deactivated successfully.
Dec 6 03:07:06 localhost systemd[1]: Stopped OpenSSH server daemon.
Dec 6 03:07:06 localhost systemd[1]: Stopped target sshd-keygen.target.
Dec 6 03:07:06 localhost systemd[1]: Stopping sshd-keygen.target...
Dec 6 03:07:06 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 03:07:06 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 03:07:06 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 03:07:06 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 6 03:07:06 localhost systemd[1]: Starting OpenSSH server daemon...
Dec 6 03:07:06 localhost sshd[37302]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 03:07:06 localhost systemd[1]: Started OpenSSH server daemon.
Dec 6 03:07:07 localhost python3[37320]: ansible-ipaclient_setup_automount Invoked with servers=['ipa.ooo.test'] sssd=True automount_location=None
Dec 6 03:07:07 localhost python3[37337]: ansible-ipaclient_setup_nis Invoked with domain=ooo.test nisdomain=None
Dec 6 03:07:07 localhost systemd[1]: Reloading.
Dec 6 03:07:08 localhost systemd-sysv-generator[37368]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:07:08 localhost systemd-rc-local-generator[37364]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:07:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:07:08 localhost systemd[1]: nis-domainname.service: Deactivated successfully.
Dec 6 03:07:08 localhost systemd[1]: Stopped Read and set NIS domainname from /etc/sysconfig/network.
Dec 6 03:07:08 localhost systemd[1]: Stopping Read and set NIS domainname from /etc/sysconfig/network...
Dec 6 03:07:08 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 6 03:07:08 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 6 03:07:08 localhost python3[37397]: ansible-ansible.builtin.file Invoked with path=/tmp/tmphj5qf9pv state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:07:09 localhost python3[37413]: ansible-ipaclient_setup_krb5 Invoked with realm=OOO.TEST domain=ooo.test servers=['ipa.ooo.test'] kdc=ipa.ooo.test dnsok=False client_domain=ooo.test hostname=np0005548798.ooo.test sssd=True force=True
Dec 6 03:07:10 localhost python3[37434]: ansible-ipaclient_setup_certmonger Invoked with realm=OOO.TEST hostname=np0005548798.ooo.test subject_base=O=OOO.TEST ca_enabled=True request_cert=False
Dec 6 03:07:10 localhost python3[37453]: ansible-ansible.builtin.file Invoked with path=/etc/ipa/.dns_ccache state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:07:11 localhost python3[37469]: ansible-ansible.builtin.file Invoked with path=/tmp/tmphj5qf9pv state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:07:11 localhost python3[37485]: ansible-ansible.builtin.file Invoked with path=/tmp/tmphj5qf9pv.ipabkp state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:07:12 localhost python3[37501]: ansible-systemd Invoked with daemon_reload=True name=certmonger.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 03:07:12 localhost systemd[1]: Reloading.
Dec 6 03:07:12 localhost systemd-sysv-generator[37532]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:07:12 localhost systemd-rc-local-generator[37528]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:07:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:07:12 localhost systemd[1]: Starting Certificate monitoring and PKI enrollment...
Dec 6 03:07:12 localhost certmonger[37540]: 2025-12-06 08:07:12 [37540] Changing to root directory.
Dec 6 03:07:12 localhost certmonger[37540]: 2025-12-06 08:07:12 [37540] Obtaining system lock.
Dec 6 03:07:12 localhost systemd[1]: Started Certificate monitoring and PKI enrollment.
Dec 6 03:07:12 localhost certmonger[37541]: 2025-12-06 08:07:12 [37541] Running enrollment/cadata helper "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:07:12 localhost certmonger[37542]: 2025-12-06 08:07:12 [37542] Running enrollment/cadata helper "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:07:12 localhost certmonger[37543]: 2025-12-06 08:07:12 [37543] Running enrollment/cadata helper "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:07:12 localhost certmonger[37544]: 2025-12-06 08:07:12 [37544] Running enrollment/cadata helper "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:07:12 localhost certmonger[37545]: 2025-12-06 08:07:12 [37545] Running enrollment/cadata helper "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:07:12 localhost certmonger[37546]: 2025-12-06 08:07:12 [37546] Running enrollment/cadata helper "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:07:12 localhost certmonger[37547]: 2025-12-06 08:07:12 [37547] Running enrollment/cadata helper "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:07:12 localhost certmonger[37548]: 2025-12-06 08:07:12 [37548] Running enrollment/cadata helper "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:07:12 localhost certmonger[37549]: 2025-12-06 08:07:12 [37549] Running enrollment/cadata helper "/usr/libexec/certmonger/dogtag-ipa-renew-agent-submit".
Dec 6 03:07:12 localhost certmonger[37550]: 2025-12-06 08:07:12 [37550] Running enrollment/cadata helper "/usr/libexec/certmonger/dogtag-ipa-renew-agent-submit".
Dec 6 03:07:12 localhost certmonger[37551]: 2025-12-06 08:07:12 [37551] Running enrollment/cadata helper "/usr/libexec/certmonger/dogtag-ipa-renew-agent-submit".
Dec 6 03:07:12 localhost certmonger[37552]: 2025-12-06 08:07:12 [37552] Running enrollment/cadata helper "/usr/libexec/certmonger/dogtag-ipa-renew-agent-submit".
Dec 6 03:07:12 localhost certmonger[37553]: 2025-12-06 08:07:12 [37553] Running enrollment/cadata helper "/usr/libexec/certmonger/dogtag-ipa-renew-agent-submit".
Dec 6 03:07:12 localhost certmonger[37554]: 2025-12-06 08:07:12 [37554] Running enrollment/cadata helper "/usr/libexec/certmonger/dogtag-ipa-renew-agent-submit".
Dec 6 03:07:12 localhost certmonger[37555]: 2025-12-06 08:07:12 [37555] Running enrollment/cadata helper "/usr/libexec/certmonger/dogtag-ipa-renew-agent-submit".
Dec 6 03:07:12 localhost certmonger[37556]: 2025-12-06 08:07:12 [37556] Running enrollment/cadata helper "/usr/libexec/certmonger/dogtag-ipa-renew-agent-submit".
Dec 6 03:07:12 localhost certmonger[37557]: 2025-12-06 08:07:12 [37557] Running enrollment/cadata helper "/usr/libexec/certmonger/local-submit".
Dec 6 03:07:12 localhost certmonger[37558]: 2025-12-06 08:07:12 [37558] Running enrollment/cadata helper "/usr/libexec/certmonger/local-submit".
Dec 6 03:07:12 localhost certmonger[37559]: 2025-12-06 08:07:12 [37559] Running enrollment/cadata helper "/usr/libexec/certmonger/local-submit".
Dec 6 03:07:12 localhost certmonger[37560]: 2025-12-06 08:07:12 [37560] Running enrollment/cadata helper "/usr/libexec/certmonger/local-submit".
Dec 6 03:07:12 localhost certmonger[37561]: 2025-12-06 08:07:12 [37561] Running enrollment/cadata helper "/usr/libexec/certmonger/local-submit".
Dec 6 03:07:12 localhost certmonger[37562]: 2025-12-06 08:07:12 [37562] Running enrollment/cadata helper "/usr/libexec/certmonger/local-submit".
Dec 6 03:07:12 localhost certmonger[37563]: 2025-12-06 08:07:12 [37563] Running enrollment/cadata helper "/usr/libexec/certmonger/local-submit".
Dec 6 03:07:12 localhost certmonger[37564]: 2025-12-06 08:07:12 [37564] Running enrollment/cadata helper "/usr/libexec/certmonger/local-submit".
Dec 6 03:07:12 localhost certmonger[37540]: 2025-12-06 08:07:12 [37540] No hooks set for ca-pre-save command.
Dec 6 03:07:12 localhost certmonger[37540]: 2025-12-06 08:07:12 [37540] No hooks set for ca-post-save command.
Dec 6 03:07:12 localhost certmonger[37567]: 2025-12-06 08:07:12 [37567] Certificate "OOO.TEST IPA CA" valid for 631150992s.
Dec 6 03:07:12 localhost certmonger[37540]: 2025-12-06 08:07:12 [37540] No hooks set for ca-pre-save command.
Dec 6 03:07:12 localhost certmonger[37540]: 2025-12-06 08:07:12 [37540] No hooks set for ca-post-save command.
Dec 6 03:07:12 localhost certmonger[37570]: 2025-12-06 08:07:12 [37570] Certificate "Local Signing Authority" valid for 31535999s.
Dec 6 03:07:23 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 6 03:07:24 localhost python3[37620]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:07:25 localhost python3[37663]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008444.6669812-105396-139175910727491/source _original_basename=tmpjjdc443v follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:07:25 localhost python3[37693]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 6 03:07:27 localhost python3[37816]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 6 03:07:29 localhost python3[37937]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 6 03:07:31 localhost python3[37953]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:07:33 localhost python3[37970]: ansible-setup Invoked with gather_subset=['min'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 03:07:33 localhost python3[37990]: ansible-ansible.legacy.dnf Invoked with name=['python3-pyasn1', 'python3-cryptography', 'python3-dbus'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 03:07:36 localhost python3[38007]: ansible-ansible.legacy.dnf Invoked with name=['certmonger'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 03:07:39 localhost python3[38024]: ansible-file Invoked with name=/etc/certmonger//pre-scripts owner=root group=root mode=0700 state=directory path=/etc/certmonger//pre-scripts recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:07:40 localhost python3[38040]: ansible-file Invoked with name=/etc/certmonger//post-scripts owner=root group=root mode=0700 state=directory path=/etc/certmonger//post-scripts recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:07:40 localhost python3[38056]: ansible-ansible.legacy.systemd Invoked with name=certmonger state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 03:07:40 localhost systemd[1]: Reloading.
Dec 6 03:07:40 localhost systemd-sysv-generator[38089]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:07:40 localhost systemd-rc-local-generator[38083]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:07:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:07:41 localhost python3[38110]: ansible-certificate_request Invoked with name=metrics_qdr dns=['np0005548798.internalapi.ooo.test'] principal=['metrics_qdr/np0005548798.internalapi.ooo.test@OOO.TEST'] directory=/etc/pki/tls key_size=2048 wait=True run_after=container_name=$(podman ps --format=\{\{.Names\}\} | grep metrics_qdr)#012service_crt="/etc/pki/tls/certs/metrics_qdr.crt"#012service_key="/etc/pki/tls/private/metrics_qdr.key#012# Copy the new cert from the mount-point to the real path#012podman exec "$container_name" cp "/var/lib/kolla/config_files/src-tls$service_crt" "$service_crt"#012# Copy the new key from the mount-point to the real path#012podman exec "$container_name" cp "/var/lib/kolla/config_files/src-tls$service_key" "$service_key"#012# Set appropriate permissions#012podman exec "$container_name" chown qdrouterd:qdrouterd "$service_crt"#012podman exec "$container_name" chown qdrouterd:qdrouterd "$service_key"#012# Trigger a container restart to read the new certificate#012podman restart "$container_name"#012 ca=ipa __header=##012# Ansible managed#012##012 provider_config_directory=/etc/certmonger provider=certmonger key_usage=['digitalSignature', 'keyEncipherment'] extended_key_usage=['id-kp-serverAuth', 'id-kp-clientAuth'] auto_renew=True ip=None email=None common_name=None country=None state=None locality=None organization=None organizational_unit=None contact_email=None owner=None group=None run_before=None
Dec 6 03:07:41 localhost certmonger[37540]: 2025-12-06 08:07:41 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:41 localhost certmonger[37540]: 2025-12-06 08:07:41 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:41 localhost certmonger[37540]: 2025-12-06 08:07:41 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:41 localhost certmonger[37540]: 2025-12-06 08:07:41 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:41 localhost certmonger[37540]: 2025-12-06 08:07:41 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:41 localhost certmonger[37540]: 2025-12-06 08:07:41 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:41 localhost certmonger[37540]: 2025-12-06 08:07:41 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:41 localhost certmonger[37540]: 2025-12-06 08:07:41 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:41 localhost certmonger[37540]: 2025-12-06 08:07:41 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[38120]: 2025-12-06 08:07:42 [38120] Setting "CERTMONGER_REQ_SUBJECT" to "CN=np0005548798.internalapi.ooo.test" for child.
Dec 6 03:07:42 localhost certmonger[38120]: 2025-12-06 08:07:42 [38120] Setting "CERTMONGER_REQ_HOSTNAME" to "np0005548798.internalapi.ooo.test
Dec 6 03:07:42 localhost certmonger[38120]: " for child.
Dec 6 03:07:42 localhost certmonger[38120]: 2025-12-06 08:07:42 [38120] Setting "CERTMONGER_REQ_PRINCIPAL" to "metrics_qdr/np0005548798.internalapi.ooo.test@OOO.TEST
Dec 6 03:07:42 localhost certmonger[38120]: " for child.
Dec 6 03:07:42 localhost certmonger[38120]: 2025-12-06 08:07:42 [38120] Setting "CERTMONGER_OPERATION" to "SUBMIT" for child.
Dec 6 03:07:42 localhost certmonger[38120]: 2025-12-06 08:07:42 [38120] Setting "CERTMONGER_CSR" to "-----BEGIN CERTIFICATE REQUEST-----
Dec 6 03:07:42 localhost certmonger[38120]: MIID4DCCAsgCAQAwLDEqMCgGA1UEAxMhbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBp
Dec 6 03:07:42 localhost certmonger[38120]: Lm9vby50ZXN0MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAvPQ6P0tf
Dec 6 03:07:42 localhost certmonger[38120]: BEQ8CT5gL9oapmDFh3t7p4qHXYtSNDvL1GPv/b/8X7tHKd3IyWjAmeUIbhE9UK19
Dec 6 03:07:42 localhost certmonger[38120]: L/L7tMocKh5G+ttWxSc6HUwR3Bl3AHJzNuguYm9axTTE/SotC2JfjAcqW2C8litF
Dec 6 03:07:42 localhost certmonger[38120]: TV92phXIHVGg4bxJbJqHQoOcAOwB8m+Ml+0qBBcnLTCXaI/BovTRPAmRMw7HGeE3
Dec 6 03:07:42 localhost certmonger[38120]: 7EHZdpTwHKUscIig9qmj7uTOH1y5EpJC/EUdKq03LdCF+1tBvGNPl42EIHQyLcnT
Dec 6 03:07:42 localhost certmonger[38120]: H3IC1qr/4/eiWYyXmxlUMsglfuWjZV+ukhR7Q2nimsQv1mE4nqgjE+eITKhgC/EZ
Dec 6 03:07:42 localhost certmonger[38120]: gnPBXT5NGtmq/wIDAQABoIIBbTArBgkqhkiG9w0BCRQxHh4cADIAMAAyADUAMQAy
Dec 6 03:07:42 localhost certmonger[38120]: ADAANgAwADgAMAA3ADQAMTCCATwGCSqGSIb3DQEJDjGCAS0wggEpMAsGA1UdDwQE
Dec 6 03:07:42 localhost certmonger[38120]: AwIFoDCBzQYDVR0RBIHFMIHCgiFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v
Dec 6 03:07:42 localhost certmonger[38120]: LnRlc3SgRgYKKwYBBAGCNxQCA6A4DDZtZXRyaWNzX3Fkci9ucDAwMDU1NDg3OTgu
Dec 6 03:07:42 localhost certmonger[38120]: aW50ZXJuYWxhcGkub29vLnRlc3RAT09PLlRFU1SgVQYGKwYBBQICoEswSaAKGwhP
Dec 6 03:07:42 localhost certmonger[38120]: T08uVEVTVKE7MDmgAwIBAaEyMDAbC21ldHJpY3NfcWRyGyFucDAwMDU1NDg3OTgu
Dec 6 03:07:42 localhost certmonger[38120]: aW50ZXJuYWxhcGkub29vLnRlc3QwHQYDVR0lBBYwFAYIKwYBBQUHAwEGCCsGAQUF
Dec 6 03:07:42 localhost certmonger[38120]: BwMCMAwGA1UdEwEB/wQCMAAwHQYDVR0OBBYEFD2WkB9rMUQ2RfwuH5f/16qTnsiq
Dec 6 03:07:42 localhost certmonger[38120]: MA0GCSqGSIb3DQEBCwUAA4IBAQALwEJwgUxrQVMAV3ecMkYaBbjEXhK1M6BdCUmz
Dec 6 03:07:42 localhost certmonger[38120]: CNCswdpFJ/lqx8436yK8H6uKwVlPpVz9rXwpDWBrmUlkJuCB7ikguDgUQ1ItQexi
Dec 6 03:07:42 localhost certmonger[38120]: 5TskmYOD2S8eDn6OLCXA25XNaWWCJLehpo+M+QCL75o9PGN0IIO4/f4i6YpJwScB
Dec 6 03:07:42 localhost certmonger[38120]: b2qDFFnnoCa8Zh3QIUFf2/zntJb3HQroT74KsQZm+wgp/b8veFyMd2c50xGknD6c
Dec 6 03:07:42 localhost certmonger[38120]: YMrHPRf49SqsCol9EXfW9V3zeyAPVqLJ7k6HldahPU8EsWvxvLNTimnYNav6XrDh
Dec 6 03:07:42 localhost certmonger[38120]: N2gZQxGn3kbjpURdm35d1qZ+FTWKqUR6gR7soaJl1prQ6rV8
Dec 6 03:07:42 localhost certmonger[38120]: -----END CERTIFICATE REQUEST-----
Dec 6 03:07:42 localhost certmonger[38120]: " for child.
Dec 6 03:07:42 localhost certmonger[38120]: 2025-12-06 08:07:42 [38120] Setting "CERTMONGER_SPKAC" to "MIICQDCCASgwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC89Do/S18ERDwJPmAv2hqmYMWHe3unioddi1I0O8vUY+/9v/xfu0cp3cjJaMCZ5QhuET1QrX0v8vu0yhwqHkb621bFJzodTBHcGXcAcnM26C5ib1rFNMT9Ki0LYl+MBypbYLyWK0VNX3amFcgdUaDhvElsmodCg5wA7AHyb4yX7SoEFyctMJdoj8Gi9NE8CZEzDscZ4TfsQdl2lPAcpSxwiKD2qaPu5M4fXLkSkkL8RR0qrTct0IX7W0G8Y0+XjYQgdDItydMfcgLWqv/j96JZjJebGVQyyCV+5aNlX66SFHtDaeKaxC/WYTieqCMT54hMqGAL8RmCc8FdPk0a2ar/AgMBAAEWADANBgkqhkiG9w0BAQsFAAOCAQEAFOYzHfcfvlzwfKyBFMC5OaibQyCW0woJpN9GPLJBhBeoJN7p49kLMKwImCXmxPfEA+aqrA+mJXgZrR/eK4b63c1Tzrsd+cQUlMMNzGwmgWyYjM0ctaOc13W3kRIi9YYo2sq5ESgl2AADLwrzKl7eoSMgcH8rvpxI7Q2OtlfjAtaIuFK6ODxGWUgf9FORPyzO0yYvNbhRw+LqYsBI4c8xUXrFprczOI4Fqjnh6kx0ZzMekVLjn3UTmwTeAG0IvC/z0pCZY+QL1Czrz25oLOHKyJF36nIg4hGNHNjPd6t3wFLzExClRhmqfnqX7FbqoiUQ7aI0Otpyif2f6eX5SgXk1A==" for child.
Dec 6 03:07:42 localhost certmonger[38120]: 2025-12-06 08:07:42 [38120] Setting "CERTMONGER_SPKI" to "MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAvPQ6P0tfBEQ8CT5gL9oapmDFh3t7p4qHXYtSNDvL1GPv/b/8X7tHKd3IyWjAmeUIbhE9UK19L/L7tMocKh5G+ttWxSc6HUwR3Bl3AHJzNuguYm9axTTE/SotC2JfjAcqW2C8litFTV92phXIHVGg4bxJbJqHQoOcAOwB8m+Ml+0qBBcnLTCXaI/BovTRPAmRMw7HGeE37EHZdpTwHKUscIig9qmj7uTOH1y5EpJC/EUdKq03LdCF+1tBvGNPl42EIHQyLcnTH3IC1qr/4/eiWYyXmxlUMsglfuWjZV+ukhR7Q2nimsQv1mE4nqgjE+eITKhgC/EZgnPBXT5NGtmq/wIDAQAB" for child.
Dec 6 03:07:42 localhost certmonger[38120]: 2025-12-06 08:07:42 [38120] Setting "CERTMONGER_LOCAL_CA_DIR" to "/var/lib/certmonger/local" for child.
Dec 6 03:07:42 localhost certmonger[38120]: 2025-12-06 08:07:42 [38120] Setting "CERTMONGER_KEY_TYPE" to "RSA" for child.
Dec 6 03:07:42 localhost certmonger[38120]: 2025-12-06 08:07:42 [38120] Setting "CERTMONGER_CA_NICKNAME" to "IPA" for child.
Dec 6 03:07:42 localhost certmonger[38120]: 2025-12-06 08:07:42 [38120] Redirecting stdin to /dev/null, leaving stdout and stderr open for child "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:07:42 localhost certmonger[38120]: 2025-12-06 08:07:42 [38120] Running enrollment helper "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[38120]: Submitting request to "https://ipa.ooo.test/ipa/json".
Dec 6 03:07:42 localhost certmonger[38120]: Certificate: "MIIFYzCCA8ugAwIBAgIBDTANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08uVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4MDc0MloXDTI3MTIwNzA4MDc0MlowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNVBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBALz0Oj9LXwREPAk+YC/aGqZgxYd7e6eKh12LUjQ7y9Rj7/2//F+7RyndyMlowJnlCG4RPVCtfS/y+7TKHCoeRvrbVsUnOh1MEdwZdwByczboLmJvWsU0xP0qLQtiX4wHKltgvJYrRU1fdqYVyB1RoOG8SWyah0KDnADsAfJvjJftKgQXJy0wl2iPwaL00TwJkTMOxxnhN+xB2XaU8BylLHCIoPapo+7kzh9cuRKSQvxFHSqtNy3QhftbQbxjT5eNhCB0Mi3J0x9yAtaq/+P3olmMl5sZVDLIJX7lo2VfrpIUe0Np4prEL9ZhOJ6oIxPniEyoYAvxGYJzwV0+TRrZqv8CAwEAAaOCAfQwggHwMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEBBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3NwMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwcwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3JsL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVDZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFM/w80ZeJGTL5FdAvxoCWzuwzfHRMIHNBgNVHREEgcUwgcKCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdKBGBgorBgEEAYI3FAIDoDgMNm1ldHJpY3NfcWRyL25wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdEBPT08uVEVTVKBVBgYrBgEFAgKgSzBJoAobCE9PTy5URVNUoTswOaADAgEBoTIwMBsLbWV0cmljc19xZHIbIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDANBgkqhkiG9w0BAQsFAAOCAYEAWnr2zuNUOrmM8+rksYXJtL9a4LhYI9ZfsWPGBeQ5hLo5lWpJLpXXn40MZrWTMUCYGodcoZT7QK/IX2EI3xVDhdvOL0x2iEkRQZCIjDxqWrmz/UNpd6yPMIIGE90Cv4QYkcqG43Fe2giov8o/VTOpFbE6FGi1ds9XXoUX0gBZSqaUnLYrDcstAm5dMkiE450wAcVUPlq2q+gnj9v8k182CFGlYX4+GFbEJduDTqujN0wyysqQjCsgdLUWEqOjrGwE19rUbGUbkg3V77vzoe+pIukHbpDcC9nbp3HYnj8YxXmTGQ0pgJjs7OaIZftdv7U9Zg9g2zf4NAvIHuUZUs2+3luvGztUocUSCgtY7QrxyyWcW2FSJ7N2o4HiYiq+1imdp/+YRFQIJ9sJdXcNxBrdJ4Y11jpDXThHFYxJx5/wtFzsNMF2GxD9w0RXap/iGE1plqeuOBVexiI/bimsCwk32TkDLHsf2AC/hHBAHeL252IKcTA4Sf8nzwjEWhwHa6DX"
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Certificate submission still ongoing.
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Certificate submission attempt complete.
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Child status = 0.
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Child output:
Dec 6 03:07:42 localhost certmonger[37540]: "-----BEGIN CERTIFICATE-----
Dec 6 03:07:42 localhost certmonger[37540]: MIIFYzCCA8ugAwIBAgIBDTANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u
Dec 6 03:07:42 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4
Dec 6 03:07:42 localhost certmonger[37540]: MDc0MloXDTI3MTIwNzA4MDc0MlowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV
Dec 6 03:07:42 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI
Dec 6 03:07:42 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBALz0Oj9LXwREPAk+YC/aGqZgxYd7e6eKh12L
Dec 6 03:07:42 localhost certmonger[37540]: UjQ7y9Rj7/2//F+7RyndyMlowJnlCG4RPVCtfS/y+7TKHCoeRvrbVsUnOh1MEdwZ
Dec 6 03:07:42 localhost certmonger[37540]: dwByczboLmJvWsU0xP0qLQtiX4wHKltgvJYrRU1fdqYVyB1RoOG8SWyah0KDnADs
Dec 6 03:07:42 localhost certmonger[37540]: AfJvjJftKgQXJy0wl2iPwaL00TwJkTMOxxnhN+xB2XaU8BylLHCIoPapo+7kzh9c
Dec 6 03:07:42 localhost certmonger[37540]: uRKSQvxFHSqtNy3QhftbQbxjT5eNhCB0Mi3J0x9yAtaq/+P3olmMl5sZVDLIJX7l
Dec 6 03:07:42 localhost certmonger[37540]: o2VfrpIUe0Np4prEL9ZhOJ6oIxPniEyoYAvxGYJzwV0+TRrZqv8CAwEAAaOCAfQw
Dec 6 03:07:42 localhost certmonger[37540]: ggHwMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB
Dec 6 03:07:42 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw
Dec 6 03:07:42 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw
Dec 6 03:07:42 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js
Dec 6 03:07:42 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD
Dec 6 03:07:42 localhost certmonger[37540]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFM/w80ZeJGTL5FdAvxoCWzuw
Dec 6 03:07:42 localhost certmonger[37540]: zfHRMIHNBgNVHREEgcUwgcKCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u
Dec 6 03:07:42 localhost certmonger[37540]: dGVzdKBGBgorBgEEAYI3FAIDoDgMNm1ldHJpY3NfcWRyL25wMDAwNTU0ODc5OC5p
Dec 6 03:07:42 localhost certmonger[37540]: bnRlcm5hbGFwaS5vb28udGVzdEBPT08uVEVTVKBVBgYrBgEFAgKgSzBJoAobCE9P
Dec 6 03:07:42 localhost certmonger[37540]: Ty5URVNUoTswOaADAgEBoTIwMBsLbWV0cmljc19xZHIbIW5wMDAwNTU0ODc5OC5p
Dec 6 03:07:42 localhost certmonger[37540]: bnRlcm5hbGFwaS5vb28udGVzdDANBgkqhkiG9w0BAQsFAAOCAYEAWnr2zuNUOrmM
Dec 6 03:07:42 localhost certmonger[37540]: 8+rksYXJtL9a4LhYI9ZfsWPGBeQ5hLo5lWpJLpXXn40MZrWTMUCYGodcoZT7QK/I
Dec 6 03:07:42 localhost certmonger[37540]: X2EI3xVDhdvOL0x2iEkRQZCIjDxqWrmz/UNpd6yPMIIGE90Cv4QYkcqG43Fe2gio
Dec 6 03:07:42 localhost certmonger[37540]: v8o/VTOpFbE6FGi1ds9XXoUX0gBZSqaUnLYrDcstAm5dMkiE450wAcVUPlq2q+gn
Dec 6 03:07:42 localhost certmonger[37540]: j9v8k182CFGlYX4+GFbEJduDTqujN0wyysqQjCsgdLUWEqOjrGwE19rUbGUbkg3V
Dec 6 03:07:42 localhost certmonger[37540]: 77vzoe+pIukHbpDcC9nbp3HYnj8YxXmTGQ0pgJjs7OaIZftdv7U9Zg9g2zf4NAvI
Dec 6 03:07:42 localhost certmonger[37540]: HuUZUs2+3luvGztUocUSCgtY7QrxyyWcW2FSJ7N2o4HiYiq+1imdp/+YRFQIJ9sJ
Dec 6 03:07:42 localhost certmonger[37540]: dXcNxBrdJ4Y11jpDXThHFYxJx5/wtFzsNMF2GxD9w0RXap/iGE1plqeuOBVexiI/
Dec 6 03:07:42 localhost certmonger[37540]: bimsCwk32TkDLHsf2AC/hHBAHeL252IKcTA4Sf8nzwjEWhwHa6DX
Dec 6 03:07:42 localhost certmonger[37540]: -----END CERTIFICATE-----
Dec 6 03:07:42 localhost certmonger[37540]: "
Dec 6 03:07:42 localhost certmonger[38122]: 2025-12-06 08:07:42 [38122] Postprocessing output "-----BEGIN CERTIFICATE-----
Dec 6 03:07:42 localhost certmonger[38122]: MIIFYzCCA8ugAwIBAgIBDTANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u
Dec 6 03:07:42 localhost certmonger[38122]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4
Dec 6 03:07:42 localhost certmonger[38122]: MDc0MloXDTI3MTIwNzA4MDc0MlowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV
Dec 6 03:07:42 localhost certmonger[38122]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI
Dec 6 03:07:42 localhost certmonger[38122]: hvcNAQEBBQADggEPADCCAQoCggEBALz0Oj9LXwREPAk+YC/aGqZgxYd7e6eKh12L
Dec 6 03:07:42 localhost certmonger[38122]: UjQ7y9Rj7/2//F+7RyndyMlowJnlCG4RPVCtfS/y+7TKHCoeRvrbVsUnOh1MEdwZ
Dec 6 03:07:42 localhost certmonger[38122]: dwByczboLmJvWsU0xP0qLQtiX4wHKltgvJYrRU1fdqYVyB1RoOG8SWyah0KDnADs
Dec 6 03:07:42 localhost certmonger[38122]: AfJvjJftKgQXJy0wl2iPwaL00TwJkTMOxxnhN+xB2XaU8BylLHCIoPapo+7kzh9c
Dec 6 03:07:42 localhost certmonger[38122]: uRKSQvxFHSqtNy3QhftbQbxjT5eNhCB0Mi3J0x9yAtaq/+P3olmMl5sZVDLIJX7l
Dec 6 03:07:42 localhost certmonger[38122]: o2VfrpIUe0Np4prEL9ZhOJ6oIxPniEyoYAvxGYJzwV0+TRrZqv8CAwEAAaOCAfQw
Dec 6 03:07:42 localhost certmonger[38122]: ggHwMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB
Dec 6 03:07:42 localhost certmonger[38122]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw
Dec 6 03:07:42 localhost certmonger[38122]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw
Dec 6 03:07:42 localhost certmonger[38122]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js
Dec 6 03:07:42 localhost certmonger[38122]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD
Dec 6 03:07:42 localhost certmonger[38122]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFM/w80ZeJGTL5FdAvxoCWzuw
Dec 6 03:07:42 localhost certmonger[38122]: zfHRMIHNBgNVHREEgcUwgcKCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u
Dec 6 03:07:42 localhost certmonger[38122]: dGVzdKBGBgorBgEEAYI3FAIDoDgMNm1ldHJpY3NfcWRyL25wMDAwNTU0ODc5OC5p
Dec 6 03:07:42 localhost certmonger[38122]: bnRlcm5hbGFwaS5vb28udGVzdEBPT08uVEVTVKBVBgYrBgEFAgKgSzBJoAobCE9P
Dec 6 03:07:42 localhost certmonger[38122]: Ty5URVNUoTswOaADAgEBoTIwMBsLbWV0cmljc19xZHIbIW5wMDAwNTU0ODc5OC5p
Dec 6 03:07:42 localhost certmonger[38122]: bnRlcm5hbGFwaS5vb28udGVzdDANBgkqhkiG9w0BAQsFAAOCAYEAWnr2zuNUOrmM
Dec 6 03:07:42 localhost certmonger[38122]: 8+rksYXJtL9a4LhYI9ZfsWPGBeQ5hLo5lWpJLpXXn40MZrWTMUCYGodcoZT7QK/I
Dec 6 03:07:42 localhost certmonger[38122]: X2EI3xVDhdvOL0x2iEkRQZCIjDxqWrmz/UNpd6yPMIIGE90Cv4QYkcqG43Fe2gio
Dec 6 03:07:42 localhost certmonger[38122]: v8o/VTOpFbE6FGi1ds9XXoUX0gBZSqaUnLYrDcstAm5dMkiE450wAcVUPlq2q+gn
Dec 6 03:07:42 localhost certmonger[38122]: j9v8k182CFGlYX4+GFbEJduDTqujN0wyysqQjCsgdLUWEqOjrGwE19rUbGUbkg3V
Dec 6 03:07:42 localhost certmonger[38122]: 77vzoe+pIukHbpDcC9nbp3HYnj8YxXmTGQ0pgJjs7OaIZftdv7U9Zg9g2zf4NAvI
Dec 6 03:07:42 localhost certmonger[38122]: HuUZUs2+3luvGztUocUSCgtY7QrxyyWcW2FSJ7N2o4HiYiq+1imdp/+YRFQIJ9sJ
Dec 6 03:07:42 localhost certmonger[38122]: dXcNxBrdJ4Y11jpDXThHFYxJx5/wtFzsNMF2GxD9w0RXap/iGE1plqeuOBVexiI/
Dec 6 03:07:42 localhost certmonger[38122]: bimsCwk32TkDLHsf2AC/hHBAHeL252IKcTA4Sf8nzwjEWhwHa6DX
Dec 6 03:07:42 localhost certmonger[38122]: -----END CERTIFICATE-----
Dec 6 03:07:42 localhost certmonger[38122]: ".
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Certificate submission still ongoing.
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Certificate submission postprocessing complete.
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Child status = 0.
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Child output:
Dec 6 03:07:42 localhost certmonger[37540]: "{"certificate":"-----BEGIN CERTIFICATE-----\nMIIFYzCCA8ugAwIBAgIBDTANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u\nVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4\nMDc0MloXDTI3MTIwNzA4MDc0MlowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV\nBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI\nhvcNAQEBBQADggEPADCCAQoCggEBALz0Oj9LXwREPAk+YC/aGqZgxYd7e6eKh12L\nUjQ7y9Rj7/2//F+7RyndyMlowJnlCG4RPVCtfS/y+7TKHCoeRvrbVsUnOh1MEdwZ\ndwByczboLmJvWsU0xP0qLQtiX4wHKltgvJYrRU1fdqYVyB1RoOG8SWyah0KDnADs\nAfJvjJftKgQXJy0wl2iPwaL00TwJkTMOxxnhN+xB2XaU8BylLHCIoPapo+7kzh9c\nuRKSQvxFHSqtNy3QhftbQbxjT5eNhCB0Mi3J0x9yAtaq/+P3olmMl5sZVDLIJX7l\no2VfrpIUe0Np4prEL9ZhOJ6oIxPniEyoYAvxGYJzwV0+TRrZqv8CAwEAAaOCAfQw\nggHwMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB\nBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw\nMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw\ncwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js\nL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD\nZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFM/w80ZeJGTL5FdAvxoCWzuw\nzfHRMIHNBgNVHREEgcUwgcKCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u\ndGVzdKBGBgorBgEEAYI3FAIDoDgMNm1ldHJpY3NfcWRyL25wMDAwNTU0ODc5OC5p\nbnRlcm5hbGFwaS5vb28udGVzdEBPT08uVEVTVKBVBgYrBgEFAgKgSzBJoAobCE9P\nTy5URVNUoTswOaADAgEBoTIwMBsLbWV0cmljc19xZHIbIW5wMDAwNTU0ODc5OC5p\nbnRlcm5hbGFwaS5vb28udGVzdDANBgkqhkiG9w0BAQsFAAOCAYEAWnr2zuNUOrmM\n8+rksYXJtL9a4LhYI9ZfsWPGBeQ5hLo5lWpJLpXXn40MZrWTMUCYGodcoZT7QK/I\nX2EI3xVDhdvOL0x2iEkRQZCIjDxqWrmz/UNpd6yPMIIGE90Cv4QYkcqG43Fe2gio\nv8o/VTOpFbE6FGi1ds9XXoUX0gBZSqaUnLYrDcstAm5dMkiE450wAcVUPlq2q+gn\nj9v8k182CFGlYX4+GFbEJduDTqujN0wyysqQjCsgdLUWEqOjrGwE19rUbGUbkg3V\n77vzoe+pIukHbpDcC9nbp3HYnj8YxXmTGQ0pgJjs7OaIZftdv7U9Zg9g2zf4NAvI\nHuUZUs2+3luvGztUocUSCgtY7QrxyyWcW2FSJ7N2o4HiYiq+1imdp/+YRFQIJ9sJ\ndXcNxBrdJ4Y11jpDXThHFYxJx5/wtFzsNMF2GxD9w0RXap/iGE1plqeuOBVexiI/\nbimsCwk32TkDLHsf2AC/hHBAHeL252IKcTA4Sf8nzwjEWhwHa6DX\n-----END CERTIFICATE-----\n","key_checked":true}
Dec 6 03:07:42 localhost certmonger[37540]: "
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Issued certificate is "-----BEGIN CERTIFICATE-----
Dec 6 03:07:42 localhost certmonger[37540]: MIIFYzCCA8ugAwIBAgIBDTANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u
Dec 6 03:07:42 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4
Dec 6 03:07:42 localhost certmonger[37540]: MDc0MloXDTI3MTIwNzA4MDc0MlowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV
Dec 6 03:07:42 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI
Dec 6 03:07:42 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBALz0Oj9LXwREPAk+YC/aGqZgxYd7e6eKh12L
Dec 6 03:07:42 localhost certmonger[37540]: UjQ7y9Rj7/2//F+7RyndyMlowJnlCG4RPVCtfS/y+7TKHCoeRvrbVsUnOh1MEdwZ
Dec 6 03:07:42 localhost certmonger[37540]: dwByczboLmJvWsU0xP0qLQtiX4wHKltgvJYrRU1fdqYVyB1RoOG8SWyah0KDnADs
Dec 6 03:07:42 localhost certmonger[37540]: AfJvjJftKgQXJy0wl2iPwaL00TwJkTMOxxnhN+xB2XaU8BylLHCIoPapo+7kzh9c
Dec 6 03:07:42 localhost certmonger[37540]: uRKSQvxFHSqtNy3QhftbQbxjT5eNhCB0Mi3J0x9yAtaq/+P3olmMl5sZVDLIJX7l
Dec 6 03:07:42 localhost certmonger[37540]: o2VfrpIUe0Np4prEL9ZhOJ6oIxPniEyoYAvxGYJzwV0+TRrZqv8CAwEAAaOCAfQw
Dec 6 03:07:42 localhost certmonger[37540]: ggHwMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB
Dec 6 03:07:42 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw
Dec 6 03:07:42 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw
Dec 6 03:07:42 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js
Dec 6 03:07:42 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD
Dec 6 03:07:42 localhost certmonger[37540]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFM/w80ZeJGTL5FdAvxoCWzuw
Dec 6 03:07:42 localhost certmonger[37540]: zfHRMIHNBgNVHREEgcUwgcKCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u
Dec 6 03:07:42 localhost certmonger[37540]: dGVzdKBGBgorBgEEAYI3FAIDoDgMNm1ldHJpY3NfcWRyL25wMDAwNTU0ODc5OC5p
Dec 6 03:07:42 localhost certmonger[37540]: bnRlcm5hbGFwaS5vb28udGVzdEBPT08uVEVTVKBVBgYrBgEFAgKgSzBJoAobCE9P
Dec 6 03:07:42 localhost certmonger[37540]: Ty5URVNUoTswOaADAgEBoTIwMBsLbWV0cmljc19xZHIbIW5wMDAwNTU0ODc5OC5p
Dec 6 03:07:42 localhost certmonger[37540]: bnRlcm5hbGFwaS5vb28udGVzdDANBgkqhkiG9w0BAQsFAAOCAYEAWnr2zuNUOrmM
Dec 6 03:07:42 localhost certmonger[37540]: 8+rksYXJtL9a4LhYI9ZfsWPGBeQ5hLo5lWpJLpXXn40MZrWTMUCYGodcoZT7QK/I
Dec 6 03:07:42 localhost certmonger[37540]: X2EI3xVDhdvOL0x2iEkRQZCIjDxqWrmz/UNpd6yPMIIGE90Cv4QYkcqG43Fe2gio
Dec 6 03:07:42 localhost certmonger[37540]: v8o/VTOpFbE6FGi1ds9XXoUX0gBZSqaUnLYrDcstAm5dMkiE450wAcVUPlq2q+gn
Dec 6 03:07:42 localhost certmonger[37540]: j9v8k182CFGlYX4+GFbEJduDTqujN0wyysqQjCsgdLUWEqOjrGwE19rUbGUbkg3V
Dec 6 03:07:42 localhost certmonger[37540]: 77vzoe+pIukHbpDcC9nbp3HYnj8YxXmTGQ0pgJjs7OaIZftdv7U9Zg9g2zf4NAvI
Dec 6 03:07:42 localhost certmonger[37540]: HuUZUs2+3luvGztUocUSCgtY7QrxyyWcW2FSJ7N2o4HiYiq+1imdp/+YRFQIJ9sJ
Dec 6 03:07:42 localhost certmonger[37540]: dXcNxBrdJ4Y11jpDXThHFYxJx5/wtFzsNMF2GxD9w0RXap/iGE1plqeuOBVexiI/
Dec 6 03:07:42 localhost certmonger[37540]: bimsCwk32TkDLHsf2AC/hHBAHeL252IKcTA4Sf8nzwjEWhwHa6DX
Dec 6 03:07:42 localhost certmonger[37540]: -----END CERTIFICATE-----
Dec 6 03:07:42 localhost certmonger[37540]: ".
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Certificate issued (0 chain certificates, 0 roots).
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] No hooks set for pre-save command.
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:42 localhost certmonger[38140]: Certificate in file "/etc/pki/tls/certs/metrics_qdr.crt" issued by CA and saved.
Dec 6 03:07:42 localhost certmonger[37540]: 2025-12-06 08:07:42 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 03:07:43 localhost python3[38156]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 03:07:46 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Dec 6 03:07:46 localhost dbus-broker-launch[18427]: Noticed file-system modification, trigger reload.
Dec 6 03:07:46 localhost dbus-broker-launch[18427]: Service file '/usr/share/dbus-1/services/certmonger.service' is not named after the D-Bus name 'org.fedorahosted.certmonger'.
Dec 6 03:07:46 localhost dbus-broker-launch[18427]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 6 03:07:46 localhost dbus-broker-launch[18427]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 6 03:07:46 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Dec 6 03:07:47 localhost systemd[1]: Reexecuting.
Dec 6 03:07:47 localhost systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 6 03:07:47 localhost systemd[1]: Detected virtualization kvm.
Dec 6 03:07:47 localhost systemd[1]: Detected architecture x86-64.
Dec 6 03:07:47 localhost systemd-sysv-generator[38214]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:07:47 localhost systemd-rc-local-generator[38211]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:07:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:07:55 localhost kernel: SELinux: Converting 2730 SID table entries...
Dec 6 03:07:55 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 6 03:07:55 localhost kernel: SELinux: policy capability open_perms=1
Dec 6 03:07:55 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 6 03:07:55 localhost kernel: SELinux: policy capability always_check_network=0
Dec 6 03:07:55 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 6 03:07:55 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 6 03:07:55 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 6 03:07:55 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Dec 6 03:07:55 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=15 res=1
Dec 6 03:07:55 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Dec 6 03:07:56 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 6 03:07:56 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 6 03:07:56 localhost systemd[1]: Reloading.
Dec 6 03:07:56 localhost systemd-rc-local-generator[38307]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:07:56 localhost systemd-sysv-generator[38310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:07:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:07:56 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 6 03:07:56 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 6 03:07:56 localhost systemd-journald[619]: Journal stopped
Dec 6 03:07:56 localhost systemd-journald[619]: Received SIGTERM from PID 1 (systemd).
Dec 6 03:07:56 localhost systemd[1]: Stopping Journal Service...
Dec 6 03:07:56 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 6 03:07:56 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 6 03:07:56 localhost systemd[1]: Stopped Journal Service.
Dec 6 03:07:56 localhost systemd[1]: systemd-journald.service: Consumed 1.325s CPU time.
Dec 6 03:07:56 localhost systemd[1]: Starting Journal Service...
Dec 6 03:07:56 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 6 03:07:56 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 6 03:07:56 localhost systemd[1]: systemd-udevd.service: Consumed 2.155s CPU time.
Dec 6 03:07:56 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 6 03:07:56 localhost systemd-journald[38691]: Journal started
Dec 6 03:07:56 localhost systemd-journald[38691]: Runtime Journal (/run/log/journal/4b30904fc4748c16d0c72dbebcabab49) is 13.8M, max 314.7M, 300.9M free.
Dec 6 03:07:56 localhost systemd[1]: Started Journal Service.
Dec 6 03:07:56 localhost systemd-udevd[38701]: Using default interface naming scheme 'rhel-9.0'.
Dec 6 03:07:56 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 6 03:07:57 localhost systemd[1]: Reloading.
Dec 6 03:07:57 localhost systemd-rc-local-generator[39297]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:07:57 localhost systemd-sysv-generator[39300]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:07:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:07:57 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 6 03:07:57 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 6 03:07:57 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 6 03:07:57 localhost systemd[1]: run-r85ae8213cc2f4bf283b021e85be6f58a.service: Deactivated successfully.
Dec 6 03:07:57 localhost systemd[1]: run-ra46248f206c747daaa82c2e348f245d3.service: Deactivated successfully.
Dec 6 03:07:59 localhost python3[39656]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Dec 6 03:07:59 localhost python3[39675]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 03:08:00 localhost python3[39693]: ansible-file Invoked with path=/etc/pki/libvirt serole=object_r setype=cert_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None selevel=None attributes=None
Dec 6 03:08:00 localhost python3[39709]: ansible-file Invoked with path=/etc/pki/libvirt/private serole=object_r setype=cert_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None selevel=None attributes=None
Dec 6 03:08:00 localhost python3[39725]: ansible-file Invoked with path=/etc/pki/qemu serole=object_r setype=cert_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None selevel=None attributes=None
Dec 6 03:08:01 localhost python3[39741]: ansible-setup Invoked with gather_subset=['min'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 03:08:02 localhost python3[39761]: ansible-ansible.legacy.dnf Invoked with name=['python3-pyasn1', 'python3-cryptography', 'python3-dbus'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 03:08:05 localhost python3[39778]: ansible-ansible.legacy.dnf Invoked with name=['certmonger'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 03:08:08 localhost python3[39795]: ansible-file Invoked with name=/etc/certmonger//pre-scripts owner=root group=root mode=0700 state=directory path=/etc/certmonger//pre-scripts recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:08:08 localhost python3[39811]: ansible-file Invoked with name=/etc/certmonger//post-scripts owner=root group=root mode=0700 state=directory path=/etc/certmonger//post-scripts recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:08:08 localhost python3[39827]: ansible-ansible.legacy.systemd Invoked with name=certmonger state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 03:08:09 localhost python3[39845]: ansible-certificate_request Invoked with name=libvirt-server-cert dns=['np0005548798.internalapi.ooo.test'] principal=['libvirt/np0005548798.internalapi.ooo.test@OOO.TEST'] directory=/etc/pki/tls key_size=2048 wait=True run_after=# Copy cert and key to libvirt dirs#012cp /etc/ipa/ca.crt /etc/pki/CA/cacert.pem#012chown root:root /etc/pki/CA/cacert.pem#012chmod 644 /etc/pki/CA/cacert.pem#012cp /etc/pki/tls/certs/libvirt-server-cert.crt /etc/pki/libvirt/servercert.pem#012cp /etc/pki/tls/private/libvirt-server-cert.key /etc/pki/libvirt/private/serverkey.pem#012podman exec nova_virtproxyd virt-admin server-update-tls virtproxyd || systemctl reload tripleo_nova_virtproxyd#012 ca=ipa __header=##012# Ansible managed#012##012 provider_config_directory=/etc/certmonger provider=certmonger key_usage=['digitalSignature', 'keyEncipherment'] extended_key_usage=['id-kp-serverAuth', 'id-kp-clientAuth'] auto_renew=True ip=None email=None common_name=None country=None state=None locality=None organization=None organizational_unit=None contact_email=None owner=None group=None run_before=None
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:09 localhost certmonger[37540]: 2025-12-06 08:08:09 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:10 localhost certmonger[39855]: 2025-12-06 08:08:10 [39855] Setting "CERTMONGER_REQ_SUBJECT" to "CN=np0005548798.internalapi.ooo.test" for child.
Dec 6 03:08:10 localhost certmonger[39855]: 2025-12-06 08:08:10 [39855] Setting "CERTMONGER_REQ_HOSTNAME" to "np0005548798.internalapi.ooo.test
Dec 6 03:08:10 localhost certmonger[39855]: " for child.
Dec 6 03:08:10 localhost certmonger[39855]: 2025-12-06 08:08:10 [39855] Setting "CERTMONGER_REQ_PRINCIPAL" to "libvirt/np0005548798.internalapi.ooo.test@OOO.TEST
Dec 6 03:08:10 localhost certmonger[39855]: " for child.
Dec 6 03:08:10 localhost certmonger[39855]: 2025-12-06 08:08:10 [39855] Setting "CERTMONGER_OPERATION" to "SUBMIT" for child.
Dec 6 03:08:10 localhost certmonger[39855]: 2025-12-06 08:08:10 [39855] Setting "CERTMONGER_CSR" to "-----BEGIN CERTIFICATE REQUEST-----
Dec 6 03:08:10 localhost certmonger[39855]: MIID2DCCAsACAQAwLDEqMCgGA1UEAxMhbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBp
Dec 6 03:08:10 localhost certmonger[39855]: Lm9vby50ZXN0MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAoD0bafxg
Dec 6 03:08:10 localhost certmonger[39855]: Zi+i18Wn6Qi1BCY5rsbIXPyVCkzv11CHrxm4Gtn16BINvlGRRk1ouTQZLZXk+97B
Dec 6 03:08:10 localhost certmonger[39855]: vGAiCnrEkyf5iyVP3wh1VYSAN3ZQMd9Ui4Ejh2xAM3CIXJgu606f3g+ESjVQEhb9
Dec 6 03:08:10 localhost certmonger[39855]: gnhVjGOWpY4IG3wffcQ58FfjyC0NJa8HfzTn4UEJq+K/hVYYQT7kM7cCoEiVS88B
Dec 6 03:08:10 localhost certmonger[39855]: rTGyl+G2tVJ70gKziwt+uXtBmB9F3F1rcsL+MLreGJ93QX6jZVG78B8+iQ7+kIrj
Dec 6 03:08:10 localhost certmonger[39855]: ZB7jJLCwHQnjVcojgpRTQlOyXKn9vvHvROvXlxJiUhqsId3g7AfKOAP0L69lOzMj
Dec 6 03:08:10 localhost certmonger[39855]: EOWrr2DaCVXpAwIDAQABoIIBZTArBgkqhkiG9w0BCRQxHh4cADIAMAAyADUAMQAy
Dec 6 03:08:10 localhost certmonger[39855]: ADAANgAwADgAMAA4ADAAOTCCATQGCSqGSIb3DQEJDjGCASUwggEhMAsGA1UdDwQE
Dec 6 03:08:10 localhost certmonger[39855]: AwIFoDCBxQYDVR0RBIG9MIG6giFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v
Dec 6 03:08:10 localhost certmonger[39855]: LnRlc3SgQgYKKwYBBAGCNxQCA6A0DDJsaWJ2aXJ0L25wMDAwNTU0ODc5OC5pbnRl
Dec 6 03:08:10 localhost certmonger[39855]: cm5hbGFwaS5vb28udGVzdEBPT08uVEVTVKBRBgYrBgEFAgKgRzBFoAobCE9PTy5U
Dec 6 03:08:10 localhost certmonger[39855]: RVNUoTcwNaADAgEBoS4wLBsHbGlidmlydBshbnAwMDA1NTQ4Nzk4LmludGVybmFs
Dec 6 03:08:10 localhost certmonger[39855]: YXBpLm9vby50ZXN0MB0GA1UdJQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjAMBgNV
Dec 6 03:08:10 localhost certmonger[39855]: HRMBAf8EAjAAMB0GA1UdDgQWBBTplpL2Sn5Z+SPc6mnRckG1QMr5RDANBgkqhkiG
Dec 6 03:08:10 localhost certmonger[39855]: 9w0BAQsFAAOCAQEAfNpZotcV3F2s13/BHNfExwRMc+n5WOxmDGAwklkUuwzb+kQs
Dec 6 03:08:10 localhost certmonger[39855]: LvVuAdvERodd5R0N6JX7jsMCbPX5QA4sYKGzZvUzM7BM3BySHvCyR7eWykjyKEOn
Dec 6 03:08:10 localhost certmonger[39855]: i4kTQAUpfeQE33YwH4G1L+0C8MthMWH7m2geBonelEcqz/Vl1C8EM1GWEhdPeCRj
Dec 6 03:08:10 localhost certmonger[39855]: QySTgLkbOrVa9VQZqmCoSMxiqM83tvzWBU60gDjRtRKPk5YuPak3sQR/u/YcoZeN
Dec 6 03:08:10 localhost certmonger[39855]: aVaS+gcDCoNGV/kyRoQh7VOR9EnGX5/sx5R9Izdcfgm1C3YcZMrNqv8l5tiWTPdC
Dec 6 03:08:10 localhost certmonger[39855]: T7b3RFYPnywRNu5nYCSbZ4NrUaFGjqCxT0BPeA==
Dec 6 03:08:10 localhost certmonger[39855]: -----END CERTIFICATE REQUEST-----
Dec 6 03:08:10 localhost certmonger[39855]: " for child.
Dec 6 03:08:10 localhost certmonger[39855]: 2025-12-06 08:08:10 [39855] Setting "CERTMONGER_SPKAC" to "MIICQDCCASgwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCgPRtp/GBmL6LXxafpCLUEJjmuxshc/JUKTO/XUIevGbga2fXoEg2+UZFGTWi5NBktleT73sG8YCIKesSTJ/mLJU/fCHVVhIA3dlAx31SLgSOHbEAzcIhcmC7rTp/eD4RKNVASFv2CeFWMY5aljggbfB99xDnwV+PILQ0lrwd/NOfhQQmr4r+FVhhBPuQztwKgSJVLzwGtMbKX4ba1UnvSArOLC365e0GYH0XcXWtywv4wut4Yn3dBfqNlUbvwHz6JDv6QiuNkHuMksLAdCeNVyiOClFNCU7Jcqf2+8e9E69eXEmJSGqwh3eDsB8o4A/Qvr2U7MyMQ5auvYNoJVekDAgMBAAEWADANBgkqhkiG9w0BAQsFAAOCAQEAnX/KSZ+EoS5g/V0PVhTH/twzSbU9UBzPuwKXlhFdhkyYDfNIcDw1y0pbCPaznASY6MJOKny00Ng2mxTWqsV0C9PFcdF4FiKa1MAMJEH7tibwXrfDFBv2EQeqbcgCSMeUNt6Br9xd7evmpEfNnKkjdTAdwTn6errY2IwYIErCwZsraf+qb1d4AXAlHpmGJnIeiUmIuQDbGhUblWH2Q+QAR8DSMcP4sbuXPf9tfJeZM8uDvFMrU3alC96SESFa50vYb0eSBD71AaGkiVTYmFLub2rZ6NojG6Pf//kdmHBK08eHM/abo+IQVJI7B0XmNzU0bur+XC2//tspbs+OEc20ig==" for child.
Dec 6 03:08:10 localhost certmonger[39855]: 2025-12-06 08:08:10 [39855] Setting "CERTMONGER_SPKI" to "MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAoD0bafxgZi+i18Wn6Qi1BCY5rsbIXPyVCkzv11CHrxm4Gtn16BINvlGRRk1ouTQZLZXk+97BvGAiCnrEkyf5iyVP3wh1VYSAN3ZQMd9Ui4Ejh2xAM3CIXJgu606f3g+ESjVQEhb9gnhVjGOWpY4IG3wffcQ58FfjyC0NJa8HfzTn4UEJq+K/hVYYQT7kM7cCoEiVS88BrTGyl+G2tVJ70gKziwt+uXtBmB9F3F1rcsL+MLreGJ93QX6jZVG78B8+iQ7+kIrjZB7jJLCwHQnjVcojgpRTQlOyXKn9vvHvROvXlxJiUhqsId3g7AfKOAP0L69lOzMjEOWrr2DaCVXpAwIDAQAB" for child.
Dec 6 03:08:10 localhost certmonger[39855]: 2025-12-06 08:08:10 [39855] Setting "CERTMONGER_LOCAL_CA_DIR" to "/var/lib/certmonger/local" for child.
Dec 6 03:08:10 localhost certmonger[39855]: 2025-12-06 08:08:10 [39855] Setting "CERTMONGER_KEY_TYPE" to "RSA" for child.
Dec 6 03:08:10 localhost certmonger[39855]: 2025-12-06 08:08:10 [39855] Setting "CERTMONGER_CA_NICKNAME" to "IPA" for child.
Dec 6 03:08:10 localhost certmonger[39855]: 2025-12-06 08:08:10 [39855] Redirecting stdin to /dev/null, leaving stdout and stderr open for child "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:08:10 localhost certmonger[39855]: 2025-12-06 08:08:10 [39855] Running enrollment helper "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 03:08:10 localhost certmonger[39855]: Submitting request to "https://ipa.ooo.test/ipa/json".
Dec 6 03:08:10 localhost certmonger[39855]: Certificate: "MIIFWzCCA8OgAwIBAgIBHTANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08uVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4MDgxMFoXDTI3MTIwNzA4MDgxMFowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNVBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAKA9G2n8YGYvotfFp+kItQQmOa7GyFz8lQpM79dQh68ZuBrZ9egSDb5RkUZNaLk0GS2V5PvewbxgIgp6xJMn+YslT98IdVWEgDd2UDHfVIuBI4dsQDNwiFyYLutOn94PhEo1UBIW/YJ4VYxjlqWOCBt8H33EOfBX48gtDSWvB3805+FBCaviv4VWGEE+5DO3AqBIlUvPAa0xspfhtrVSe9ICs4sLfrl7QZgfRdxda3LC/jC63hifd0F+o2VRu/AfPokO/pCK42Qe4ySwsB0J41XKI4KUU0JTslyp/b7x70Tr15cSYlIarCHd4OwHyjgD9C+vZTszIxDlq69g2glV6QMCAwEAAaOCAewwggHoMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEBBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3NwMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwcwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3JsL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVDZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFK5YjWAzIbEmIGPGXZJieTrnY8T4MIHFBgNVHREEgb0wgbqCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdKBCBgorBgEEAYI3FAIDoDQMMmxpYnZpcnQvbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoFEGBisGAQUCAqBHMEWgChsIT09PLlRFU1ShNzA1oAMCAQGhLjAsGwdsaWJ2aXJ0GyFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29vLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAMlEBaxNP4dXTgx8GolnzxNmfJU4ryTyrBmFjpkUG4WFToutO0xboNgd4UVOwM1Tss16T9ew3nPK26uEIdf/R8sCXq3GBKDFLMrkE1aKXODIlN/7Nzj5WDmfn2/q695SZbeFpLWhKy3DmVxW7k/mgb4bfLPDf4RKf3hK0SwwU7OolxeoKZxjgNxnNPLndlU62KvLqA3hyGn20HhoiQqewoUhqqRBueEi73gv/oO+oewXJPXU7j7IBDCTD9EB7C0HvmLknnf734a4BjHLPnBzv6YtEoy72pZJKE0mNxvm+u3irdkF2I5uxkTtPLgb8DcW32xKUR7I8zp4vH3HfTsv4GqYEq5BI4SKg3SH/gywi1MTtP0NpJttl/nKYoeRoulAoI7U/dLILkbAK7pQZF6EsZ1L6QSnBkeVKZ1nLJ5nzEt7vbWrin2rXKQT29ygMyQ7/n02AsM+8CDgXPaU7Ox5CYETUudQt7YQtSw7kbHT/u/jE2tEfqPzI7hxpHoJIwIptg=="
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Certificate submission still ongoing.
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Certificate submission attempt complete.
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Child status = 0.
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Child output:
Dec 6 03:08:10 localhost certmonger[37540]: "-----BEGIN CERTIFICATE-----
Dec 6 03:08:10 localhost certmonger[37540]: MIIFWzCCA8OgAwIBAgIBHTANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u
Dec 6 03:08:10 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4
Dec 6 03:08:10 localhost certmonger[37540]: MDgxMFoXDTI3MTIwNzA4MDgxMFowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV
Dec 6 03:08:10 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI
Dec 6 03:08:10 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBAKA9G2n8YGYvotfFp+kItQQmOa7GyFz8lQpM
Dec 6 03:08:10 localhost certmonger[37540]: 79dQh68ZuBrZ9egSDb5RkUZNaLk0GS2V5PvewbxgIgp6xJMn+YslT98IdVWEgDd2
Dec 6 03:08:10 localhost certmonger[37540]: UDHfVIuBI4dsQDNwiFyYLutOn94PhEo1UBIW/YJ4VYxjlqWOCBt8H33EOfBX48gt
Dec 6 03:08:10 localhost certmonger[37540]: DSWvB3805+FBCaviv4VWGEE+5DO3AqBIlUvPAa0xspfhtrVSe9ICs4sLfrl7QZgf
Dec 6 03:08:10 localhost certmonger[37540]: Rdxda3LC/jC63hifd0F+o2VRu/AfPokO/pCK42Qe4ySwsB0J41XKI4KUU0JTslyp
Dec 6 03:08:10 localhost certmonger[37540]: /b7x70Tr15cSYlIarCHd4OwHyjgD9C+vZTszIxDlq69g2glV6QMCAwEAAaOCAeww
Dec 6 03:08:10 localhost certmonger[37540]: ggHoMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB
Dec 6 03:08:10 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw
Dec 6 03:08:10 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw
Dec 6 03:08:10 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js
Dec 6 03:08:10 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD
Dec 6 03:08:10 localhost certmonger[37540]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFK5YjWAzIbEmIGPGXZJieTrn
Dec 6 03:08:10 localhost certmonger[37540]: Y8T4MIHFBgNVHREEgb0wgbqCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u
Dec 6 03:08:10 localhost certmonger[37540]: dGVzdKBCBgorBgEEAYI3FAIDoDQMMmxpYnZpcnQvbnAwMDA1NTQ4Nzk4LmludGVy
Dec 6 03:08:10 localhost certmonger[37540]: bmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoFEGBisGAQUCAqBHMEWgChsIT09PLlRF
Dec 6 03:08:10 localhost certmonger[37540]: U1ShNzA1oAMCAQGhLjAsGwdsaWJ2aXJ0GyFucDAwMDU1NDg3OTguaW50ZXJuYWxh
Dec 6 03:08:10 localhost certmonger[37540]: cGkub29vLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAMlEBaxNP4dXTgx8GolnzxNm
Dec 6 03:08:10 localhost certmonger[37540]: fJU4ryTyrBmFjpkUG4WFToutO0xboNgd4UVOwM1Tss16T9ew3nPK26uEIdf/R8sC
Dec 6 03:08:10 localhost certmonger[37540]: Xq3GBKDFLMrkE1aKXODIlN/7Nzj5WDmfn2/q695SZbeFpLWhKy3DmVxW7k/mgb4b
Dec 6 03:08:10 localhost certmonger[37540]: fLPDf4RKf3hK0SwwU7OolxeoKZxjgNxnNPLndlU62KvLqA3hyGn20HhoiQqewoUh
Dec 6 03:08:10 localhost certmonger[37540]: qqRBueEi73gv/oO+oewXJPXU7j7IBDCTD9EB7C0HvmLknnf734a4BjHLPnBzv6Yt
Dec 6 03:08:10 localhost certmonger[37540]: Eoy72pZJKE0mNxvm+u3irdkF2I5uxkTtPLgb8DcW32xKUR7I8zp4vH3HfTsv4GqY
Dec 6 03:08:10 localhost certmonger[37540]: Eq5BI4SKg3SH/gywi1MTtP0NpJttl/nKYoeRoulAoI7U/dLILkbAK7pQZF6EsZ1L
Dec 6 03:08:10 localhost certmonger[37540]: 6QSnBkeVKZ1nLJ5nzEt7vbWrin2rXKQT29ygMyQ7/n02AsM+8CDgXPaU7Ox5CYET
Dec 6 03:08:10 localhost certmonger[37540]: UudQt7YQtSw7kbHT/u/jE2tEfqPzI7hxpHoJIwIptg==
Dec 6 03:08:10 localhost certmonger[37540]: -----END CERTIFICATE-----
Dec 6 03:08:10 localhost certmonger[37540]: "
Dec 6 03:08:10 localhost certmonger[39857]: 2025-12-06 08:08:10 [39857] Postprocessing output "-----BEGIN CERTIFICATE-----
Dec 6 03:08:10 localhost certmonger[39857]: MIIFWzCCA8OgAwIBAgIBHTANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u
Dec 6 03:08:10 localhost certmonger[39857]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4
Dec 6 03:08:10 localhost certmonger[39857]: MDgxMFoXDTI3MTIwNzA4MDgxMFowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV
Dec 6 03:08:10 localhost certmonger[39857]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI
Dec 6 03:08:10 localhost certmonger[39857]: hvcNAQEBBQADggEPADCCAQoCggEBAKA9G2n8YGYvotfFp+kItQQmOa7GyFz8lQpM
Dec 6 03:08:10 localhost certmonger[39857]: 79dQh68ZuBrZ9egSDb5RkUZNaLk0GS2V5PvewbxgIgp6xJMn+YslT98IdVWEgDd2
Dec 6 03:08:10 localhost certmonger[39857]: UDHfVIuBI4dsQDNwiFyYLutOn94PhEo1UBIW/YJ4VYxjlqWOCBt8H33EOfBX48gt
Dec 6 03:08:10 localhost certmonger[39857]: DSWvB3805+FBCaviv4VWGEE+5DO3AqBIlUvPAa0xspfhtrVSe9ICs4sLfrl7QZgf
Dec 6 03:08:10 localhost certmonger[39857]: Rdxda3LC/jC63hifd0F+o2VRu/AfPokO/pCK42Qe4ySwsB0J41XKI4KUU0JTslyp
Dec 6 03:08:10 localhost certmonger[39857]: /b7x70Tr15cSYlIarCHd4OwHyjgD9C+vZTszIxDlq69g2glV6QMCAwEAAaOCAeww
Dec 6 03:08:10 localhost certmonger[39857]: ggHoMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB
Dec 6 03:08:10 localhost certmonger[39857]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw
Dec 6 03:08:10 localhost certmonger[39857]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw
Dec 6 03:08:10 localhost certmonger[39857]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js
Dec 6 03:08:10 localhost certmonger[39857]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD
Dec 6 03:08:10 localhost certmonger[39857]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFK5YjWAzIbEmIGPGXZJieTrn
Dec 6 03:08:10 localhost certmonger[39857]: Y8T4MIHFBgNVHREEgb0wgbqCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u
Dec 6 03:08:10 localhost certmonger[39857]: dGVzdKBCBgorBgEEAYI3FAIDoDQMMmxpYnZpcnQvbnAwMDA1NTQ4Nzk4LmludGVy
Dec 6 03:08:10 localhost certmonger[39857]: bmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoFEGBisGAQUCAqBHMEWgChsIT09PLlRF
Dec 6 03:08:10 localhost certmonger[39857]: U1ShNzA1oAMCAQGhLjAsGwdsaWJ2aXJ0GyFucDAwMDU1NDg3OTguaW50ZXJuYWxh
Dec 6 03:08:10 localhost certmonger[39857]: cGkub29vLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAMlEBaxNP4dXTgx8GolnzxNm
Dec 6 03:08:10 localhost certmonger[39857]: fJU4ryTyrBmFjpkUG4WFToutO0xboNgd4UVOwM1Tss16T9ew3nPK26uEIdf/R8sC
Dec 6 03:08:10 localhost certmonger[39857]: Xq3GBKDFLMrkE1aKXODIlN/7Nzj5WDmfn2/q695SZbeFpLWhKy3DmVxW7k/mgb4b
Dec 6 03:08:10 localhost certmonger[39857]: fLPDf4RKf3hK0SwwU7OolxeoKZxjgNxnNPLndlU62KvLqA3hyGn20HhoiQqewoUh
Dec 6 03:08:10 localhost certmonger[39857]: qqRBueEi73gv/oO+oewXJPXU7j7IBDCTD9EB7C0HvmLknnf734a4BjHLPnBzv6Yt
Dec 6 03:08:10 localhost certmonger[39857]: Eoy72pZJKE0mNxvm+u3irdkF2I5uxkTtPLgb8DcW32xKUR7I8zp4vH3HfTsv4GqY
Dec 6 03:08:10 localhost certmonger[39857]: Eq5BI4SKg3SH/gywi1MTtP0NpJttl/nKYoeRoulAoI7U/dLILkbAK7pQZF6EsZ1L
Dec 6 03:08:10 localhost certmonger[39857]: 6QSnBkeVKZ1nLJ5nzEt7vbWrin2rXKQT29ygMyQ7/n02AsM+8CDgXPaU7Ox5CYET
Dec 6 03:08:10 localhost certmonger[39857]: UudQt7YQtSw7kbHT/u/jE2tEfqPzI7hxpHoJIwIptg==
Dec 6 03:08:10 localhost certmonger[39857]: -----END CERTIFICATE-----
Dec 6 03:08:10 localhost certmonger[39857]: ".
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Certificate submission still ongoing.
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Certificate submission postprocessing complete.
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Child status = 0.
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Child output: Dec 6 03:08:10 localhost certmonger[37540]: "{"certificate":"-----BEGIN CERTIFICATE-----\nMIIFWzCCA8OgAwIBAgIBHTANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u\nVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4\nMDgxMFoXDTI3MTIwNzA4MDgxMFowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV\nBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI\nhvcNAQEBBQADggEPADCCAQoCggEBAKA9G2n8YGYvotfFp+kItQQmOa7GyFz8lQpM\n79dQh68ZuBrZ9egSDb5RkUZNaLk0GS2V5PvewbxgIgp6xJMn+YslT98IdVWEgDd2\nUDHfVIuBI4dsQDNwiFyYLutOn94PhEo1UBIW/YJ4VYxjlqWOCBt8H33EOfBX48gt\nDSWvB3805+FBCaviv4VWGEE+5DO3AqBIlUvPAa0xspfhtrVSe9ICs4sLfrl7QZgf\nRdxda3LC/jC63hifd0F+o2VRu/AfPokO/pCK42Qe4ySwsB0J41XKI4KUU0JTslyp\n/b7x70Tr15cSYlIarCHd4OwHyjgD9C+vZTszIxDlq69g2glV6QMCAwEAAaOCAeww\nggHoMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB\nBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw\nMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw\ncwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js\nL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD\nZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFK5YjWAzIbEmIGPGXZJieTrn\nY8T4MIHFBgNVHREEgb0wgbqCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u\ndGVzdKBCBgorBgEEAYI3FAIDoDQMMmxpYnZpcnQvbnAwMDA1NTQ4Nzk4LmludGVy\nbmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoFEGBisGAQUCAqBHMEWgChsIT09PLlRF\nU1ShNzA1oAMCAQGhLjAsGwdsaWJ2aXJ0GyFucDAwMDU1NDg3OTguaW50ZXJuYWxh\ncGkub29vLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAMlEBaxNP4dXTgx8GolnzxNm\nfJU4ryTyrBmFjpkUG4WFToutO0xboNgd4UVOwM1Tss16T9ew3nPK26uEIdf/R8sC\nXq3GBKDFLMrkE1aKXODIlN/7Nzj5WDmfn2/q695SZbeFpLWhKy3DmVxW7k/mgb4b\nfLPDf4RKf3hK0SwwU7OolxeoKZxjgNxnNPLndlU62KvLqA3hyGn20HhoiQqewoUh\nqqRBueEi73gv/oO+oewXJPXU7j7IBDCTD9EB7C0HvmLknnf734a4BjHLPnBzv6Yt\nEoy72pZJKE0mNxvm+u3irdkF2I5uxkTtPLgb8DcW32xKUR7I8zp4vH3HfTsv4GqY\nEq5BI4SKg3SH/gywi1MTtP0NpJttl/nKYoeRoulAoI7U/dLILkbAK7pQZF6EsZ1L\n6QSnBkeVKZ1nLJ5nzEt7vbWrin2rXKQT29ygMyQ7/n
02AsM+8CDgXPaU7Ox5CYET\nUudQt7YQtSw7kbHT/u/jE2tEfqPzI7hxpHoJIwIptg==\n-----END CERTIFICATE-----\n","key_checked":true} Dec 6 03:08:10 localhost certmonger[37540]: " Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Issued certificate is "-----BEGIN CERTIFICATE----- Dec 6 03:08:10 localhost certmonger[37540]: MIIFWzCCA8OgAwIBAgIBHTANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u Dec 6 03:08:10 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4 Dec 6 03:08:10 localhost certmonger[37540]: MDgxMFoXDTI3MTIwNzA4MDgxMFowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV Dec 6 03:08:10 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI Dec 6 03:08:10 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBAKA9G2n8YGYvotfFp+kItQQmOa7GyFz8lQpM Dec 6 03:08:10 localhost certmonger[37540]: 79dQh68ZuBrZ9egSDb5RkUZNaLk0GS2V5PvewbxgIgp6xJMn+YslT98IdVWEgDd2 Dec 6 03:08:10 localhost certmonger[37540]: UDHfVIuBI4dsQDNwiFyYLutOn94PhEo1UBIW/YJ4VYxjlqWOCBt8H33EOfBX48gt Dec 6 03:08:10 localhost certmonger[37540]: DSWvB3805+FBCaviv4VWGEE+5DO3AqBIlUvPAa0xspfhtrVSe9ICs4sLfrl7QZgf Dec 6 03:08:10 localhost certmonger[37540]: Rdxda3LC/jC63hifd0F+o2VRu/AfPokO/pCK42Qe4ySwsB0J41XKI4KUU0JTslyp Dec 6 03:08:10 localhost certmonger[37540]: /b7x70Tr15cSYlIarCHd4OwHyjgD9C+vZTszIxDlq69g2glV6QMCAwEAAaOCAeww Dec 6 03:08:10 localhost certmonger[37540]: ggHoMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB Dec 6 03:08:10 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw Dec 6 03:08:10 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw Dec 6 03:08:10 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js Dec 6 03:08:10 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD Dec 6 03:08:10 localhost certmonger[37540]: 
ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFK5YjWAzIbEmIGPGXZJieTrn Dec 6 03:08:10 localhost certmonger[37540]: Y8T4MIHFBgNVHREEgb0wgbqCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u Dec 6 03:08:10 localhost certmonger[37540]: dGVzdKBCBgorBgEEAYI3FAIDoDQMMmxpYnZpcnQvbnAwMDA1NTQ4Nzk4LmludGVy Dec 6 03:08:10 localhost certmonger[37540]: bmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoFEGBisGAQUCAqBHMEWgChsIT09PLlRF Dec 6 03:08:10 localhost certmonger[37540]: U1ShNzA1oAMCAQGhLjAsGwdsaWJ2aXJ0GyFucDAwMDU1NDg3OTguaW50ZXJuYWxh Dec 6 03:08:10 localhost certmonger[37540]: cGkub29vLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAMlEBaxNP4dXTgx8GolnzxNm Dec 6 03:08:10 localhost certmonger[37540]: fJU4ryTyrBmFjpkUG4WFToutO0xboNgd4UVOwM1Tss16T9ew3nPK26uEIdf/R8sC Dec 6 03:08:10 localhost certmonger[37540]: Xq3GBKDFLMrkE1aKXODIlN/7Nzj5WDmfn2/q695SZbeFpLWhKy3DmVxW7k/mgb4b Dec 6 03:08:10 localhost certmonger[37540]: fLPDf4RKf3hK0SwwU7OolxeoKZxjgNxnNPLndlU62KvLqA3hyGn20HhoiQqewoUh Dec 6 03:08:10 localhost certmonger[37540]: qqRBueEi73gv/oO+oewXJPXU7j7IBDCTD9EB7C0HvmLknnf734a4BjHLPnBzv6Yt Dec 6 03:08:10 localhost certmonger[37540]: Eoy72pZJKE0mNxvm+u3irdkF2I5uxkTtPLgb8DcW32xKUR7I8zp4vH3HfTsv4GqY Dec 6 03:08:10 localhost certmonger[37540]: Eq5BI4SKg3SH/gywi1MTtP0NpJttl/nKYoeRoulAoI7U/dLILkbAK7pQZF6EsZ1L Dec 6 03:08:10 localhost certmonger[37540]: 6QSnBkeVKZ1nLJ5nzEt7vbWrin2rXKQT29ygMyQ7/n02AsM+8CDgXPaU7Ox5CYET Dec 6 03:08:10 localhost certmonger[37540]: UudQt7YQtSw7kbHT/u/jE2tEfqPzI7hxpHoJIwIptg== Dec 6 03:08:10 localhost certmonger[37540]: -----END CERTIFICATE----- Dec 6 03:08:10 localhost certmonger[37540]: ". Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Certificate issued (0 chain certificates, 0 roots). Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] No hooks set for pre-save command. 
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost certmonger[39881]: Certificate in file "/etc/pki/tls/certs/libvirt-server-cert.crt" issued by CA and saved. 
Dec 6 03:08:10 localhost certmonger[37540]: 2025-12-06 08:08:10 [37540] Wrote to /var/lib/certmonger/requests/20251206080809 Dec 6 03:08:10 localhost python3[39897]: ansible-certificate_request Invoked with name=libvirt-client-cert dns=['np0005548798.internalapi.ooo.test'] principal=['libvirt/np0005548798.internalapi.ooo.test@OOO.TEST'] directory=/etc/pki/tls key_size=2048 wait=True run_after=# Copy cert and key to libvirt dirs#012cp /etc/pki/tls/certs/libvirt-client-cert.crt /etc/pki/libvirt/clientcert.pem#012cp /etc/pki/tls/private/libvirt-client-cert.key /etc/pki/libvirt/private/clientkey.pem#012podman exec nova_virtproxyd virt-admin server-update-tls virtproxyd || systemctl reload tripleo_nova_virtproxyd#012 ca=ipa __header=##012# Ansible managed#012##012 provider_config_directory=/etc/certmonger provider=certmonger key_usage=['digitalSignature', 'keyEncipherment'] extended_key_usage=['id-kp-serverAuth', 'id-kp-clientAuth'] auto_renew=True ip=None email=None common_name=None country=None state=None locality=None organization=None organizational_unit=None contact_email=None owner=None group=None run_before=None Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to 
/var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[39907]: 2025-12-06 08:08:11 [39907] Setting "CERTMONGER_REQ_SUBJECT" to "CN=np0005548798.internalapi.ooo.test" for child. Dec 6 03:08:11 localhost certmonger[39907]: 2025-12-06 08:08:11 [39907] Setting "CERTMONGER_REQ_HOSTNAME" to "np0005548798.internalapi.ooo.test Dec 6 03:08:11 localhost certmonger[39907]: " for child. Dec 6 03:08:11 localhost certmonger[39907]: 2025-12-06 08:08:11 [39907] Setting "CERTMONGER_REQ_PRINCIPAL" to "libvirt/np0005548798.internalapi.ooo.test@OOO.TEST Dec 6 03:08:11 localhost certmonger[39907]: " for child. Dec 6 03:08:11 localhost certmonger[39907]: 2025-12-06 08:08:11 [39907] Setting "CERTMONGER_OPERATION" to "SUBMIT" for child. 
Dec 6 03:08:11 localhost certmonger[39907]: 2025-12-06 08:08:11 [39907] Setting "CERTMONGER_CSR" to "-----BEGIN CERTIFICATE REQUEST----- Dec 6 03:08:11 localhost certmonger[39907]: MIID2DCCAsACAQAwLDEqMCgGA1UEAxMhbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBp Dec 6 03:08:11 localhost certmonger[39907]: Lm9vby50ZXN0MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAj2zYGah1 Dec 6 03:08:11 localhost certmonger[39907]: fO8tFJG7I2oDMulCR3AZFz0CGEcvCAUoto9lM9Zz3LUexGUPLBl6edBEH3croXcu Dec 6 03:08:11 localhost certmonger[39907]: 117y4mbEcN25wCAcBtIMRntdn8TBAb4vHjl5xD7oBuXkywl6JiasE+PDANzZl75q Dec 6 03:08:11 localhost certmonger[39907]: EYJ0kjZqofyyVCcMI+OLwIJgDr16E3Xf5qFOdEO0dWkVcBny9DPjdYRlfYMjaTsw Dec 6 03:08:11 localhost certmonger[39907]: 0siu8lII+7yB0TXiF7fz39cY7PzkEhKkcsqOsTklcmOcCL2GKTzmRbIE6HxWFjiM Dec 6 03:08:11 localhost certmonger[39907]: EhIZp67GXurFwMEYaE4IytdxQ93CQA9ZE3HSVs0/IMRQRMeBGGf+nqSS3L9tn3/o Dec 6 03:08:11 localhost certmonger[39907]: 09UK/rXY8jPv9QIDAQABoIIBZTArBgkqhkiG9w0BCRQxHh4cADIAMAAyADUAMQAy Dec 6 03:08:11 localhost certmonger[39907]: ADAANgAwADgAMAA4ADEAMTCCATQGCSqGSIb3DQEJDjGCASUwggEhMAsGA1UdDwQE Dec 6 03:08:11 localhost certmonger[39907]: AwIFoDCBxQYDVR0RBIG9MIG6giFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v Dec 6 03:08:11 localhost certmonger[39907]: LnRlc3SgQgYKKwYBBAGCNxQCA6A0DDJsaWJ2aXJ0L25wMDAwNTU0ODc5OC5pbnRl Dec 6 03:08:11 localhost certmonger[39907]: cm5hbGFwaS5vb28udGVzdEBPT08uVEVTVKBRBgYrBgEFAgKgRzBFoAobCE9PTy5U Dec 6 03:08:11 localhost certmonger[39907]: RVNUoTcwNaADAgEBoS4wLBsHbGlidmlydBshbnAwMDA1NTQ4Nzk4LmludGVybmFs Dec 6 03:08:11 localhost certmonger[39907]: YXBpLm9vby50ZXN0MB0GA1UdJQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjAMBgNV Dec 6 03:08:11 localhost certmonger[39907]: HRMBAf8EAjAAMB0GA1UdDgQWBBTP6q2IZGMTTAw+Q1XHvT0pf1furzANBgkqhkiG Dec 6 03:08:11 localhost certmonger[39907]: 9w0BAQsFAAOCAQEAKiH3+ssKBhw+hdC0fFhmmaXghLDcud6gFjqv23AMwcCVrwqd Dec 6 03:08:11 localhost certmonger[39907]: 4SVVS+S/uvmaAmsModH/sNn46YjacdaRqGMZ/gS+Beo+sNhpSbhtCuRmJ0NTuAj8 Dec 6 
03:08:11 localhost certmonger[39907]: Jq+J2cNM1XAAf3A5K1jSXbQPYDTFBMq5Fss6VXdUob64NR+1w7FgcD28gymaKLSd Dec 6 03:08:11 localhost certmonger[39907]: D3/vqb6l6qPGlUSMH/7/IUxz2aO0lipy1Hos/mThkuavyu0QaKlCM1vPJzTDFJfR Dec 6 03:08:11 localhost certmonger[39907]: zc1m9xoE+kLJL3qt9sMnJyw3EZODQMYa4HA6zk+vdzO2c1p4gsbBYKpU1hQKjryg Dec 6 03:08:11 localhost certmonger[39907]: 7wRQfSNEJ8mNqgG1Loug7s59p4VAYIYHOrHeNQ== Dec 6 03:08:11 localhost certmonger[39907]: -----END CERTIFICATE REQUEST----- Dec 6 03:08:11 localhost certmonger[39907]: " for child. Dec 6 03:08:11 localhost certmonger[39907]: 2025-12-06 08:08:11 [39907] Setting "CERTMONGER_SPKAC" to "MIICQDCCASgwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCPbNgZqHV87y0UkbsjagMy6UJHcBkXPQIYRy8IBSi2j2Uz1nPctR7EZQ8sGXp50EQfdyuhdy7XXvLiZsRw3bnAIBwG0gxGe12fxMEBvi8eOXnEPugG5eTLCXomJqwT48MA3NmXvmoRgnSSNmqh/LJUJwwj44vAgmAOvXoTdd/moU50Q7R1aRVwGfL0M+N1hGV9gyNpOzDSyK7yUgj7vIHRNeIXt/Pf1xjs/OQSEqRyyo6xOSVyY5wIvYYpPOZFsgTofFYWOIwSEhmnrsZe6sXAwRhoTgjK13FD3cJAD1kTcdJWzT8gxFBEx4EYZ/6epJLcv22ff+jT1Qr+tdjyM+/1AgMBAAEWADANBgkqhkiG9w0BAQsFAAOCAQEAHac5zJuCJTAER4HhuGoJ0CwHAb+i++VK8nohwMIlLltHpONAP9qOj+hd1rzHESUX+4msvefKrQeY8pkx7pIWyJTpk/h+eI4vjevIjCmMsrF3hsEUpalD+FuDvYS38AnFnu6+muvyB9YA7htJvKsY3ttLI5XUugp0kw8lYV20ciF3HZ1CFJCv6pKqhtBqG+D1gy55/9VJjpxYqWQe5pLYqs+vBLN4wbYXa94XjPojtQ8Q9NOTs1VBQslH3fjhc2Xo3QfLSmNRI3DLBGg5jTRdjFWFBePgJJZ5/+6RRdOpv+HbDUvluZGMZDRnW8u0s4oaBh+GqLYxD9SeNF7NSpGXhg==" for child. Dec 6 03:08:11 localhost certmonger[39907]: 2025-12-06 08:08:11 [39907] Setting "CERTMONGER_SPKI" to "MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAj2zYGah1fO8tFJG7I2oDMulCR3AZFz0CGEcvCAUoto9lM9Zz3LUexGUPLBl6edBEH3croXcu117y4mbEcN25wCAcBtIMRntdn8TBAb4vHjl5xD7oBuXkywl6JiasE+PDANzZl75qEYJ0kjZqofyyVCcMI+OLwIJgDr16E3Xf5qFOdEO0dWkVcBny9DPjdYRlfYMjaTsw0siu8lII+7yB0TXiF7fz39cY7PzkEhKkcsqOsTklcmOcCL2GKTzmRbIE6HxWFjiMEhIZp67GXurFwMEYaE4IytdxQ93CQA9ZE3HSVs0/IMRQRMeBGGf+nqSS3L9tn3/o09UK/rXY8jPv9QIDAQAB" for child. 
Dec 6 03:08:11 localhost certmonger[39907]: 2025-12-06 08:08:11 [39907] Setting "CERTMONGER_LOCAL_CA_DIR" to "/var/lib/certmonger/local" for child. Dec 6 03:08:11 localhost certmonger[39907]: 2025-12-06 08:08:11 [39907] Setting "CERTMONGER_KEY_TYPE" to "RSA" for child. Dec 6 03:08:11 localhost certmonger[39907]: 2025-12-06 08:08:11 [39907] Setting "CERTMONGER_CA_NICKNAME" to "IPA" for child. Dec 6 03:08:11 localhost certmonger[39907]: 2025-12-06 08:08:11 [39907] Redirecting stdin to /dev/null, leaving stdout and stderr open for child "/usr/libexec/certmonger/ipa-submit". Dec 6 03:08:11 localhost certmonger[39907]: 2025-12-06 08:08:11 [39907] Running enrollment helper "/usr/libexec/certmonger/ipa-submit". Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[39907]: Submitting request to "https://ipa.ooo.test/ipa/json". Dec 6 03:08:11 localhost certmonger[39907]: Certificate: "MIIFWzCCA8OgAwIBAgIBHjANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08uVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4MDgxMVoXDTI3MTIwNzA4MDgxMVowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNVBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAI9s2BmodXzvLRSRuyNqAzLpQkdwGRc9AhhHLwgFKLaPZTPWc9y1HsRlDywZennQRB93K6F3Ltde8uJmxHDducAgHAbSDEZ7XZ/EwQG+Lx45ecQ+6Abl5MsJeiYmrBPjwwDc2Ze+ahGCdJI2aqH8slQnDCPji8CCYA69ehN13+ahTnRDtHVpFXAZ8vQz43WEZX2DI2k7MNLIrvJSCPu8gdE14he389/XGOz85BISpHLKjrE5JXJjnAi9hik85kWyBOh8VhY4jBISGaeuxl7qxcDBGGhOCMrXcUPdwkAPWRNx0lbNPyDEUETHgRhn/p6kkty/bZ9/6NPVCv612PIz7/UCAwEAAaOCAewwggHoMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEBBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3NwMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwcwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3JsL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVDZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFHm5
PDF3H9lZSkj1lTsrRPE+C8ZwMIHFBgNVHREEgb0wgbqCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdKBCBgorBgEEAYI3FAIDoDQMMmxpYnZpcnQvbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoFEGBisGAQUCAqBHMEWgChsIT09PLlRFU1ShNzA1oAMCAQGhLjAsGwdsaWJ2aXJ0GyFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29vLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAGPW9gkI2xhCNw/+oNZZnPtzbEmWB1M94/KYsY/0gdMaSElYoV+LPJWK3gWdbvm8OP51BXDYJLjmml2MwxiuyMY0iNodRZT/1yigcr9ovChFUl05o0Or7fI+zWeCHvt/teZPj/MVwQNxEoaMRi94xNUQCL6sPA5nufGZZlx1+cRbUDHJ+9YTgHKch4xqrWUTNyICLwBEVLjFFvwXn9byX8Q9RbK+OqpAPDMNxlMv4GWxy7RAcqhdaG3UoIs5Tpq4HZMffOSpcTBU7yndxJ5fcQnRDyUqz4RJXJF8EVJe4+Wemsbb4xeFJ19vImXe+cd9nQ10oiMy/fKKZKB25ZOTqyLLKOpTvKNaWwlHf1bpy5HoRAP9dr4LXLkWj60YYxGYl5oKOtfGfeiI/jaDBVppMyduvdtrB8lZvlH1qBMqULoy7IN4HjRxTSdSfml8DjeHhlvU44am4SeE9a1WmVTup+g/RhAtlrLFz3a9Yj5+nZY2vNQSO+Bh5+rvizfFaZX56Q==" Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Certificate submission still ongoing. Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Certificate submission attempt complete. Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Child status = 0. 
Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Child output: Dec 6 03:08:11 localhost certmonger[37540]: "-----BEGIN CERTIFICATE----- Dec 6 03:08:11 localhost certmonger[37540]: MIIFWzCCA8OgAwIBAgIBHjANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u Dec 6 03:08:11 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4 Dec 6 03:08:11 localhost certmonger[37540]: MDgxMVoXDTI3MTIwNzA4MDgxMVowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV Dec 6 03:08:11 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI Dec 6 03:08:11 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBAI9s2BmodXzvLRSRuyNqAzLpQkdwGRc9AhhH Dec 6 03:08:11 localhost certmonger[37540]: LwgFKLaPZTPWc9y1HsRlDywZennQRB93K6F3Ltde8uJmxHDducAgHAbSDEZ7XZ/E Dec 6 03:08:11 localhost certmonger[37540]: wQG+Lx45ecQ+6Abl5MsJeiYmrBPjwwDc2Ze+ahGCdJI2aqH8slQnDCPji8CCYA69 Dec 6 03:08:11 localhost certmonger[37540]: ehN13+ahTnRDtHVpFXAZ8vQz43WEZX2DI2k7MNLIrvJSCPu8gdE14he389/XGOz8 Dec 6 03:08:11 localhost certmonger[37540]: 5BISpHLKjrE5JXJjnAi9hik85kWyBOh8VhY4jBISGaeuxl7qxcDBGGhOCMrXcUPd Dec 6 03:08:11 localhost certmonger[37540]: wkAPWRNx0lbNPyDEUETHgRhn/p6kkty/bZ9/6NPVCv612PIz7/UCAwEAAaOCAeww Dec 6 03:08:11 localhost certmonger[37540]: ggHoMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB Dec 6 03:08:11 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw Dec 6 03:08:11 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw Dec 6 03:08:11 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js Dec 6 03:08:11 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD Dec 6 03:08:11 localhost certmonger[37540]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFHm5PDF3H9lZSkj1lTsrRPE+ Dec 6 03:08:11 localhost certmonger[37540]: 
C8ZwMIHFBgNVHREEgb0wgbqCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u Dec 6 03:08:11 localhost certmonger[37540]: dGVzdKBCBgorBgEEAYI3FAIDoDQMMmxpYnZpcnQvbnAwMDA1NTQ4Nzk4LmludGVy Dec 6 03:08:11 localhost certmonger[37540]: bmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoFEGBisGAQUCAqBHMEWgChsIT09PLlRF Dec 6 03:08:11 localhost certmonger[37540]: U1ShNzA1oAMCAQGhLjAsGwdsaWJ2aXJ0GyFucDAwMDU1NDg3OTguaW50ZXJuYWxh Dec 6 03:08:11 localhost certmonger[37540]: cGkub29vLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAGPW9gkI2xhCNw/+oNZZnPtz Dec 6 03:08:11 localhost certmonger[37540]: bEmWB1M94/KYsY/0gdMaSElYoV+LPJWK3gWdbvm8OP51BXDYJLjmml2MwxiuyMY0 Dec 6 03:08:11 localhost certmonger[37540]: iNodRZT/1yigcr9ovChFUl05o0Or7fI+zWeCHvt/teZPj/MVwQNxEoaMRi94xNUQ Dec 6 03:08:11 localhost certmonger[37540]: CL6sPA5nufGZZlx1+cRbUDHJ+9YTgHKch4xqrWUTNyICLwBEVLjFFvwXn9byX8Q9 Dec 6 03:08:11 localhost certmonger[37540]: RbK+OqpAPDMNxlMv4GWxy7RAcqhdaG3UoIs5Tpq4HZMffOSpcTBU7yndxJ5fcQnR Dec 6 03:08:11 localhost certmonger[37540]: DyUqz4RJXJF8EVJe4+Wemsbb4xeFJ19vImXe+cd9nQ10oiMy/fKKZKB25ZOTqyLL Dec 6 03:08:11 localhost certmonger[37540]: KOpTvKNaWwlHf1bpy5HoRAP9dr4LXLkWj60YYxGYl5oKOtfGfeiI/jaDBVppMydu Dec 6 03:08:11 localhost certmonger[37540]: vdtrB8lZvlH1qBMqULoy7IN4HjRxTSdSfml8DjeHhlvU44am4SeE9a1WmVTup+g/ Dec 6 03:08:11 localhost certmonger[37540]: RhAtlrLFz3a9Yj5+nZY2vNQSO+Bh5+rvizfFaZX56Q== Dec 6 03:08:11 localhost certmonger[37540]: -----END CERTIFICATE----- Dec 6 03:08:11 localhost certmonger[37540]: " Dec 6 03:08:11 localhost certmonger[39909]: 2025-12-06 08:08:11 [39909] Postprocessing output "-----BEGIN CERTIFICATE----- Dec 6 03:08:11 localhost certmonger[39909]: MIIFWzCCA8OgAwIBAgIBHjANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u Dec 6 03:08:11 localhost certmonger[39909]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4 Dec 6 03:08:11 localhost certmonger[39909]: MDgxMVoXDTI3MTIwNzA4MDgxMVowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV Dec 6 03:08:11 localhost certmonger[39909]: 
BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI Dec 6 03:08:11 localhost certmonger[39909]: hvcNAQEBBQADggEPADCCAQoCggEBAI9s2BmodXzvLRSRuyNqAzLpQkdwGRc9AhhH Dec 6 03:08:11 localhost certmonger[39909]: LwgFKLaPZTPWc9y1HsRlDywZennQRB93K6F3Ltde8uJmxHDducAgHAbSDEZ7XZ/E Dec 6 03:08:11 localhost certmonger[39909]: wQG+Lx45ecQ+6Abl5MsJeiYmrBPjwwDc2Ze+ahGCdJI2aqH8slQnDCPji8CCYA69 Dec 6 03:08:11 localhost certmonger[39909]: ehN13+ahTnRDtHVpFXAZ8vQz43WEZX2DI2k7MNLIrvJSCPu8gdE14he389/XGOz8 Dec 6 03:08:11 localhost certmonger[39909]: 5BISpHLKjrE5JXJjnAi9hik85kWyBOh8VhY4jBISGaeuxl7qxcDBGGhOCMrXcUPd Dec 6 03:08:11 localhost certmonger[39909]: wkAPWRNx0lbNPyDEUETHgRhn/p6kkty/bZ9/6NPVCv612PIz7/UCAwEAAaOCAeww Dec 6 03:08:11 localhost certmonger[39909]: ggHoMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB Dec 6 03:08:11 localhost certmonger[39909]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw Dec 6 03:08:11 localhost certmonger[39909]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw Dec 6 03:08:11 localhost certmonger[39909]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js Dec 6 03:08:11 localhost certmonger[39909]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD Dec 6 03:08:11 localhost certmonger[39909]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFHm5PDF3H9lZSkj1lTsrRPE+ Dec 6 03:08:11 localhost certmonger[39909]: C8ZwMIHFBgNVHREEgb0wgbqCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u Dec 6 03:08:11 localhost certmonger[39909]: dGVzdKBCBgorBgEEAYI3FAIDoDQMMmxpYnZpcnQvbnAwMDA1NTQ4Nzk4LmludGVy Dec 6 03:08:11 localhost certmonger[39909]: bmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoFEGBisGAQUCAqBHMEWgChsIT09PLlRF Dec 6 03:08:11 localhost certmonger[39909]: U1ShNzA1oAMCAQGhLjAsGwdsaWJ2aXJ0GyFucDAwMDU1NDg3OTguaW50ZXJuYWxh Dec 6 03:08:11 localhost certmonger[39909]: cGkub29vLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAGPW9gkI2xhCNw/+oNZZnPtz Dec 6 03:08:11 localhost certmonger[39909]: 
bEmWB1M94/KYsY/0gdMaSElYoV+LPJWK3gWdbvm8OP51BXDYJLjmml2MwxiuyMY0 Dec 6 03:08:11 localhost certmonger[39909]: iNodRZT/1yigcr9ovChFUl05o0Or7fI+zWeCHvt/teZPj/MVwQNxEoaMRi94xNUQ Dec 6 03:08:11 localhost certmonger[39909]: CL6sPA5nufGZZlx1+cRbUDHJ+9YTgHKch4xqrWUTNyICLwBEVLjFFvwXn9byX8Q9 Dec 6 03:08:11 localhost certmonger[39909]: RbK+OqpAPDMNxlMv4GWxy7RAcqhdaG3UoIs5Tpq4HZMffOSpcTBU7yndxJ5fcQnR Dec 6 03:08:11 localhost certmonger[39909]: DyUqz4RJXJF8EVJe4+Wemsbb4xeFJ19vImXe+cd9nQ10oiMy/fKKZKB25ZOTqyLL Dec 6 03:08:11 localhost certmonger[39909]: KOpTvKNaWwlHf1bpy5HoRAP9dr4LXLkWj60YYxGYl5oKOtfGfeiI/jaDBVppMydu Dec 6 03:08:11 localhost certmonger[39909]: vdtrB8lZvlH1qBMqULoy7IN4HjRxTSdSfml8DjeHhlvU44am4SeE9a1WmVTup+g/ Dec 6 03:08:11 localhost certmonger[39909]: RhAtlrLFz3a9Yj5+nZY2vNQSO+Bh5+rvizfFaZX56Q== Dec 6 03:08:11 localhost certmonger[39909]: -----END CERTIFICATE----- Dec 6 03:08:11 localhost certmonger[39909]: ". Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Certificate submission still ongoing. Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Certificate submission postprocessing complete. Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Child status = 0. 
Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Child output: Dec 6 03:08:11 localhost certmonger[37540]: "{"certificate":"-----BEGIN CERTIFICATE-----\nMIIFWzCCA8OgAwIBAgIBHjANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u\nVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4\nMDgxMVoXDTI3MTIwNzA4MDgxMVowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV\nBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI\nhvcNAQEBBQADggEPADCCAQoCggEBAI9s2BmodXzvLRSRuyNqAzLpQkdwGRc9AhhH\nLwgFKLaPZTPWc9y1HsRlDywZennQRB93K6F3Ltde8uJmxHDducAgHAbSDEZ7XZ/E\nwQG+Lx45ecQ+6Abl5MsJeiYmrBPjwwDc2Ze+ahGCdJI2aqH8slQnDCPji8CCYA69\nehN13+ahTnRDtHVpFXAZ8vQz43WEZX2DI2k7MNLIrvJSCPu8gdE14he389/XGOz8\n5BISpHLKjrE5JXJjnAi9hik85kWyBOh8VhY4jBISGaeuxl7qxcDBGGhOCMrXcUPd\nwkAPWRNx0lbNPyDEUETHgRhn/p6kkty/bZ9/6NPVCv612PIz7/UCAwEAAaOCAeww\nggHoMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB\nBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw\nMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw\ncwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js\nL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD\nZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFHm5PDF3H9lZSkj1lTsrRPE+\nC8ZwMIHFBgNVHREEgb0wgbqCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u\ndGVzdKBCBgorBgEEAYI3FAIDoDQMMmxpYnZpcnQvbnAwMDA1NTQ4Nzk4LmludGVy\nbmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoFEGBisGAQUCAqBHMEWgChsIT09PLlRF\nU1ShNzA1oAMCAQGhLjAsGwdsaWJ2aXJ0GyFucDAwMDU1NDg3OTguaW50ZXJuYWxh\ncGkub29vLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAGPW9gkI2xhCNw/+oNZZnPtz\nbEmWB1M94/KYsY/0gdMaSElYoV+LPJWK3gWdbvm8OP51BXDYJLjmml2MwxiuyMY0\niNodRZT/1yigcr9ovChFUl05o0Or7fI+zWeCHvt/teZPj/MVwQNxEoaMRi94xNUQ\nCL6sPA5nufGZZlx1+cRbUDHJ+9YTgHKch4xqrWUTNyICLwBEVLjFFvwXn9byX8Q9\nRbK+OqpAPDMNxlMv4GWxy7RAcqhdaG3UoIs5Tpq4HZMffOSpcTBU7yndxJ5fcQnR\nDyUqz4RJXJF8EVJe4+Wemsbb4xeFJ19vImXe+cd9nQ10oiMy/fKKZKB25ZOTqyLL\nKOpTvKNaWwlHf1bpy5HoRAP9dr4LXLkWj60YYxGYl5oKOtfGfeiI/jaDBVppMydu\nvdtrB8lZvlH1qBMqULoy7IN4HjRxTSdSfml8DjeHhl
vU44am4SeE9a1WmVTup+g/\nRhAtlrLFz3a9Yj5+nZY2vNQSO+Bh5+rvizfFaZX56Q==\n-----END CERTIFICATE-----\n","key_checked":true} Dec 6 03:08:11 localhost certmonger[37540]: " Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Issued certificate is "-----BEGIN CERTIFICATE----- Dec 6 03:08:11 localhost certmonger[37540]: MIIFWzCCA8OgAwIBAgIBHjANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u Dec 6 03:08:11 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4 Dec 6 03:08:11 localhost certmonger[37540]: MDgxMVoXDTI3MTIwNzA4MDgxMVowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV Dec 6 03:08:11 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI Dec 6 03:08:11 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBAI9s2BmodXzvLRSRuyNqAzLpQkdwGRc9AhhH Dec 6 03:08:11 localhost certmonger[37540]: LwgFKLaPZTPWc9y1HsRlDywZennQRB93K6F3Ltde8uJmxHDducAgHAbSDEZ7XZ/E Dec 6 03:08:11 localhost certmonger[37540]: wQG+Lx45ecQ+6Abl5MsJeiYmrBPjwwDc2Ze+ahGCdJI2aqH8slQnDCPji8CCYA69 Dec 6 03:08:11 localhost certmonger[37540]: ehN13+ahTnRDtHVpFXAZ8vQz43WEZX2DI2k7MNLIrvJSCPu8gdE14he389/XGOz8 Dec 6 03:08:11 localhost certmonger[37540]: 5BISpHLKjrE5JXJjnAi9hik85kWyBOh8VhY4jBISGaeuxl7qxcDBGGhOCMrXcUPd Dec 6 03:08:11 localhost certmonger[37540]: wkAPWRNx0lbNPyDEUETHgRhn/p6kkty/bZ9/6NPVCv612PIz7/UCAwEAAaOCAeww Dec 6 03:08:11 localhost certmonger[37540]: ggHoMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB Dec 6 03:08:11 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw Dec 6 03:08:11 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw Dec 6 03:08:11 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js Dec 6 03:08:11 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD Dec 6 03:08:11 localhost certmonger[37540]: 
ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFHm5PDF3H9lZSkj1lTsrRPE+ Dec 6 03:08:11 localhost certmonger[37540]: C8ZwMIHFBgNVHREEgb0wgbqCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u Dec 6 03:08:11 localhost certmonger[37540]: dGVzdKBCBgorBgEEAYI3FAIDoDQMMmxpYnZpcnQvbnAwMDA1NTQ4Nzk4LmludGVy Dec 6 03:08:11 localhost certmonger[37540]: bmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoFEGBisGAQUCAqBHMEWgChsIT09PLlRF Dec 6 03:08:11 localhost certmonger[37540]: U1ShNzA1oAMCAQGhLjAsGwdsaWJ2aXJ0GyFucDAwMDU1NDg3OTguaW50ZXJuYWxh Dec 6 03:08:11 localhost certmonger[37540]: cGkub29vLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAGPW9gkI2xhCNw/+oNZZnPtz Dec 6 03:08:11 localhost certmonger[37540]: bEmWB1M94/KYsY/0gdMaSElYoV+LPJWK3gWdbvm8OP51BXDYJLjmml2MwxiuyMY0 Dec 6 03:08:11 localhost certmonger[37540]: iNodRZT/1yigcr9ovChFUl05o0Or7fI+zWeCHvt/teZPj/MVwQNxEoaMRi94xNUQ Dec 6 03:08:11 localhost certmonger[37540]: CL6sPA5nufGZZlx1+cRbUDHJ+9YTgHKch4xqrWUTNyICLwBEVLjFFvwXn9byX8Q9 Dec 6 03:08:11 localhost certmonger[37540]: RbK+OqpAPDMNxlMv4GWxy7RAcqhdaG3UoIs5Tpq4HZMffOSpcTBU7yndxJ5fcQnR Dec 6 03:08:11 localhost certmonger[37540]: DyUqz4RJXJF8EVJe4+Wemsbb4xeFJ19vImXe+cd9nQ10oiMy/fKKZKB25ZOTqyLL Dec 6 03:08:11 localhost certmonger[37540]: KOpTvKNaWwlHf1bpy5HoRAP9dr4LXLkWj60YYxGYl5oKOtfGfeiI/jaDBVppMydu Dec 6 03:08:11 localhost certmonger[37540]: vdtrB8lZvlH1qBMqULoy7IN4HjRxTSdSfml8DjeHhlvU44am4SeE9a1WmVTup+g/ Dec 6 03:08:11 localhost certmonger[37540]: RhAtlrLFz3a9Yj5+nZY2vNQSO+Bh5+rvizfFaZX56Q== Dec 6 03:08:11 localhost certmonger[37540]: -----END CERTIFICATE----- Dec 6 03:08:11 localhost certmonger[37540]: ". Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Certificate issued (0 chain certificates, 0 roots). Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] No hooks set for pre-save command. 
Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:11 localhost certmonger[39930]: Certificate in file "/etc/pki/tls/certs/libvirt-client-cert.crt" issued by CA and saved. 
Dec 6 03:08:11 localhost certmonger[37540]: 2025-12-06 08:08:11 [37540] Wrote to /var/lib/certmonger/requests/20251206080811 Dec 6 03:08:12 localhost python3[39946]: ansible-certificate_request Invoked with name=qemu-server-cert dns=['np0005548798.internalapi.ooo.test'] owner=root group=qemu principal=['qemu/np0005548798.internalapi.ooo.test@OOO.TEST'] directory=/etc/pki/tls key_size=2048 wait=True run_after=# Copy cert and key to qemu dir#012cp /etc/ipa/ca.crt /etc/pki/qemu/ca-cert.pem#012chown root:root /etc/pki/qemu/ca-cert.pem#012chmod 644 /etc/pki/qemu/ca-cert.pem#012cp -a /etc/pki/tls/certs/qemu-server-cert.crt /etc/pki/qemu/server-cert.pem#012cp -a /etc/pki/tls/private/qemu-server-cert.key /etc/pki/qemu/server-key.pem#012chgrp qemu /etc/pki/qemu/server-*#012chmod 0640 /etc/pki/qemu/server-cert.pem#012chmod 0640 /etc/pki/qemu/server-key.pem#012 ca=ipa __header=##012# Ansible managed#012##012 provider_config_directory=/etc/certmonger provider=certmonger key_usage=['digitalSignature', 'keyEncipherment'] extended_key_usage=['id-kp-serverAuth', 'id-kp-clientAuth'] auto_renew=True ip=None email=None common_name=None country=None state=None locality=None organization=None organizational_unit=None contact_email=None run_before=None Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to 
/var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[39956]: 2025-12-06 08:08:12 [39956] Setting "CERTMONGER_REQ_SUBJECT" to "CN=np0005548798.internalapi.ooo.test" for child. Dec 6 03:08:12 localhost certmonger[39956]: 2025-12-06 08:08:12 [39956] Setting "CERTMONGER_REQ_HOSTNAME" to "np0005548798.internalapi.ooo.test Dec 6 03:08:12 localhost certmonger[39956]: " for child. Dec 6 03:08:12 localhost certmonger[39956]: 2025-12-06 08:08:12 [39956] Setting "CERTMONGER_REQ_PRINCIPAL" to "qemu/np0005548798.internalapi.ooo.test@OOO.TEST Dec 6 03:08:12 localhost certmonger[39956]: " for child. 
Dec 6 03:08:12 localhost certmonger[39956]: 2025-12-06 08:08:12 [39956] Setting "CERTMONGER_OPERATION" to "SUBMIT" for child. Dec 6 03:08:12 localhost certmonger[39956]: 2025-12-06 08:08:12 [39956] Setting "CERTMONGER_CSR" to "-----BEGIN CERTIFICATE REQUEST----- Dec 6 03:08:12 localhost certmonger[39956]: MIID0jCCAroCAQAwLDEqMCgGA1UEAxMhbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBp Dec 6 03:08:12 localhost certmonger[39956]: Lm9vby50ZXN0MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAtWhdJuVW Dec 6 03:08:12 localhost certmonger[39956]: 9mfUvXVJtq98P5bOPpwnN2sMUJw1pswt5zc2EBhW5Rfu8f/aNY6LAJYiMDs+AdTN Dec 6 03:08:12 localhost certmonger[39956]: hRlhVok9OuJcGtVIhRgkszPIzfNaXS5mmP6I2Aioo02XsxzjaFEDLp4f1wVjx5k+ Dec 6 03:08:12 localhost certmonger[39956]: M6zd+cb4uYM/vubwrvmfxtZgpJWQtcTtv6RdD5U6DysrUiQbP1FEF4FdD8LQbr3u Dec 6 03:08:12 localhost certmonger[39956]: 3ou6/oqFTxGW2m14JQ4QWlF1LVhgfBx1ZyGOo040dnNtWpuMGujUl4FjLibUWywK Dec 6 03:08:12 localhost certmonger[39956]: FUy/5FRn4aDYSY5qcXCuykipnPG42KyCkZzgL0OvewDCpleiGyP11peUEHVsxL+e Dec 6 03:08:12 localhost certmonger[39956]: dFt9Z9/mRX0JUQIDAQABoIIBXzArBgkqhkiG9w0BCRQxHh4cADIAMAAyADUAMQAy Dec 6 03:08:12 localhost certmonger[39956]: ADAANgAwADgAMAA4ADEAMjCCAS4GCSqGSIb3DQEJDjGCAR8wggEbMAsGA1UdDwQE Dec 6 03:08:12 localhost certmonger[39956]: AwIFoDCBvwYDVR0RBIG3MIG0giFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v Dec 6 03:08:12 localhost certmonger[39956]: LnRlc3SgPwYKKwYBBAGCNxQCA6AxDC9xZW11L25wMDAwNTU0ODc5OC5pbnRlcm5h Dec 6 03:08:12 localhost certmonger[39956]: bGFwaS5vb28udGVzdEBPT08uVEVTVKBOBgYrBgEFAgKgRDBCoAobCE9PTy5URVNU Dec 6 03:08:12 localhost certmonger[39956]: oTQwMqADAgEBoSswKRsEcWVtdRshbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBpLm9v Dec 6 03:08:12 localhost certmonger[39956]: by50ZXN0MB0GA1UdJQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjAMBgNVHRMBAf8E Dec 6 03:08:12 localhost certmonger[39956]: AjAAMB0GA1UdDgQWBBSo6vuy02wNJjPIDPcHqjCJcXFSujANBgkqhkiG9w0BAQsF Dec 6 03:08:12 localhost certmonger[39956]: 
AAOCAQEAbSLmo8Fsm6iYUeAWMj8iJNr+iDIJWtAbv4WIeEozPsHJgVMFUN0zKyUB Dec 6 03:08:12 localhost certmonger[39956]: 2EJvZnqawjOruYH4kaTA10ul0QRFD5o+2/VZ9a9+Cl/RTrgkq0kd766DTGw3ZhK/ Dec 6 03:08:12 localhost certmonger[39956]: VwVd7w+HG0VEgFmyfyYBTmfJ8jMUhwPqPBRBwO84YU7hZnbLupudJICb7MOGwNIQ Dec 6 03:08:12 localhost certmonger[39956]: f5Rb7Yjpz6345bRTOSP1tlI6F2T8LCsNs/bee8YRjKlyrlPj61uNlCO0I0dFYf0T Dec 6 03:08:12 localhost certmonger[39956]: D0N5BPc9guSJXWyTF60fX/+HokXgOJudeV2zjY0bBifvPNmEpVAZmGlw7JvN5XoJ Dec 6 03:08:12 localhost certmonger[39956]: 9jzkzhbEPijEmoLq0LOaE7iZnGQ42Q== Dec 6 03:08:12 localhost certmonger[39956]: -----END CERTIFICATE REQUEST----- Dec 6 03:08:12 localhost certmonger[39956]: " for child. Dec 6 03:08:12 localhost certmonger[39956]: 2025-12-06 08:08:12 [39956] Setting "CERTMONGER_SPKAC" to "MIICQDCCASgwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC1aF0m5Vb2Z9S9dUm2r3w/ls4+nCc3awxQnDWmzC3nNzYQGFblF+7x/9o1josAliIwOz4B1M2FGWFWiT064lwa1UiFGCSzM8jN81pdLmaY/ojYCKijTZezHONoUQMunh/XBWPHmT4zrN35xvi5gz++5vCu+Z/G1mCklZC1xO2/pF0PlToPKytSJBs/UUQXgV0PwtBuve7ei7r+ioVPEZbabXglDhBaUXUtWGB8HHVnIY6jTjR2c21am4wa6NSXgWMuJtRbLAoVTL/kVGfhoNhJjmpxcK7KSKmc8bjYrIKRnOAvQ697AMKmV6IbI/XWl5QQdWzEv550W31n3+ZFfQlRAgMBAAEWADANBgkqhkiG9w0BAQsFAAOCAQEAJZE5Zpon14WA3j7O1Ho3out0b3LENrrUth77bZAlMjTBc9gBsXigLTFlBV9FsjNiQfBTLZYGUt6AeRqlq5DZdMeiUXO1KEuqqbvX8EZqvnUaoJ6LbGYCbrCSJ4LUQf5Q2hNWfCDgKGHtxpSAziYdtcFm4AXscE1/dmf2l2YV9gT1kF8MISG6Jg/u15LNnAoNEc394BqvBSUM5sTUnBXFo4zlVJamP7qPQxaPiSGZTNTiJEV3KhiuLZE3IGs3l6Y1kHevuSO+Uug1TED8HYGzaV9bMjZ92FqsqK7Ye4uJ+Sn4G0zZpSqkvt9k7lcq0KPaHoCHzYqmV2AjPU0ZbpOxZw==" for child. 
Dec 6 03:08:12 localhost certmonger[39956]: 2025-12-06 08:08:12 [39956] Setting "CERTMONGER_SPKI" to "MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAtWhdJuVW9mfUvXVJtq98P5bOPpwnN2sMUJw1pswt5zc2EBhW5Rfu8f/aNY6LAJYiMDs+AdTNhRlhVok9OuJcGtVIhRgkszPIzfNaXS5mmP6I2Aioo02XsxzjaFEDLp4f1wVjx5k+M6zd+cb4uYM/vubwrvmfxtZgpJWQtcTtv6RdD5U6DysrUiQbP1FEF4FdD8LQbr3u3ou6/oqFTxGW2m14JQ4QWlF1LVhgfBx1ZyGOo040dnNtWpuMGujUl4FjLibUWywKFUy/5FRn4aDYSY5qcXCuykipnPG42KyCkZzgL0OvewDCpleiGyP11peUEHVsxL+edFt9Z9/mRX0JUQIDAQAB" for child. Dec 6 03:08:12 localhost certmonger[39956]: 2025-12-06 08:08:12 [39956] Setting "CERTMONGER_LOCAL_CA_DIR" to "/var/lib/certmonger/local" for child. Dec 6 03:08:12 localhost certmonger[39956]: 2025-12-06 08:08:12 [39956] Setting "CERTMONGER_KEY_TYPE" to "RSA" for child. Dec 6 03:08:12 localhost certmonger[39956]: 2025-12-06 08:08:12 [39956] Setting "CERTMONGER_CA_NICKNAME" to "IPA" for child. Dec 6 03:08:12 localhost certmonger[39956]: 2025-12-06 08:08:12 [39956] Redirecting stdin to /dev/null, leaving stdout and stderr open for child "/usr/libexec/certmonger/ipa-submit". Dec 6 03:08:12 localhost certmonger[39956]: 2025-12-06 08:08:12 [39956] Running enrollment helper "/usr/libexec/certmonger/ipa-submit". Dec 6 03:08:12 localhost certmonger[37540]: 2025-12-06 08:08:12 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:12 localhost certmonger[39956]: Submitting request to "https://ipa.ooo.test/ipa/json". 
Dec 6 03:08:13 localhost certmonger[39956]: Certificate: "MIIFVTCCA72gAwIBAgIBITANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08uVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4MDgxM1oXDTI3MTIwNzA4MDgxM1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNVBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBALVoXSblVvZn1L11SbavfD+Wzj6cJzdrDFCcNabMLec3NhAYVuUX7vH/2jWOiwCWIjA7PgHUzYUZYVaJPTriXBrVSIUYJLMzyM3zWl0uZpj+iNgIqKNNl7Mc42hRAy6eH9cFY8eZPjOs3fnG+LmDP77m8K75n8bWYKSVkLXE7b+kXQ+VOg8rK1IkGz9RRBeBXQ/C0G697t6Luv6KhU8RltpteCUOEFpRdS1YYHwcdWchjqNONHZzbVqbjBro1JeBYy4m1FssChVMv+RUZ+Gg2EmOanFwrspIqZzxuNisgpGc4C9Dr3sAwqZXohsj9daXlBB1bMS/nnRbfWff5kV9CVECAwEAAaOCAeYwggHiMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEBBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3NwMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwcwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3JsL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVDZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFHUnxOv38frgD9x62sHLAx7Xm9AKMIG/BgNVHREEgbcwgbSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdKA/BgorBgEEAYI3FAIDoDEML3FlbXUvbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoE4GBisGAQUCAqBEMEKgChsIT09PLlRFU1ShNDAyoAMCAQGhKzApGwRxZW11GyFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29vLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAIR+5MEO1D8sDxHE2Yhy/egp721cPZTxdFMo8mgaqjwRYo5unPK7t11kkasKGR7trxPOUrEiIxYAVkvifOzM3dYSg4uJu82RN7+N1Oh7pyvOuyCi0iEmzyIHd7OB9m9O1zXX+w1nbuGgk4vxIb7gXBOdC9sGPyTqqUIFo7iALV+1RO1LowPZIW5JvvfXcUgomDUCPGjmScT3AN1wO7MGFGP6MFpm2oe22GmYkrlTCVUYnnwV3llxqNKhVpsxw2UngWekkw/pPBs7dtGg1xtu8xEvRGvF2EVSPjHj77EOeEKmXxI7TXIkZ2PcP1gYv02VLI54bPPKEZdk5SG/QJge12i4oxOujA6pu4IpgeTmiz5sucfZdzVYzHqTEn6mF6TTTfdSI8WVb8z42drW/mu6wASZKcrYJdEo+KLS43dL2sSn3PbNsLbunSkfj5WnjCnBNeXy0klQKdejYimcBTIJblnPGX24TAY9DuK3E6vyN2bVygBzyh/vRoOX+egB6Zj1Ow==" Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Certificate submission still ongoing. 
Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Certificate submission attempt complete. Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Child status = 0. Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Child output: Dec 6 03:08:13 localhost certmonger[37540]: "-----BEGIN CERTIFICATE----- Dec 6 03:08:13 localhost certmonger[37540]: MIIFVTCCA72gAwIBAgIBITANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u Dec 6 03:08:13 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4 Dec 6 03:08:13 localhost certmonger[37540]: MDgxM1oXDTI3MTIwNzA4MDgxM1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV Dec 6 03:08:13 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI Dec 6 03:08:13 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBALVoXSblVvZn1L11SbavfD+Wzj6cJzdrDFCc Dec 6 03:08:13 localhost certmonger[37540]: NabMLec3NhAYVuUX7vH/2jWOiwCWIjA7PgHUzYUZYVaJPTriXBrVSIUYJLMzyM3z Dec 6 03:08:13 localhost certmonger[37540]: Wl0uZpj+iNgIqKNNl7Mc42hRAy6eH9cFY8eZPjOs3fnG+LmDP77m8K75n8bWYKSV Dec 6 03:08:13 localhost certmonger[37540]: kLXE7b+kXQ+VOg8rK1IkGz9RRBeBXQ/C0G697t6Luv6KhU8RltpteCUOEFpRdS1Y Dec 6 03:08:13 localhost certmonger[37540]: YHwcdWchjqNONHZzbVqbjBro1JeBYy4m1FssChVMv+RUZ+Gg2EmOanFwrspIqZzx Dec 6 03:08:13 localhost certmonger[37540]: uNisgpGc4C9Dr3sAwqZXohsj9daXlBB1bMS/nnRbfWff5kV9CVECAwEAAaOCAeYw Dec 6 03:08:13 localhost certmonger[37540]: ggHiMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB Dec 6 03:08:13 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw Dec 6 03:08:13 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw Dec 6 03:08:13 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js Dec 6 03:08:13 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD 
Dec 6 03:08:13 localhost certmonger[37540]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFHUnxOv38frgD9x62sHLAx7X Dec 6 03:08:13 localhost certmonger[37540]: m9AKMIG/BgNVHREEgbcwgbSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u Dec 6 03:08:13 localhost certmonger[37540]: dGVzdKA/BgorBgEEAYI3FAIDoDEML3FlbXUvbnAwMDA1NTQ4Nzk4LmludGVybmFs Dec 6 03:08:13 localhost certmonger[37540]: YXBpLm9vby50ZXN0QE9PTy5URVNUoE4GBisGAQUCAqBEMEKgChsIT09PLlRFU1Sh Dec 6 03:08:13 localhost certmonger[37540]: NDAyoAMCAQGhKzApGwRxZW11GyFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v Dec 6 03:08:13 localhost certmonger[37540]: LnRlc3QwDQYJKoZIhvcNAQELBQADggGBAIR+5MEO1D8sDxHE2Yhy/egp721cPZTx Dec 6 03:08:13 localhost certmonger[37540]: dFMo8mgaqjwRYo5unPK7t11kkasKGR7trxPOUrEiIxYAVkvifOzM3dYSg4uJu82R Dec 6 03:08:13 localhost certmonger[37540]: N7+N1Oh7pyvOuyCi0iEmzyIHd7OB9m9O1zXX+w1nbuGgk4vxIb7gXBOdC9sGPyTq Dec 6 03:08:13 localhost certmonger[37540]: qUIFo7iALV+1RO1LowPZIW5JvvfXcUgomDUCPGjmScT3AN1wO7MGFGP6MFpm2oe2 Dec 6 03:08:13 localhost certmonger[37540]: 2GmYkrlTCVUYnnwV3llxqNKhVpsxw2UngWekkw/pPBs7dtGg1xtu8xEvRGvF2EVS Dec 6 03:08:13 localhost certmonger[37540]: PjHj77EOeEKmXxI7TXIkZ2PcP1gYv02VLI54bPPKEZdk5SG/QJge12i4oxOujA6p Dec 6 03:08:13 localhost certmonger[37540]: u4IpgeTmiz5sucfZdzVYzHqTEn6mF6TTTfdSI8WVb8z42drW/mu6wASZKcrYJdEo Dec 6 03:08:13 localhost certmonger[37540]: +KLS43dL2sSn3PbNsLbunSkfj5WnjCnBNeXy0klQKdejYimcBTIJblnPGX24TAY9 Dec 6 03:08:13 localhost certmonger[37540]: DuK3E6vyN2bVygBzyh/vRoOX+egB6Zj1Ow== Dec 6 03:08:13 localhost certmonger[37540]: -----END CERTIFICATE----- Dec 6 03:08:13 localhost certmonger[37540]: " Dec 6 03:08:13 localhost certmonger[39958]: 2025-12-06 08:08:13 [39958] Postprocessing output "-----BEGIN CERTIFICATE----- Dec 6 03:08:13 localhost certmonger[39958]: MIIFVTCCA72gAwIBAgIBITANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u Dec 6 03:08:13 localhost certmonger[39958]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4 Dec 6 03:08:13 localhost certmonger[39958]: 
MDgxM1oXDTI3MTIwNzA4MDgxM1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV Dec 6 03:08:13 localhost certmonger[39958]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI Dec 6 03:08:13 localhost certmonger[39958]: hvcNAQEBBQADggEPADCCAQoCggEBALVoXSblVvZn1L11SbavfD+Wzj6cJzdrDFCc Dec 6 03:08:13 localhost certmonger[39958]: NabMLec3NhAYVuUX7vH/2jWOiwCWIjA7PgHUzYUZYVaJPTriXBrVSIUYJLMzyM3z Dec 6 03:08:13 localhost certmonger[39958]: Wl0uZpj+iNgIqKNNl7Mc42hRAy6eH9cFY8eZPjOs3fnG+LmDP77m8K75n8bWYKSV Dec 6 03:08:13 localhost certmonger[39958]: kLXE7b+kXQ+VOg8rK1IkGz9RRBeBXQ/C0G697t6Luv6KhU8RltpteCUOEFpRdS1Y Dec 6 03:08:13 localhost certmonger[39958]: YHwcdWchjqNONHZzbVqbjBro1JeBYy4m1FssChVMv+RUZ+Gg2EmOanFwrspIqZzx Dec 6 03:08:13 localhost certmonger[39958]: uNisgpGc4C9Dr3sAwqZXohsj9daXlBB1bMS/nnRbfWff5kV9CVECAwEAAaOCAeYw Dec 6 03:08:13 localhost certmonger[39958]: ggHiMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB Dec 6 03:08:13 localhost certmonger[39958]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw Dec 6 03:08:13 localhost certmonger[39958]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw Dec 6 03:08:13 localhost certmonger[39958]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js Dec 6 03:08:13 localhost certmonger[39958]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD Dec 6 03:08:13 localhost certmonger[39958]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFHUnxOv38frgD9x62sHLAx7X Dec 6 03:08:13 localhost certmonger[39958]: m9AKMIG/BgNVHREEgbcwgbSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u Dec 6 03:08:13 localhost certmonger[39958]: dGVzdKA/BgorBgEEAYI3FAIDoDEML3FlbXUvbnAwMDA1NTQ4Nzk4LmludGVybmFs Dec 6 03:08:13 localhost certmonger[39958]: YXBpLm9vby50ZXN0QE9PTy5URVNUoE4GBisGAQUCAqBEMEKgChsIT09PLlRFU1Sh Dec 6 03:08:13 localhost certmonger[39958]: NDAyoAMCAQGhKzApGwRxZW11GyFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v Dec 6 03:08:13 localhost certmonger[39958]: 
LnRlc3QwDQYJKoZIhvcNAQELBQADggGBAIR+5MEO1D8sDxHE2Yhy/egp721cPZTx Dec 6 03:08:13 localhost certmonger[39958]: dFMo8mgaqjwRYo5unPK7t11kkasKGR7trxPOUrEiIxYAVkvifOzM3dYSg4uJu82R Dec 6 03:08:13 localhost certmonger[39958]: N7+N1Oh7pyvOuyCi0iEmzyIHd7OB9m9O1zXX+w1nbuGgk4vxIb7gXBOdC9sGPyTq Dec 6 03:08:13 localhost certmonger[39958]: qUIFo7iALV+1RO1LowPZIW5JvvfXcUgomDUCPGjmScT3AN1wO7MGFGP6MFpm2oe2 Dec 6 03:08:13 localhost certmonger[39958]: 2GmYkrlTCVUYnnwV3llxqNKhVpsxw2UngWekkw/pPBs7dtGg1xtu8xEvRGvF2EVS Dec 6 03:08:13 localhost certmonger[39958]: PjHj77EOeEKmXxI7TXIkZ2PcP1gYv02VLI54bPPKEZdk5SG/QJge12i4oxOujA6p Dec 6 03:08:13 localhost certmonger[39958]: u4IpgeTmiz5sucfZdzVYzHqTEn6mF6TTTfdSI8WVb8z42drW/mu6wASZKcrYJdEo Dec 6 03:08:13 localhost certmonger[39958]: +KLS43dL2sSn3PbNsLbunSkfj5WnjCnBNeXy0klQKdejYimcBTIJblnPGX24TAY9 Dec 6 03:08:13 localhost certmonger[39958]: DuK3E6vyN2bVygBzyh/vRoOX+egB6Zj1Ow== Dec 6 03:08:13 localhost certmonger[39958]: -----END CERTIFICATE----- Dec 6 03:08:13 localhost certmonger[39958]: ". Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Certificate submission still ongoing. Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Certificate submission postprocessing complete. Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Child status = 0. 
Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Child output: Dec 6 03:08:13 localhost certmonger[37540]: "{"certificate":"-----BEGIN CERTIFICATE-----\nMIIFVTCCA72gAwIBAgIBITANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u\nVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4\nMDgxM1oXDTI3MTIwNzA4MDgxM1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV\nBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI\nhvcNAQEBBQADggEPADCCAQoCggEBALVoXSblVvZn1L11SbavfD+Wzj6cJzdrDFCc\nNabMLec3NhAYVuUX7vH/2jWOiwCWIjA7PgHUzYUZYVaJPTriXBrVSIUYJLMzyM3z\nWl0uZpj+iNgIqKNNl7Mc42hRAy6eH9cFY8eZPjOs3fnG+LmDP77m8K75n8bWYKSV\nkLXE7b+kXQ+VOg8rK1IkGz9RRBeBXQ/C0G697t6Luv6KhU8RltpteCUOEFpRdS1Y\nYHwcdWchjqNONHZzbVqbjBro1JeBYy4m1FssChVMv+RUZ+Gg2EmOanFwrspIqZzx\nuNisgpGc4C9Dr3sAwqZXohsj9daXlBB1bMS/nnRbfWff5kV9CVECAwEAAaOCAeYw\nggHiMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB\nBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw\nMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw\ncwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js\nL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD\nZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFHUnxOv38frgD9x62sHLAx7X\nm9AKMIG/BgNVHREEgbcwgbSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u\ndGVzdKA/BgorBgEEAYI3FAIDoDEML3FlbXUvbnAwMDA1NTQ4Nzk4LmludGVybmFs\nYXBpLm9vby50ZXN0QE9PTy5URVNUoE4GBisGAQUCAqBEMEKgChsIT09PLlRFU1Sh\nNDAyoAMCAQGhKzApGwRxZW11GyFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v\nLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAIR+5MEO1D8sDxHE2Yhy/egp721cPZTx\ndFMo8mgaqjwRYo5unPK7t11kkasKGR7trxPOUrEiIxYAVkvifOzM3dYSg4uJu82R\nN7+N1Oh7pyvOuyCi0iEmzyIHd7OB9m9O1zXX+w1nbuGgk4vxIb7gXBOdC9sGPyTq\nqUIFo7iALV+1RO1LowPZIW5JvvfXcUgomDUCPGjmScT3AN1wO7MGFGP6MFpm2oe2\n2GmYkrlTCVUYnnwV3llxqNKhVpsxw2UngWekkw/pPBs7dtGg1xtu8xEvRGvF2EVS\nPjHj77EOeEKmXxI7TXIkZ2PcP1gYv02VLI54bPPKEZdk5SG/QJge12i4oxOujA6p\nu4IpgeTmiz5sucfZdzVYzHqTEn6mF6TTTfdSI8WVb8z42drW/mu6wASZKcrYJdEo\n+KLS43dL2sSn3PbNsLbunSkfj5WnjCnBNeXy0klQKd
ejYimcBTIJblnPGX24TAY9\nDuK3E6vyN2bVygBzyh/vRoOX+egB6Zj1Ow==\n-----END CERTIFICATE-----\n","key_checked":true} Dec 6 03:08:13 localhost certmonger[37540]: " Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Issued certificate is "-----BEGIN CERTIFICATE----- Dec 6 03:08:13 localhost certmonger[37540]: MIIFVTCCA72gAwIBAgIBITANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u Dec 6 03:08:13 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4 Dec 6 03:08:13 localhost certmonger[37540]: MDgxM1oXDTI3MTIwNzA4MDgxM1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV Dec 6 03:08:13 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI Dec 6 03:08:13 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBALVoXSblVvZn1L11SbavfD+Wzj6cJzdrDFCc Dec 6 03:08:13 localhost certmonger[37540]: NabMLec3NhAYVuUX7vH/2jWOiwCWIjA7PgHUzYUZYVaJPTriXBrVSIUYJLMzyM3z Dec 6 03:08:13 localhost certmonger[37540]: Wl0uZpj+iNgIqKNNl7Mc42hRAy6eH9cFY8eZPjOs3fnG+LmDP77m8K75n8bWYKSV Dec 6 03:08:13 localhost certmonger[37540]: kLXE7b+kXQ+VOg8rK1IkGz9RRBeBXQ/C0G697t6Luv6KhU8RltpteCUOEFpRdS1Y Dec 6 03:08:13 localhost certmonger[37540]: YHwcdWchjqNONHZzbVqbjBro1JeBYy4m1FssChVMv+RUZ+Gg2EmOanFwrspIqZzx Dec 6 03:08:13 localhost certmonger[37540]: uNisgpGc4C9Dr3sAwqZXohsj9daXlBB1bMS/nnRbfWff5kV9CVECAwEAAaOCAeYw Dec 6 03:08:13 localhost certmonger[37540]: ggHiMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB Dec 6 03:08:13 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw Dec 6 03:08:13 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw Dec 6 03:08:13 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js Dec 6 03:08:13 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD Dec 6 03:08:13 localhost certmonger[37540]: 
ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFHUnxOv38frgD9x62sHLAx7X Dec 6 03:08:13 localhost certmonger[37540]: m9AKMIG/BgNVHREEgbcwgbSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u Dec 6 03:08:13 localhost certmonger[37540]: dGVzdKA/BgorBgEEAYI3FAIDoDEML3FlbXUvbnAwMDA1NTQ4Nzk4LmludGVybmFs Dec 6 03:08:13 localhost certmonger[37540]: YXBpLm9vby50ZXN0QE9PTy5URVNUoE4GBisGAQUCAqBEMEKgChsIT09PLlRFU1Sh Dec 6 03:08:13 localhost certmonger[37540]: NDAyoAMCAQGhKzApGwRxZW11GyFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v Dec 6 03:08:13 localhost certmonger[37540]: LnRlc3QwDQYJKoZIhvcNAQELBQADggGBAIR+5MEO1D8sDxHE2Yhy/egp721cPZTx Dec 6 03:08:13 localhost certmonger[37540]: dFMo8mgaqjwRYo5unPK7t11kkasKGR7trxPOUrEiIxYAVkvifOzM3dYSg4uJu82R Dec 6 03:08:13 localhost certmonger[37540]: N7+N1Oh7pyvOuyCi0iEmzyIHd7OB9m9O1zXX+w1nbuGgk4vxIb7gXBOdC9sGPyTq Dec 6 03:08:13 localhost certmonger[37540]: qUIFo7iALV+1RO1LowPZIW5JvvfXcUgomDUCPGjmScT3AN1wO7MGFGP6MFpm2oe2 Dec 6 03:08:13 localhost certmonger[37540]: 2GmYkrlTCVUYnnwV3llxqNKhVpsxw2UngWekkw/pPBs7dtGg1xtu8xEvRGvF2EVS Dec 6 03:08:13 localhost certmonger[37540]: PjHj77EOeEKmXxI7TXIkZ2PcP1gYv02VLI54bPPKEZdk5SG/QJge12i4oxOujA6p Dec 6 03:08:13 localhost certmonger[37540]: u4IpgeTmiz5sucfZdzVYzHqTEn6mF6TTTfdSI8WVb8z42drW/mu6wASZKcrYJdEo Dec 6 03:08:13 localhost certmonger[37540]: +KLS43dL2sSn3PbNsLbunSkfj5WnjCnBNeXy0klQKdejYimcBTIJblnPGX24TAY9 Dec 6 03:08:13 localhost certmonger[37540]: DuK3E6vyN2bVygBzyh/vRoOX+egB6Zj1Ow== Dec 6 03:08:13 localhost certmonger[37540]: -----END CERTIFICATE----- Dec 6 03:08:13 localhost certmonger[37540]: ". Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Certificate issued (0 chain certificates, 0 roots). Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] No hooks set for pre-save command. 
Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:13 localhost certmonger[39971]: Certificate in file "/etc/pki/tls/certs/qemu-server-cert.crt" issued by CA and saved. 
Dec 6 03:08:13 localhost certmonger[37540]: 2025-12-06 08:08:13 [37540] Wrote to /var/lib/certmonger/requests/20251206080812 Dec 6 03:08:14 localhost python3[39987]: ansible-certificate_request Invoked with name=qemu-client-cert dns=['np0005548798.internalapi.ooo.test'] owner=root group=qemu principal=['qemu/np0005548798.internalapi.ooo.test@OOO.TEST'] directory=/etc/pki/tls key_size=2048 wait=True run_after=# Copy cert and key to qemu dir#012cp -a /etc/pki/tls/certs/qemu-client-cert.crt /etc/pki/qemu/client-cert.pem#012cp -a /etc/pki/tls/private/qemu-client-cert.key /etc/pki/qemu/client-key.pem#012chgrp qemu /etc/pki/qemu/client-*#012chmod 0640 /etc/pki/qemu/client-cert.pem#012chmod 0640 /etc/pki/qemu/client-key.pem#012 ca=ipa __header=##012# Ansible managed#012##012 provider_config_directory=/etc/certmonger provider=certmonger key_usage=['digitalSignature', 'keyEncipherment'] extended_key_usage=['id-kp-serverAuth', 'id-kp-clientAuth'] auto_renew=True ip=None email=None common_name=None country=None state=None locality=None organization=None organizational_unit=None contact_email=None run_before=None Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 
03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814 Dec 6 03:08:14 localhost certmonger[39997]: 2025-12-06 08:08:14 [39997] Setting "CERTMONGER_REQ_SUBJECT" to "CN=np0005548798.internalapi.ooo.test" for child. Dec 6 03:08:14 localhost certmonger[39997]: 2025-12-06 08:08:14 [39997] Setting "CERTMONGER_REQ_HOSTNAME" to "np0005548798.internalapi.ooo.test Dec 6 03:08:14 localhost certmonger[39997]: " for child. Dec 6 03:08:14 localhost certmonger[39997]: 2025-12-06 08:08:14 [39997] Setting "CERTMONGER_REQ_PRINCIPAL" to "qemu/np0005548798.internalapi.ooo.test@OOO.TEST Dec 6 03:08:14 localhost certmonger[39997]: " for child. Dec 6 03:08:14 localhost certmonger[39997]: 2025-12-06 08:08:14 [39997] Setting "CERTMONGER_OPERATION" to "SUBMIT" for child. 
Dec 6 03:08:14 localhost certmonger[39997]: 2025-12-06 08:08:14 [39997] Setting "CERTMONGER_CSR" to "-----BEGIN CERTIFICATE REQUEST-----
Dec 6 03:08:14 localhost certmonger[39997]: MIID0jCCAroCAQAwLDEqMCgGA1UEAxMhbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBp
Dec 6 03:08:14 localhost certmonger[39997]: Lm9vby50ZXN0MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwLPZ9ycQ
Dec 6 03:08:14 localhost certmonger[39997]: MNP90RjO4BhlzRCtALwFHmCmRuSFCTWkFNBjC+TDEQr7SbcFo4q4HuVcHuQnC7i8
Dec 6 03:08:14 localhost certmonger[39997]: 3XJLHjo7vesWRBlc6tjdsBfyOxAQ4BloRlTYf731e9Z3Pf7Zx0mJqQZ3voEm35J7
Dec 6 03:08:14 localhost certmonger[39997]: jb9CDRd/Dl/VqzVp9jo+v4DSLAF6jrlBTaPzpAdpy1UU2uHC4u4wRSpwv/MNGJZm
Dec 6 03:08:14 localhost certmonger[39997]: 64SYmECYmqk+ju/qrcOsnxN+eCa4FzMO5xIe5Jd5y2CMdHYVPKM0Cug/57bV5W8B
Dec 6 03:08:14 localhost certmonger[39997]: K/4ZEuEy5zcq2gB/6fSb5H736UkT/au8K7BaX5snm6+WhlOoqB7DaXj62Edn8j1N
Dec 6 03:08:14 localhost certmonger[39997]: M/LkyXhA1rmStQIDAQABoIIBXzArBgkqhkiG9w0BCRQxHh4cADIAMAAyADUAMQAy
Dec 6 03:08:14 localhost certmonger[39997]: ADAANgAwADgAMAA4ADEANDCCAS4GCSqGSIb3DQEJDjGCAR8wggEbMAsGA1UdDwQE
Dec 6 03:08:14 localhost certmonger[39997]: AwIFoDCBvwYDVR0RBIG3MIG0giFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v
Dec 6 03:08:14 localhost certmonger[39997]: LnRlc3SgPwYKKwYBBAGCNxQCA6AxDC9xZW11L25wMDAwNTU0ODc5OC5pbnRlcm5h
Dec 6 03:08:14 localhost certmonger[39997]: bGFwaS5vb28udGVzdEBPT08uVEVTVKBOBgYrBgEFAgKgRDBCoAobCE9PTy5URVNU
Dec 6 03:08:14 localhost certmonger[39997]: oTQwMqADAgEBoSswKRsEcWVtdRshbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBpLm9v
Dec 6 03:08:14 localhost certmonger[39997]: by50ZXN0MB0GA1UdJQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjAMBgNVHRMBAf8E
Dec 6 03:08:14 localhost certmonger[39997]: AjAAMB0GA1UdDgQWBBSb7GhteGdD+79S82DaGPJgdg22rjANBgkqhkiG9w0BAQsF
Dec 6 03:08:14 localhost certmonger[39997]: AAOCAQEAULkfl/RyilSnDlFwRibMM2De+CwLUv+fK8H5CjVcJl+rOBPsrpGVvR9z
Dec 6 03:08:14 localhost certmonger[39997]: mUnmYZZHzZvw7mlzENvGBcaqprsOkKg/vz/Cswvxx1hAYmDur08+CaCve4tj5phV
Dec 6 03:08:14 localhost certmonger[39997]: XEJuRdCSkn2wGH90oh4FFPtJk+CvmWKNIAIbugqz2E972661RFNskM5JoPfY5mfh
Dec 6 03:08:14 localhost certmonger[39997]: fRo+5TL45J+5hOrP1bMT33g1dQMzNlRCX3mjwnGuM8hxgFSI1OjlWWl4Et8y/gHM
Dec 6 03:08:14 localhost certmonger[39997]: OeR3GN8RkhufDmx4sY0K1cjM+auUAbefDnI4b1SI2sT65QKtHGlTChJH63V4fapF
Dec 6 03:08:14 localhost certmonger[39997]: Z8KDlyrghePFV/hcKrCVXHuvVv5IXQ==
Dec 6 03:08:14 localhost certmonger[39997]: -----END CERTIFICATE REQUEST-----
Dec 6 03:08:14 localhost certmonger[39997]: " for child.
Dec 6 03:08:14 localhost certmonger[39997]: 2025-12-06 08:08:14 [39997] Setting "CERTMONGER_SPKAC" to "MIICQDCCASgwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDAs9n3JxAw0/3RGM7gGGXNEK0AvAUeYKZG5IUJNaQU0GML5MMRCvtJtwWjirge5Vwe5CcLuLzdckseOju96xZEGVzq2N2wF/I7EBDgGWhGVNh/vfV71nc9/tnHSYmpBne+gSbfknuNv0INF38OX9WrNWn2Oj6/gNIsAXqOuUFNo/OkB2nLVRTa4cLi7jBFKnC/8w0YlmbrhJiYQJiaqT6O7+qtw6yfE354JrgXMw7nEh7kl3nLYIx0dhU8ozQK6D/nttXlbwEr/hkS4TLnNyraAH/p9JvkfvfpSRP9q7wrsFpfmyebr5aGU6ioHsNpePrYR2fyPU0z8uTJeEDWuZK1AgMBAAEWADANBgkqhkiG9w0BAQsFAAOCAQEAZb4grm1/yIxK+UU8udU7hes8kUNrntQgCoBRTCpPPza4iypB/YKxLg55LOI1kdzIFww98ejwMgTnjprQJW25/6Iaidaikzeh3deBqmeS/CaVaDUIRPB45Rg+PN1OI6jSOtGZzNmJIc2x7M754P83BRbtLrkg9Dl06avzExotRhrwOKvRp42qgELOKs7EAVGQPariTyCIoOKsrkB9pjSfbRq72F7XQ3P7irDXlnRN6WYEn7t4yGDzvyhrwabFmBtTDQUqepzRmF25P//O8tSbzoMS3MiyxBWYCY5ty1oXzo25ZtgVuAPJx6Ckd2dB5jothZEw6UW5caeasKp2RXu+Nw==" for child.
Dec 6 03:08:14 localhost certmonger[39997]: 2025-12-06 08:08:14 [39997] Setting "CERTMONGER_SPKI" to "MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwLPZ9ycQMNP90RjO4BhlzRCtALwFHmCmRuSFCTWkFNBjC+TDEQr7SbcFo4q4HuVcHuQnC7i83XJLHjo7vesWRBlc6tjdsBfyOxAQ4BloRlTYf731e9Z3Pf7Zx0mJqQZ3voEm35J7jb9CDRd/Dl/VqzVp9jo+v4DSLAF6jrlBTaPzpAdpy1UU2uHC4u4wRSpwv/MNGJZm64SYmECYmqk+ju/qrcOsnxN+eCa4FzMO5xIe5Jd5y2CMdHYVPKM0Cug/57bV5W8BK/4ZEuEy5zcq2gB/6fSb5H736UkT/au8K7BaX5snm6+WhlOoqB7DaXj62Edn8j1NM/LkyXhA1rmStQIDAQAB" for child.
Dec 6 03:08:14 localhost certmonger[39997]: 2025-12-06 08:08:14 [39997] Setting "CERTMONGER_LOCAL_CA_DIR" to "/var/lib/certmonger/local" for child.
Dec 6 03:08:14 localhost certmonger[39997]: 2025-12-06 08:08:14 [39997] Setting "CERTMONGER_KEY_TYPE" to "RSA" for child.
Dec 6 03:08:14 localhost certmonger[39997]: 2025-12-06 08:08:14 [39997] Setting "CERTMONGER_CA_NICKNAME" to "IPA" for child.
Dec 6 03:08:14 localhost certmonger[39997]: 2025-12-06 08:08:14 [39997] Redirecting stdin to /dev/null, leaving stdout and stderr open for child "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:08:14 localhost certmonger[39997]: 2025-12-06 08:08:14 [39997] Running enrollment helper "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:14 localhost certmonger[39997]: Submitting request to "https://ipa.ooo.test/ipa/json".
Dec 6 03:08:14 localhost certmonger[39997]: Certificate: "MIIFVTCCA72gAwIBAgIBJDANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08uVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4MDgxNFoXDTI3MTIwNzA4MDgxNFowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNVBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAMCz2fcnEDDT/dEYzuAYZc0QrQC8BR5gpkbkhQk1pBTQYwvkwxEK+0m3BaOKuB7lXB7kJwu4vN1ySx46O73rFkQZXOrY3bAX8jsQEOAZaEZU2H+99XvWdz3+2cdJiakGd76BJt+Se42/Qg0Xfw5f1as1afY6Pr+A0iwBeo65QU2j86QHactVFNrhwuLuMEUqcL/zDRiWZuuEmJhAmJqpPo7v6q3DrJ8TfngmuBczDucSHuSXectgjHR2FTyjNAroP+e21eVvASv+GRLhMuc3KtoAf+n0m+R+9+lJE/2rvCuwWl+bJ5uvloZTqKgew2l4+thHZ/I9TTPy5Ml4QNa5krUCAwEAAaOCAeYwggHiMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEBBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3NwMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwcwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3JsL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVDZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFCL3+8m2OkBaWksqpsIoAmBMbSmbMIG/BgNVHREEgbcwgbSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdKA/BgorBgEEAYI3FAIDoDEML3FlbXUvbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoE4GBisGAQUCAqBEMEKgChsIT09PLlRFU1ShNDAyoAMCAQGhKzApGwRxZW11GyFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29vLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAIy6R/g1MqX7OFWOpibDvuw6WqFyO1ocFfcny6ytyi2vI8W8DAYCRduvfq5M5Nd+6yuT6a8NQYZ4MDYk1Z0Gw+uv41tg4T8hoMzXb8dnwc4KAWMsuP7kA+i0BDCMKOnurEwmDAhnyX0l4fIQU5q5bRC76kHyzc4leHwzLEuAKsuYiAcP4v24JS6biqvOTN1SB8OgZFf321jiXA4Mhgh1t07zcCK1wLG4+WXmZ5PXtIDb53B3PTs1DNQic2R6xFD3y//+nfzMF+9mc72B5jqhEQWD3dxrjNUNPbfqewJUfI6PSyAx46jKBSfXTLFfXhwsBgxQCWGBEfgj/sHjjfBUwDg+t3VU3QJewCOSOYzyZgUknoHHKLS6osBnL1cNRQxjdrIScSINMaKjqyyf0vgL5xaix1xVfXpYHTj/X3Qi1h+6wgMO6gP0QteqOXNYnxtjs11HgT2a/3KfS0A58yO4rIKHM6csvOoSoUIR5PQ4JaQhCUFjSTJzNjZ1pSqvpD/bTw=="
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Certificate submission still ongoing.
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Certificate submission attempt complete.
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Child status = 0.
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Child output:
Dec 6 03:08:14 localhost certmonger[37540]: "-----BEGIN CERTIFICATE-----
Dec 6 03:08:14 localhost certmonger[37540]: MIIFVTCCA72gAwIBAgIBJDANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u
Dec 6 03:08:14 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4
Dec 6 03:08:14 localhost certmonger[37540]: MDgxNFoXDTI3MTIwNzA4MDgxNFowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV
Dec 6 03:08:14 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI
Dec 6 03:08:14 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBAMCz2fcnEDDT/dEYzuAYZc0QrQC8BR5gpkbk
Dec 6 03:08:14 localhost certmonger[37540]: hQk1pBTQYwvkwxEK+0m3BaOKuB7lXB7kJwu4vN1ySx46O73rFkQZXOrY3bAX8jsQ
Dec 6 03:08:14 localhost certmonger[37540]: EOAZaEZU2H+99XvWdz3+2cdJiakGd76BJt+Se42/Qg0Xfw5f1as1afY6Pr+A0iwB
Dec 6 03:08:14 localhost certmonger[37540]: eo65QU2j86QHactVFNrhwuLuMEUqcL/zDRiWZuuEmJhAmJqpPo7v6q3DrJ8Tfngm
Dec 6 03:08:14 localhost certmonger[37540]: uBczDucSHuSXectgjHR2FTyjNAroP+e21eVvASv+GRLhMuc3KtoAf+n0m+R+9+lJ
Dec 6 03:08:14 localhost certmonger[37540]: E/2rvCuwWl+bJ5uvloZTqKgew2l4+thHZ/I9TTPy5Ml4QNa5krUCAwEAAaOCAeYw
Dec 6 03:08:14 localhost certmonger[37540]: ggHiMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB
Dec 6 03:08:14 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw
Dec 6 03:08:14 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw
Dec 6 03:08:14 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js
Dec 6 03:08:14 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD
Dec 6 03:08:14 localhost certmonger[37540]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFCL3+8m2OkBaWksqpsIoAmBM
Dec 6 03:08:14 localhost certmonger[37540]: bSmbMIG/BgNVHREEgbcwgbSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u
Dec 6 03:08:14 localhost certmonger[37540]: dGVzdKA/BgorBgEEAYI3FAIDoDEML3FlbXUvbnAwMDA1NTQ4Nzk4LmludGVybmFs
Dec 6 03:08:14 localhost certmonger[37540]: YXBpLm9vby50ZXN0QE9PTy5URVNUoE4GBisGAQUCAqBEMEKgChsIT09PLlRFU1Sh
Dec 6 03:08:14 localhost certmonger[37540]: NDAyoAMCAQGhKzApGwRxZW11GyFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v
Dec 6 03:08:14 localhost certmonger[37540]: LnRlc3QwDQYJKoZIhvcNAQELBQADggGBAIy6R/g1MqX7OFWOpibDvuw6WqFyO1oc
Dec 6 03:08:14 localhost certmonger[37540]: Ffcny6ytyi2vI8W8DAYCRduvfq5M5Nd+6yuT6a8NQYZ4MDYk1Z0Gw+uv41tg4T8h
Dec 6 03:08:14 localhost certmonger[37540]: oMzXb8dnwc4KAWMsuP7kA+i0BDCMKOnurEwmDAhnyX0l4fIQU5q5bRC76kHyzc4l
Dec 6 03:08:14 localhost certmonger[37540]: eHwzLEuAKsuYiAcP4v24JS6biqvOTN1SB8OgZFf321jiXA4Mhgh1t07zcCK1wLG4
Dec 6 03:08:14 localhost certmonger[37540]: +WXmZ5PXtIDb53B3PTs1DNQic2R6xFD3y//+nfzMF+9mc72B5jqhEQWD3dxrjNUN
Dec 6 03:08:14 localhost certmonger[37540]: PbfqewJUfI6PSyAx46jKBSfXTLFfXhwsBgxQCWGBEfgj/sHjjfBUwDg+t3VU3QJe
Dec 6 03:08:14 localhost certmonger[37540]: wCOSOYzyZgUknoHHKLS6osBnL1cNRQxjdrIScSINMaKjqyyf0vgL5xaix1xVfXpY
Dec 6 03:08:14 localhost certmonger[37540]: HTj/X3Qi1h+6wgMO6gP0QteqOXNYnxtjs11HgT2a/3KfS0A58yO4rIKHM6csvOoS
Dec 6 03:08:14 localhost certmonger[37540]: oUIR5PQ4JaQhCUFjSTJzNjZ1pSqvpD/bTw==
Dec 6 03:08:14 localhost certmonger[37540]: -----END CERTIFICATE-----
Dec 6 03:08:14 localhost certmonger[37540]: "
Dec 6 03:08:14 localhost certmonger[39999]: 2025-12-06 08:08:14 [39999] Postprocessing output "-----BEGIN CERTIFICATE-----
Dec 6 03:08:14 localhost certmonger[39999]: MIIFVTCCA72gAwIBAgIBJDANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u
Dec 6 03:08:14 localhost certmonger[39999]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4
Dec 6 03:08:14 localhost certmonger[39999]: MDgxNFoXDTI3MTIwNzA4MDgxNFowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV
Dec 6 03:08:14 localhost certmonger[39999]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI
Dec 6 03:08:14 localhost certmonger[39999]: hvcNAQEBBQADggEPADCCAQoCggEBAMCz2fcnEDDT/dEYzuAYZc0QrQC8BR5gpkbk
Dec 6 03:08:14 localhost certmonger[39999]: hQk1pBTQYwvkwxEK+0m3BaOKuB7lXB7kJwu4vN1ySx46O73rFkQZXOrY3bAX8jsQ
Dec 6 03:08:14 localhost certmonger[39999]: EOAZaEZU2H+99XvWdz3+2cdJiakGd76BJt+Se42/Qg0Xfw5f1as1afY6Pr+A0iwB
Dec 6 03:08:14 localhost certmonger[39999]: eo65QU2j86QHactVFNrhwuLuMEUqcL/zDRiWZuuEmJhAmJqpPo7v6q3DrJ8Tfngm
Dec 6 03:08:14 localhost certmonger[39999]: uBczDucSHuSXectgjHR2FTyjNAroP+e21eVvASv+GRLhMuc3KtoAf+n0m+R+9+lJ
Dec 6 03:08:14 localhost certmonger[39999]: E/2rvCuwWl+bJ5uvloZTqKgew2l4+thHZ/I9TTPy5Ml4QNa5krUCAwEAAaOCAeYw
Dec 6 03:08:14 localhost certmonger[39999]: ggHiMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB
Dec 6 03:08:14 localhost certmonger[39999]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw
Dec 6 03:08:14 localhost certmonger[39999]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw
Dec 6 03:08:14 localhost certmonger[39999]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js
Dec 6 03:08:14 localhost certmonger[39999]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD
Dec 6 03:08:14 localhost certmonger[39999]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFCL3+8m2OkBaWksqpsIoAmBM
Dec 6 03:08:14 localhost certmonger[39999]: bSmbMIG/BgNVHREEgbcwgbSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u
Dec 6 03:08:14 localhost certmonger[39999]: dGVzdKA/BgorBgEEAYI3FAIDoDEML3FlbXUvbnAwMDA1NTQ4Nzk4LmludGVybmFs
Dec 6 03:08:14 localhost certmonger[39999]: YXBpLm9vby50ZXN0QE9PTy5URVNUoE4GBisGAQUCAqBEMEKgChsIT09PLlRFU1Sh
Dec 6 03:08:14 localhost certmonger[39999]: NDAyoAMCAQGhKzApGwRxZW11GyFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v
Dec 6 03:08:14 localhost certmonger[39999]: LnRlc3QwDQYJKoZIhvcNAQELBQADggGBAIy6R/g1MqX7OFWOpibDvuw6WqFyO1oc
Dec 6 03:08:14 localhost certmonger[39999]: Ffcny6ytyi2vI8W8DAYCRduvfq5M5Nd+6yuT6a8NQYZ4MDYk1Z0Gw+uv41tg4T8h
Dec 6 03:08:14 localhost certmonger[39999]: oMzXb8dnwc4KAWMsuP7kA+i0BDCMKOnurEwmDAhnyX0l4fIQU5q5bRC76kHyzc4l
Dec 6 03:08:14 localhost certmonger[39999]: eHwzLEuAKsuYiAcP4v24JS6biqvOTN1SB8OgZFf321jiXA4Mhgh1t07zcCK1wLG4
Dec 6 03:08:14 localhost certmonger[39999]: +WXmZ5PXtIDb53B3PTs1DNQic2R6xFD3y//+nfzMF+9mc72B5jqhEQWD3dxrjNUN
Dec 6 03:08:14 localhost certmonger[39999]: PbfqewJUfI6PSyAx46jKBSfXTLFfXhwsBgxQCWGBEfgj/sHjjfBUwDg+t3VU3QJe
Dec 6 03:08:14 localhost certmonger[39999]: wCOSOYzyZgUknoHHKLS6osBnL1cNRQxjdrIScSINMaKjqyyf0vgL5xaix1xVfXpY
Dec 6 03:08:14 localhost certmonger[39999]: HTj/X3Qi1h+6wgMO6gP0QteqOXNYnxtjs11HgT2a/3KfS0A58yO4rIKHM6csvOoS
Dec 6 03:08:14 localhost certmonger[39999]: oUIR5PQ4JaQhCUFjSTJzNjZ1pSqvpD/bTw==
Dec 6 03:08:14 localhost certmonger[39999]: -----END CERTIFICATE-----
Dec 6 03:08:14 localhost certmonger[39999]: ".
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Certificate submission still ongoing.
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Certificate submission postprocessing complete.
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Child status = 0.
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Child output:
Dec 6 03:08:14 localhost certmonger[37540]: "{"certificate":"-----BEGIN CERTIFICATE-----\nMIIFVTCCA72gAwIBAgIBJDANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u\nVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4\nMDgxNFoXDTI3MTIwNzA4MDgxNFowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV\nBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI\nhvcNAQEBBQADggEPADCCAQoCggEBAMCz2fcnEDDT/dEYzuAYZc0QrQC8BR5gpkbk\nhQk1pBTQYwvkwxEK+0m3BaOKuB7lXB7kJwu4vN1ySx46O73rFkQZXOrY3bAX8jsQ\nEOAZaEZU2H+99XvWdz3+2cdJiakGd76BJt+Se42/Qg0Xfw5f1as1afY6Pr+A0iwB\neo65QU2j86QHactVFNrhwuLuMEUqcL/zDRiWZuuEmJhAmJqpPo7v6q3DrJ8Tfngm\nuBczDucSHuSXectgjHR2FTyjNAroP+e21eVvASv+GRLhMuc3KtoAf+n0m+R+9+lJ\nE/2rvCuwWl+bJ5uvloZTqKgew2l4+thHZ/I9TTPy5Ml4QNa5krUCAwEAAaOCAeYw\nggHiMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB\nBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw\nMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw\ncwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js\nL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD\nZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFCL3+8m2OkBaWksqpsIoAmBM\nbSmbMIG/BgNVHREEgbcwgbSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u\ndGVzdKA/BgorBgEEAYI3FAIDoDEML3FlbXUvbnAwMDA1NTQ4Nzk4LmludGVybmFs\nYXBpLm9vby50ZXN0QE9PTy5URVNUoE4GBisGAQUCAqBEMEKgChsIT09PLlRFU1Sh\nNDAyoAMCAQGhKzApGwRxZW11GyFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v\nLnRlc3QwDQYJKoZIhvcNAQELBQADggGBAIy6R/g1MqX7OFWOpibDvuw6WqFyO1oc\nFfcny6ytyi2vI8W8DAYCRduvfq5M5Nd+6yuT6a8NQYZ4MDYk1Z0Gw+uv41tg4T8h\noMzXb8dnwc4KAWMsuP7kA+i0BDCMKOnurEwmDAhnyX0l4fIQU5q5bRC76kHyzc4l\neHwzLEuAKsuYiAcP4v24JS6biqvOTN1SB8OgZFf321jiXA4Mhgh1t07zcCK1wLG4\n+WXmZ5PXtIDb53B3PTs1DNQic2R6xFD3y//+nfzMF+9mc72B5jqhEQWD3dxrjNUN\nPbfqewJUfI6PSyAx46jKBSfXTLFfXhwsBgxQCWGBEfgj/sHjjfBUwDg+t3VU3QJe\nwCOSOYzyZgUknoHHKLS6osBnL1cNRQxjdrIScSINMaKjqyyf0vgL5xaix1xVfXpY\nHTj/X3Qi1h+6wgMO6gP0QteqOXNYnxtjs11HgT2a/3KfS0A58yO4rIKHM6csvOoS\noUIR5PQ4JaQhCUFjSTJzNjZ1pSqvpD/bTw==\n-----END CERTIFICATE-----\n","key_checked":true}
Dec 6 03:08:14 localhost certmonger[37540]: "
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Issued certificate is "-----BEGIN CERTIFICATE-----
Dec 6 03:08:14 localhost certmonger[37540]: MIIFVTCCA72gAwIBAgIBJDANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u
Dec 6 03:08:14 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4
Dec 6 03:08:14 localhost certmonger[37540]: MDgxNFoXDTI3MTIwNzA4MDgxNFowPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV
Dec 6 03:08:14 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI
Dec 6 03:08:14 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBAMCz2fcnEDDT/dEYzuAYZc0QrQC8BR5gpkbk
Dec 6 03:08:14 localhost certmonger[37540]: hQk1pBTQYwvkwxEK+0m3BaOKuB7lXB7kJwu4vN1ySx46O73rFkQZXOrY3bAX8jsQ
Dec 6 03:08:14 localhost certmonger[37540]: EOAZaEZU2H+99XvWdz3+2cdJiakGd76BJt+Se42/Qg0Xfw5f1as1afY6Pr+A0iwB
Dec 6 03:08:14 localhost certmonger[37540]: eo65QU2j86QHactVFNrhwuLuMEUqcL/zDRiWZuuEmJhAmJqpPo7v6q3DrJ8Tfngm
Dec 6 03:08:14 localhost certmonger[37540]: uBczDucSHuSXectgjHR2FTyjNAroP+e21eVvASv+GRLhMuc3KtoAf+n0m+R+9+lJ
Dec 6 03:08:14 localhost certmonger[37540]: E/2rvCuwWl+bJ5uvloZTqKgew2l4+thHZ/I9TTPy5Ml4QNa5krUCAwEAAaOCAeYw
Dec 6 03:08:14 localhost certmonger[37540]: ggHiMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB
Dec 6 03:08:14 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw
Dec 6 03:08:14 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw
Dec 6 03:08:14 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js
Dec 6 03:08:14 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD
Dec 6 03:08:14 localhost certmonger[37540]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFCL3+8m2OkBaWksqpsIoAmBM
Dec 6 03:08:14 localhost certmonger[37540]: bSmbMIG/BgNVHREEgbcwgbSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u
Dec 6 03:08:14 localhost certmonger[37540]: dGVzdKA/BgorBgEEAYI3FAIDoDEML3FlbXUvbnAwMDA1NTQ4Nzk4LmludGVybmFs
Dec 6 03:08:14 localhost certmonger[37540]: YXBpLm9vby50ZXN0QE9PTy5URVNUoE4GBisGAQUCAqBEMEKgChsIT09PLlRFU1Sh
Dec 6 03:08:14 localhost certmonger[37540]: NDAyoAMCAQGhKzApGwRxZW11GyFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v
Dec 6 03:08:14 localhost certmonger[37540]: LnRlc3QwDQYJKoZIhvcNAQELBQADggGBAIy6R/g1MqX7OFWOpibDvuw6WqFyO1oc
Dec 6 03:08:14 localhost certmonger[37540]: Ffcny6ytyi2vI8W8DAYCRduvfq5M5Nd+6yuT6a8NQYZ4MDYk1Z0Gw+uv41tg4T8h
Dec 6 03:08:14 localhost certmonger[37540]: oMzXb8dnwc4KAWMsuP7kA+i0BDCMKOnurEwmDAhnyX0l4fIQU5q5bRC76kHyzc4l
Dec 6 03:08:14 localhost certmonger[37540]: eHwzLEuAKsuYiAcP4v24JS6biqvOTN1SB8OgZFf321jiXA4Mhgh1t07zcCK1wLG4
Dec 6 03:08:14 localhost certmonger[37540]: +WXmZ5PXtIDb53B3PTs1DNQic2R6xFD3y//+nfzMF+9mc72B5jqhEQWD3dxrjNUN
Dec 6 03:08:14 localhost certmonger[37540]: PbfqewJUfI6PSyAx46jKBSfXTLFfXhwsBgxQCWGBEfgj/sHjjfBUwDg+t3VU3QJe
Dec 6 03:08:14 localhost certmonger[37540]: wCOSOYzyZgUknoHHKLS6osBnL1cNRQxjdrIScSINMaKjqyyf0vgL5xaix1xVfXpY
Dec 6 03:08:14 localhost certmonger[37540]: HTj/X3Qi1h+6wgMO6gP0QteqOXNYnxtjs11HgT2a/3KfS0A58yO4rIKHM6csvOoS
Dec 6 03:08:14 localhost certmonger[37540]: oUIR5PQ4JaQhCUFjSTJzNjZ1pSqvpD/bTw==
Dec 6 03:08:14 localhost certmonger[37540]: -----END CERTIFICATE-----
Dec 6 03:08:14 localhost certmonger[37540]: ".
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Certificate issued (0 chain certificates, 0 roots).
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] No hooks set for pre-save command.
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:14 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:15 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:15 localhost certmonger[37540]: 2025-12-06 08:08:14 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:15 localhost certmonger[40009]: Certificate in file "/etc/pki/tls/certs/qemu-client-cert.crt" issued by CA and saved.
Dec 6 03:08:15 localhost certmonger[37540]: 2025-12-06 08:08:15 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 03:08:15 localhost python3[40025]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 6 03:08:15 localhost python3[40025]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Dec 6 03:08:15 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:08:15 localhost python3[40025]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Dec 6 03:08:19 localhost kernel: VFS: idmapped mount is not enabled.
Dec 6 03:08:23 localhost podman[40037]: 2025-12-06 08:08:15.82598891 +0000 UTC m=+0.044048719 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 6 03:08:23 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:08:23 localhost python3[40025]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect bac901955dcf7a32a493c6ef724c092009bbc18467858aa8c55e916b8c2b2b8f --format json
Dec 6 03:08:23 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:08:23 localhost python3[40139]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 6 03:08:23 localhost python3[40139]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Dec 6 03:08:23 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:08:23 localhost python3[40139]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Dec 6 03:08:30 localhost podman[40151]: 2025-12-06 08:08:23.618447688 +0000 UTC m=+0.044632666 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 6 03:08:30 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:08:31 localhost python3[40139]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 44feaf8d87c1d40487578230316b622680576d805efdb45dfeea6aad464b41f1 --format json
Dec 6 03:08:31 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:08:31 localhost python3[40253]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 6 03:08:31 localhost python3[40253]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Dec 6 03:08:31 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:08:31 localhost python3[40253]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Dec 6 03:08:50 localhost podman[40266]: 2025-12-06 08:08:31.546003073 +0000 UTC m=+0.043511575 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 6 03:08:50 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:08:50 localhost python3[40253]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 3a088c12511c977065fdc5f1594cba7b1a79f163578a6ffd0ac4a475b8e67938 --format json
Dec 6 03:08:50 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:08:50 localhost python3[41285]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 6 03:08:50 localhost python3[41285]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Dec 6 03:08:50 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:08:50 localhost python3[41285]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Dec 6 03:09:02 localhost podman[41298]: 2025-12-06 08:08:51.005999545 +0000 UTC m=+0.038490776 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 6 03:09:02 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:09:02 localhost python3[41285]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 514d439186251360cf734cbc6d4a44c834664891872edf3798a653dfaacf10c0 --format json
Dec 6 03:09:02 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:09:02 localhost python3[41377]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 6 03:09:02 localhost python3[41377]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Dec 6 03:09:02 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:09:02 localhost python3[41377]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Dec 6 03:09:06 localhost podman[41389]: 2025-12-06 08:09:02.616156408 +0000 UTC m=+0.047240192 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 6 03:09:06 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:09:06 localhost python3[41377]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a9dd7a2ac6f35cb086249f87f74e2f8e74e7e2ad5141ce2228263be6faedce26 --format json
Dec 6 03:09:06 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:09:06 localhost python3[41479]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 6 03:09:06 localhost python3[41479]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Dec 6 03:09:06 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:09:06 localhost python3[41479]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Dec 6 03:09:12 localhost podman[41491]: 2025-12-06 08:09:06.646231915 +0000 UTC m=+0.037083678 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 6 03:09:12 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:09:12 localhost python3[41479]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 24976907b2c2553304119aba5731a800204d664feed24ca9eb7f2b4c7d81016b --format json
Dec 6 03:09:12 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:09:12 localhost python3[41568]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 6 03:09:12 localhost python3[41568]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Dec 6 03:09:12 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:09:12 localhost python3[41568]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Dec 6 03:09:15 localhost podman[41580]: 2025-12-06 08:09:12.934570111 +0000 UTC m=+0.046983373 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 6 03:09:15 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:09:15 localhost python3[41568]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 57163a7b21fdbb804a27897cb6e6052a5e5c7a339c45d663e80b52375a760dcf --format json
Dec 6 03:09:15 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 6 03:09:16 localhost python3[41656]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 6 03:09:16 localhost python3[41656]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json Dec 6 03:09:16 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:09:16 localhost python3[41656]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false Dec 6 03:09:21 localhost podman[41703]: 2025-12-06 08:09:16.214326948 +0000 UTC m=+0.035901330 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Dec 6 03:09:21 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:09:21 localhost python3[41656]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 076d82a27d63c8328729ed27ceb4291585ae18d017befe6fe353df7aa11715ae --format json Dec 6 03:09:22 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
Dec 6 03:09:22 localhost python3[41779]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 6 03:09:22 localhost python3[41779]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json Dec 6 03:09:22 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:09:22 localhost python3[41779]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false Dec 6 03:09:24 localhost podman[41792]: 2025-12-06 08:09:22.475356739 +0000 UTC m=+0.047341775 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Dec 6 03:09:24 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:09:24 localhost python3[41779]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d0dbcb95546840a8d088df044347a7877ad5ea45a2ddba0578e9bb5de4ab0da5 --format json Dec 6 03:09:24 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
Dec 6 03:09:25 localhost python3[41869]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 6 03:09:25 localhost python3[41869]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json Dec 6 03:09:25 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:09:25 localhost python3[41869]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false Dec 6 03:09:29 localhost podman[41881]: 2025-12-06 08:09:25.404642877 +0000 UTC m=+0.044738480 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Dec 6 03:09:29 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:09:29 localhost python3[41869]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect e6e981540e553415b2d6eda490d7683db07164af2e7a0af8245623900338a4d6 --format json Dec 6 03:09:29 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
Dec 6 03:09:29 localhost python3[41970]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 6 03:09:29 localhost python3[41970]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json Dec 6 03:09:29 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:09:29 localhost python3[41970]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false Dec 6 03:09:33 localhost podman[41983]: 2025-12-06 08:09:29.600921271 +0000 UTC m=+0.039414685 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Dec 6 03:09:33 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:09:33 localhost python3[41970]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 87ee88cbf01fb42e0b22747072843bcca6130a90eda4de6e74b3ccd847bb4040 --format json Dec 6 03:09:33 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
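The podman_image invocations above repeat one pattern per image (image ls, pull with --tls-verify=false, inspect), with the pulled reference recorded in a podman event line. A minimal post-hoc sketch (a hypothetical helper for analyzing such syslog lines, not part of the deployment itself) that extracts the pulled image references:

```python
import re

# A podman event entry ends with "image pull <reference>".
PULL_RE = re.compile(r"image pull (\S+)$")

def pulled_images(lines):
    """Return image references recorded by podman 'image pull' event entries."""
    out = []
    for line in lines:
        m = PULL_RE.search(line.strip())
        # Only count entries emitted by the podman process itself.
        if m and " podman[" in line:
            out.append(m.group(1))
    return out

sample = [
    "Dec 6 03:09:12 localhost podman[41491]: 2025-12-06 08:09:06.646231915 +0000 UTC m=+0.037083678 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1",
    "Dec 6 03:09:12 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.",
]
print(pulled_images(sample))
# ['registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1']
```

The `" podman["` guard keeps the PODMAN-IMAGE-DEBUG command echoes from the Ansible module out of the result.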
Dec 6 03:09:34 localhost python3[42059]: ansible-setup Invoked with gather_subset=['min'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 03:09:35 localhost python3[42079]: ansible-ansible.legacy.dnf Invoked with name=['python3-pyasn1', 'python3-cryptography', 'python3-dbus'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 03:09:38 localhost python3[42096]: ansible-ansible.legacy.dnf Invoked with name=['certmonger'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 6 03:09:41 localhost python3[42113]: ansible-file Invoked with name=/etc/certmonger//pre-scripts owner=root group=root mode=0700 state=directory path=/etc/certmonger//pre-scripts recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:09:41 localhost python3[42130]: ansible-file Invoked with name=/etc/certmonger//post-scripts owner=root group=root mode=0700 state=directory path=/etc/certmonger//post-scripts recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:09:42 localhost python3[42146]: ansible-ansible.legacy.systemd Invoked with name=certmonger state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 03:09:42 localhost python3[42164]: ansible-certificate_request Invoked with name=ovn_controller dns=['np0005548798.internalapi.ooo.test'] principal=['ovn_controller/np0005548798.internalapi.ooo.test@OOO.TEST'] directory=/etc/pki/tls key_size=2048 wait=True ca=ipa __header=##012# Ansible managed#012##012 provider_config_directory=/etc/certmonger provider=certmonger key_usage=['digitalSignature', 'keyEncipherment'] extended_key_usage=['id-kp-serverAuth', 'id-kp-clientAuth'] auto_renew=True ip=None email=None common_name=None country=None state=None locality=None organization=None organizational_unit=None contact_email=None owner=None group=None run_before=None run_after=None
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[42174]: 2025-12-06 08:09:43 [42174] Setting "CERTMONGER_REQ_SUBJECT" to "CN=np0005548798.internalapi.ooo.test" for child.
Dec 6 03:09:43 localhost certmonger[42174]: 2025-12-06 08:09:43 [42174] Setting "CERTMONGER_REQ_HOSTNAME" to "np0005548798.internalapi.ooo.test
Dec 6 03:09:43 localhost certmonger[42174]: " for child.
Dec 6 03:09:43 localhost certmonger[42174]: 2025-12-06 08:09:43 [42174] Setting "CERTMONGER_REQ_PRINCIPAL" to "ovn_controller/np0005548798.internalapi.ooo.test@OOO.TEST
Dec 6 03:09:43 localhost certmonger[42174]: " for child.
Dec 6 03:09:43 localhost certmonger[42174]: 2025-12-06 08:09:43 [42174] Setting "CERTMONGER_OPERATION" to "SUBMIT" for child.
Dec 6 03:09:43 localhost certmonger[42174]: 2025-12-06 08:09:43 [42174] Setting "CERTMONGER_CSR" to "-----BEGIN CERTIFICATE REQUEST-----
Dec 6 03:09:43 localhost certmonger[42174]: MIID5jCCAs4CAQAwLDEqMCgGA1UEAxMhbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBp
Dec 6 03:09:43 localhost certmonger[42174]: Lm9vby50ZXN0MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA1vei0EKj
Dec 6 03:09:43 localhost certmonger[42174]: js+ck+52P8iks4GZxHbznxxxd/H7zRVki+Ek1zSebmNUya0LZ+jFX7/upGS61x4H
Dec 6 03:09:43 localhost certmonger[42174]: tvMf/zWcG0+ZPaSE6n5MMjQiveoKAJ8sQh9eDdMdM9jZGmeoFHqftu+QWAaIa82I
Dec 6 03:09:43 localhost certmonger[42174]: FCL3JBBqsmsUxFagW+dHIGSVY5U54NBvlqRq2k2HbX5ViSIaNQcncuO+G90KV9yX
Dec 6 03:09:43 localhost certmonger[42174]: sGFDHauDDXfWKDmbsaGb++3gh1jAqGYQguqXL1Wy6cuj0OvQueF4FwljBaHEKOyz
Dec 6 03:09:43 localhost certmonger[42174]: SpXw4N/fajx0fHmhQsjHX3wJ+YHa/yvdpIDpcrJ4qXPaVktZiwdYgi4wFJp/gqWO
Dec 6 03:09:43 localhost certmonger[42174]: Xu8bmDExlLnicwIDAQABoIIBczArBgkqhkiG9w0BCRQxHh4cADIAMAAyADUAMQAy
Dec 6 03:09:43 localhost certmonger[42174]: ADAANgAwADgAMAA5ADQAMzCCAUIGCSqGSIb3DQEJDjGCATMwggEvMAsGA1UdDwQE
Dec 6 03:09:43 localhost certmonger[42174]: AwIFoDCB0wYDVR0RBIHLMIHIgiFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v
Dec 6 03:09:43 localhost certmonger[42174]: LnRlc3SgSQYKKwYBBAGCNxQCA6A7DDlvdm5fY29udHJvbGxlci9ucDAwMDU1NDg3
Dec 6 03:09:43 localhost certmonger[42174]: OTguaW50ZXJuYWxhcGkub29vLnRlc3RAT09PLlRFU1SgWAYGKwYBBQICoE4wTKAK
Dec 6 03:09:43 localhost certmonger[42174]: GwhPT08uVEVTVKE+MDygAwIBAaE1MDMbDm92bl9jb250cm9sbGVyGyFucDAwMDU1
Dec 6 03:09:43 localhost certmonger[42174]: NDg3OTguaW50ZXJuYWxhcGkub29vLnRlc3QwHQYDVR0lBBYwFAYIKwYBBQUHAwEG
Dec 6 03:09:43 localhost certmonger[42174]: CCsGAQUFBwMCMAwGA1UdEwEB/wQCMAAwHQYDVR0OBBYEFM+eyYvkOKRtsrK16KSj
Dec 6 03:09:43 localhost certmonger[42174]: aOs0kYh5MA0GCSqGSIb3DQEBCwUAA4IBAQCVlvGF67yeuHk4hUATaE2ZKxHxhWf+
Dec 6 03:09:43 localhost certmonger[42174]: mbvCFaHS3lzwOpQhJqlZWdB8nv6OaAuqVPxyaXIToAJJaMcLgX5pPbDHhpF9I7UC
Dec 6 03:09:43 localhost certmonger[42174]: KcA4eZ4oM2f3nu2W54gEb5CSc7xZ8krVkY/6L/VPOP3eSfE6wX3xcwPWFNu2/OrY
Dec 6 03:09:43 localhost certmonger[42174]: HPnF7JAxtzWs7LL7vOWXlp25vepRY6zhQx+DgJsT6Sc/gto5o+eBV3B1G9xZ2RGX
Dec 6 03:09:43 localhost certmonger[42174]: dsZpZq97Fa1VqfdUM6/iorateR+ASyDpnn6I+5+x19leb1EPChJv2gqrZUTNJ6Vd
Dec 6 03:09:43 localhost certmonger[42174]: QdSvthPyF2EwzVWbLRQDIT2UvEpQff2jRYx+6c8QW/NWm2Tm9CfYLhb2
Dec 6 03:09:43 localhost certmonger[42174]: -----END CERTIFICATE REQUEST-----
Dec 6 03:09:43 localhost certmonger[42174]: " for child.
Dec 6 03:09:43 localhost certmonger[42174]: 2025-12-06 08:09:43 [42174] Setting "CERTMONGER_SPKAC" to "MIICQDCCASgwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDW96LQQqOOz5yT7nY/yKSzgZnEdvOfHHF38fvNFWSL4STXNJ5uY1TJrQtn6MVfv+6kZLrXHge28x//NZwbT5k9pITqfkwyNCK96goAnyxCH14N0x0z2NkaZ6gUep+275BYBohrzYgUIvckEGqyaxTEVqBb50cgZJVjlTng0G+WpGraTYdtflWJIho1Bydy474b3QpX3JewYUMdq4MNd9YoOZuxoZv77eCHWMCoZhCC6pcvVbLpy6PQ69C54XgXCWMFocQo7LNKlfDg399qPHR8eaFCyMdffAn5gdr/K92kgOlysnipc9pWS1mLB1iCLjAUmn+CpY5e7xuYMTGUueJzAgMBAAEWADANBgkqhkiG9w0BAQsFAAOCAQEApSPzT6k9I2VAXirr0H1Q4e6ipg7pmNetlpM2ItrkVZVqMxuoZncig/9Pr7W28LHhN9Qbgb5AH6K6Pde8pgTTFbZua8ifF3CuWTGIiASC+JRyUYAiK7+IW8i/99m9F2zL7jS8mvyHeFcNleFGjMJTMMgo/LPv78+aBjZHB33EG1ZSXDoeohSMBRixVH7CVDwcW1jBX6bgmHHk4M7Bvmdt0ICr3D6fk09GY7EA4wQPhntjuiJ6hybElHKcYULlhPAzUcG+GCD16+fLbMvO/osZbuUyG3blhA4qAEJeZMTz0y0IqcPrCqxS1OUjGyKChEKNvVZhoiYEjJ3zezAcrTE3Cw==" for child.
Dec 6 03:09:43 localhost certmonger[42174]: 2025-12-06 08:09:43 [42174] Setting "CERTMONGER_SPKI" to "MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA1vei0EKjjs+ck+52P8iks4GZxHbznxxxd/H7zRVki+Ek1zSebmNUya0LZ+jFX7/upGS61x4HtvMf/zWcG0+ZPaSE6n5MMjQiveoKAJ8sQh9eDdMdM9jZGmeoFHqftu+QWAaIa82IFCL3JBBqsmsUxFagW+dHIGSVY5U54NBvlqRq2k2HbX5ViSIaNQcncuO+G90KV9yXsGFDHauDDXfWKDmbsaGb++3gh1jAqGYQguqXL1Wy6cuj0OvQueF4FwljBaHEKOyzSpXw4N/fajx0fHmhQsjHX3wJ+YHa/yvdpIDpcrJ4qXPaVktZiwdYgi4wFJp/gqWOXu8bmDExlLnicwIDAQAB" for child.
Dec 6 03:09:43 localhost certmonger[42174]: 2025-12-06 08:09:43 [42174] Setting "CERTMONGER_LOCAL_CA_DIR" to "/var/lib/certmonger/local" for child.
Dec 6 03:09:43 localhost certmonger[42174]: 2025-12-06 08:09:43 [42174] Setting "CERTMONGER_KEY_TYPE" to "RSA" for child.
Dec 6 03:09:43 localhost certmonger[42174]: 2025-12-06 08:09:43 [42174] Setting "CERTMONGER_CA_NICKNAME" to "IPA" for child.
Dec 6 03:09:43 localhost certmonger[42174]: 2025-12-06 08:09:43 [42174] Redirecting stdin to /dev/null, leaving stdout and stderr open for child "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:09:43 localhost certmonger[42174]: 2025-12-06 08:09:43 [42174] Running enrollment helper "/usr/libexec/certmonger/ipa-submit".
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[42174]: Submitting request to "https://ipa.ooo.test/ipa/json".
Dec 6 03:09:43 localhost certmonger[42174]: Certificate: "MIIFaTCCA9GgAwIBAgIBQzANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08uVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4MDk0M1oXDTI3MTIwNzA4MDk0M1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNVBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBANb3otBCo47PnJPudj/IpLOBmcR2858ccXfx+80VZIvhJNc0nm5jVMmtC2foxV+/7qRkutceB7bzH/81nBtPmT2khOp+TDI0Ir3qCgCfLEIfXg3THTPY2RpnqBR6n7bvkFgGiGvNiBQi9yQQarJrFMRWoFvnRyBklWOVOeDQb5akatpNh21+VYkiGjUHJ3LjvhvdClfcl7BhQx2rgw131ig5m7Ghm/vt4IdYwKhmEILqly9VsunLo9Dr0LnheBcJYwWhxCjss0qV8ODf32o8dHx5oULIx198CfmB2v8r3aSA6XKyeKlz2lZLWYsHWIIuMBSaf4Kljl7vG5gxMZS54nMCAwEAAaOCAfowggH2MB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEBBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3NwMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwcwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3JsL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVDZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFIr4YgUbOPRrcrqQ94dhJSCSLJhqMIHTBgNVHREEgcswgciCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdKBJBgorBgEEAYI3FAIDoDsMOW92bl9jb250cm9sbGVyL25wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdEBPT08uVEVTVKBYBgYrBgEFAgKgTjBMoAobCE9PTy5URVNUoT4wPKADAgEBoTUwMxsOb3ZuX2NvbnRyb2xsZXIbIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDANBgkqhkiG9w0BAQsFAAOCAYEAFA2msBjmXFm0DpMiSvB99P7qXCIWfVFfkefwcGLT9qujO6sKfo+7XHpjC8C/uqUsFJkaUAXO06Ld3yh2o7RD2RjR3OMhKdv+qusy3Q8YQTldSMYy+sReQzu/a41y4qnXHQRV0Sg9KNX1bKJanZJTeso07PDJQjUR+qJFLSiek0uS3U1nG8qg6zu9o1MW4C1aM2/BFf4Li0KGfr9dgWhQ5YuRXfxN0lAU/oj2idK4an1c8XKar/P018b+qcX2Zs2r/gi8NP3suTjPNVKtfSejEX7sdzk06yDVHjiHEc1iyDbc1HufyYTQwCLwpk2bsgj/77Q7BMMmW3CXdo2ZUp8g5gxbRhmo8U4SgbVzF1MOc+sC5dJQq3lEnxr01vL5aJwGdyODuVLaEPz52JyTsXwI/zg2SVwP5xdLvaMSCbAAvJeZJnnSS0NeRcbjxTy2aI6olDIkhG0a4XMtkiROd4Maz65udfLRA8h1yWCNrU3HuTRNNGBaROAl0yRrfxUJmy3F"
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Certificate submission still ongoing.
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Certificate submission attempt complete.
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Child status = 0.
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Child output:
Dec 6 03:09:43 localhost certmonger[37540]: "-----BEGIN CERTIFICATE-----
Dec 6 03:09:43 localhost certmonger[37540]: MIIFaTCCA9GgAwIBAgIBQzANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u
Dec 6 03:09:43 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4
Dec 6 03:09:43 localhost certmonger[37540]: MDk0M1oXDTI3MTIwNzA4MDk0M1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV
Dec 6 03:09:43 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI
Dec 6 03:09:43 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBANb3otBCo47PnJPudj/IpLOBmcR2858ccXfx
Dec 6 03:09:43 localhost certmonger[37540]: +80VZIvhJNc0nm5jVMmtC2foxV+/7qRkutceB7bzH/81nBtPmT2khOp+TDI0Ir3q
Dec 6 03:09:43 localhost certmonger[37540]: CgCfLEIfXg3THTPY2RpnqBR6n7bvkFgGiGvNiBQi9yQQarJrFMRWoFvnRyBklWOV
Dec 6 03:09:43 localhost certmonger[37540]: OeDQb5akatpNh21+VYkiGjUHJ3LjvhvdClfcl7BhQx2rgw131ig5m7Ghm/vt4IdY
Dec 6 03:09:43 localhost certmonger[37540]: wKhmEILqly9VsunLo9Dr0LnheBcJYwWhxCjss0qV8ODf32o8dHx5oULIx198CfmB
Dec 6 03:09:43 localhost certmonger[37540]: 2v8r3aSA6XKyeKlz2lZLWYsHWIIuMBSaf4Kljl7vG5gxMZS54nMCAwEAAaOCAfow
Dec 6 03:09:43 localhost certmonger[37540]: ggH2MB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB
Dec 6 03:09:43 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw
Dec 6 03:09:43 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw
Dec 6 03:09:43 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js
Dec 6 03:09:43 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD
Dec 6 03:09:43 localhost certmonger[37540]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFIr4YgUbOPRrcrqQ94dhJSCS
Dec 6 03:09:43 localhost certmonger[37540]: LJhqMIHTBgNVHREEgcswgciCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u
Dec 6 03:09:43 localhost certmonger[37540]: dGVzdKBJBgorBgEEAYI3FAIDoDsMOW92bl9jb250cm9sbGVyL25wMDAwNTU0ODc5
Dec 6 03:09:43 localhost certmonger[37540]: OC5pbnRlcm5hbGFwaS5vb28udGVzdEBPT08uVEVTVKBYBgYrBgEFAgKgTjBMoAob
Dec 6 03:09:43 localhost certmonger[37540]: CE9PTy5URVNUoT4wPKADAgEBoTUwMxsOb3ZuX2NvbnRyb2xsZXIbIW5wMDAwNTU0
Dec 6 03:09:43 localhost certmonger[37540]: ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDANBgkqhkiG9w0BAQsFAAOCAYEAFA2m
Dec 6 03:09:43 localhost certmonger[37540]: sBjmXFm0DpMiSvB99P7qXCIWfVFfkefwcGLT9qujO6sKfo+7XHpjC8C/uqUsFJka
Dec 6 03:09:43 localhost certmonger[37540]: UAXO06Ld3yh2o7RD2RjR3OMhKdv+qusy3Q8YQTldSMYy+sReQzu/a41y4qnXHQRV
Dec 6 03:09:43 localhost certmonger[37540]: 0Sg9KNX1bKJanZJTeso07PDJQjUR+qJFLSiek0uS3U1nG8qg6zu9o1MW4C1aM2/B
Dec 6 03:09:43 localhost certmonger[37540]: Ff4Li0KGfr9dgWhQ5YuRXfxN0lAU/oj2idK4an1c8XKar/P018b+qcX2Zs2r/gi8
Dec 6 03:09:43 localhost certmonger[37540]: NP3suTjPNVKtfSejEX7sdzk06yDVHjiHEc1iyDbc1HufyYTQwCLwpk2bsgj/77Q7
Dec 6 03:09:43 localhost certmonger[37540]: BMMmW3CXdo2ZUp8g5gxbRhmo8U4SgbVzF1MOc+sC5dJQq3lEnxr01vL5aJwGdyOD
Dec 6 03:09:43 localhost certmonger[37540]: uVLaEPz52JyTsXwI/zg2SVwP5xdLvaMSCbAAvJeZJnnSS0NeRcbjxTy2aI6olDIk
Dec 6 03:09:43 localhost certmonger[37540]: hG0a4XMtkiROd4Maz65udfLRA8h1yWCNrU3HuTRNNGBaROAl0yRrfxUJmy3F
Dec 6 03:09:43 localhost certmonger[37540]: -----END CERTIFICATE-----
Dec 6 03:09:43 localhost certmonger[37540]: "
Dec 6 03:09:43 localhost certmonger[42176]: 2025-12-06 08:09:43 [42176] Postprocessing output "-----BEGIN CERTIFICATE-----
Dec 6 03:09:43 localhost certmonger[42176]: MIIFaTCCA9GgAwIBAgIBQzANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u
Dec 6 03:09:43 localhost certmonger[42176]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4
Dec 6 03:09:43 localhost certmonger[42176]: MDk0M1oXDTI3MTIwNzA4MDk0M1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV
Dec 6 03:09:43 localhost certmonger[42176]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI
Dec 6 03:09:43 localhost certmonger[42176]: hvcNAQEBBQADggEPADCCAQoCggEBANb3otBCo47PnJPudj/IpLOBmcR2858ccXfx
Dec 6 03:09:43 localhost certmonger[42176]: +80VZIvhJNc0nm5jVMmtC2foxV+/7qRkutceB7bzH/81nBtPmT2khOp+TDI0Ir3q
Dec 6 03:09:43 localhost certmonger[42176]: CgCfLEIfXg3THTPY2RpnqBR6n7bvkFgGiGvNiBQi9yQQarJrFMRWoFvnRyBklWOV
Dec 6 03:09:43 localhost certmonger[42176]: OeDQb5akatpNh21+VYkiGjUHJ3LjvhvdClfcl7BhQx2rgw131ig5m7Ghm/vt4IdY
Dec 6 03:09:43 localhost certmonger[42176]: wKhmEILqly9VsunLo9Dr0LnheBcJYwWhxCjss0qV8ODf32o8dHx5oULIx198CfmB
Dec 6 03:09:43 localhost certmonger[42176]: 2v8r3aSA6XKyeKlz2lZLWYsHWIIuMBSaf4Kljl7vG5gxMZS54nMCAwEAAaOCAfow
Dec 6 03:09:43 localhost certmonger[42176]: ggH2MB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB
Dec 6 03:09:43 localhost certmonger[42176]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw
Dec 6 03:09:43 localhost certmonger[42176]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw
Dec 6 03:09:43 localhost certmonger[42176]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js
Dec 6 03:09:43 localhost certmonger[42176]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD
Dec 6 03:09:43 localhost certmonger[42176]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFIr4YgUbOPRrcrqQ94dhJSCS
Dec 6 03:09:43 localhost certmonger[42176]: LJhqMIHTBgNVHREEgcswgciCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u
Dec 6 03:09:43 localhost certmonger[42176]: dGVzdKBJBgorBgEEAYI3FAIDoDsMOW92bl9jb250cm9sbGVyL25wMDAwNTU0ODc5
Dec 6 03:09:43 localhost certmonger[42176]: OC5pbnRlcm5hbGFwaS5vb28udGVzdEBPT08uVEVTVKBYBgYrBgEFAgKgTjBMoAob
Dec 6 03:09:43 localhost certmonger[42176]: CE9PTy5URVNUoT4wPKADAgEBoTUwMxsOb3ZuX2NvbnRyb2xsZXIbIW5wMDAwNTU0
Dec 6 03:09:43 localhost certmonger[42176]: ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDANBgkqhkiG9w0BAQsFAAOCAYEAFA2m
Dec 6 03:09:43 localhost certmonger[42176]: sBjmXFm0DpMiSvB99P7qXCIWfVFfkefwcGLT9qujO6sKfo+7XHpjC8C/uqUsFJka
Dec 6 03:09:43 localhost certmonger[42176]: UAXO06Ld3yh2o7RD2RjR3OMhKdv+qusy3Q8YQTldSMYy+sReQzu/a41y4qnXHQRV
Dec 6 03:09:43 localhost certmonger[42176]: 0Sg9KNX1bKJanZJTeso07PDJQjUR+qJFLSiek0uS3U1nG8qg6zu9o1MW4C1aM2/B
Dec 6 03:09:43 localhost certmonger[42176]: Ff4Li0KGfr9dgWhQ5YuRXfxN0lAU/oj2idK4an1c8XKar/P018b+qcX2Zs2r/gi8
Dec 6 03:09:43 localhost certmonger[42176]: NP3suTjPNVKtfSejEX7sdzk06yDVHjiHEc1iyDbc1HufyYTQwCLwpk2bsgj/77Q7
Dec 6 03:09:43 localhost certmonger[42176]: BMMmW3CXdo2ZUp8g5gxbRhmo8U4SgbVzF1MOc+sC5dJQq3lEnxr01vL5aJwGdyOD
Dec 6 03:09:43 localhost certmonger[42176]: uVLaEPz52JyTsXwI/zg2SVwP5xdLvaMSCbAAvJeZJnnSS0NeRcbjxTy2aI6olDIk
Dec 6 03:09:43 localhost certmonger[42176]: hG0a4XMtkiROd4Maz65udfLRA8h1yWCNrU3HuTRNNGBaROAl0yRrfxUJmy3F
Dec 6 03:09:43 localhost certmonger[42176]: -----END CERTIFICATE-----
Dec 6 03:09:43 localhost certmonger[42176]: ".
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Certificate submission still ongoing.
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Certificate submission postprocessing complete.
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Child status = 0.
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Child output:
Dec 6 03:09:43 localhost certmonger[37540]: "{"certificate":"-----BEGIN CERTIFICATE-----\nMIIFaTCCA9GgAwIBAgIBQzANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u\nVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4\nMDk0M1oXDTI3MTIwNzA4MDk0M1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV\nBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI\nhvcNAQEBBQADggEPADCCAQoCggEBANb3otBCo47PnJPudj/IpLOBmcR2858ccXfx\n+80VZIvhJNc0nm5jVMmtC2foxV+/7qRkutceB7bzH/81nBtPmT2khOp+TDI0Ir3q\nCgCfLEIfXg3THTPY2RpnqBR6n7bvkFgGiGvNiBQi9yQQarJrFMRWoFvnRyBklWOV\nOeDQb5akatpNh21+VYkiGjUHJ3LjvhvdClfcl7BhQx2rgw131ig5m7Ghm/vt4IdY\nwKhmEILqly9VsunLo9Dr0LnheBcJYwWhxCjss0qV8ODf32o8dHx5oULIx198CfmB\n2v8r3aSA6XKyeKlz2lZLWYsHWIIuMBSaf4Kljl7vG5gxMZS54nMCAwEAAaOCAfow\nggH2MB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB\nBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw\nMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw\ncwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js\nL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD\nZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFIr4YgUbOPRrcrqQ94dhJSCS\nLJhqMIHTBgNVHREEgcswgciCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u\ndGVzdKBJBgorBgEEAYI3FAIDoDsMOW92bl9jb250cm9sbGVyL25wMDAwNTU0ODc5\nOC5pbnRlcm5hbGFwaS5vb28udGVzdEBPT08uVEVTVKBYBgYrBgEFAgKgTjBMoAob\nCE9PTy5URVNUoT4wPKADAgEBoTUwMxsOb3ZuX2NvbnRyb2xsZXIbIW5wMDAwNTU0\nODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDANBgkqhkiG9w0BAQsFAAOCAYEAFA2m\nsBjmXFm0DpMiSvB99P7qXCIWfVFfkefwcGLT9qujO6sKfo+7XHpjC8C/uqUsFJka\nUAXO06Ld3yh2o7RD2RjR3OMhKdv+qusy3Q8YQTldSMYy+sReQzu/a41y4qnXHQRV\n0Sg9KNX1bKJanZJTeso07PDJQjUR+qJFLSiek0uS3U1nG8qg6zu9o1MW4C1aM2/B\nFf4Li0KGfr9dgWhQ5YuRXfxN0lAU/oj2idK4an1c8XKar/P018b+qcX2Zs2r/gi8\nNP3suTjPNVKtfSejEX7sdzk06yDVHjiHEc1iyDbc1HufyYTQwCLwpk2bsgj/77Q7\nBMMmW3CXdo2ZUp8g5gxbRhmo8U4SgbVzF1MOc+sC5dJQq3lEnxr01vL5aJwGdyOD\nuVLaEPz52JyTsXwI/zg2SVwP5xdLvaMSCbAAvJeZJnnSS0NeRcbjxTy2aI6olDIk\nhG0a4XMtkiROd4Maz65udfLRA8h1yWCNrU3HuTRNNGBaROAl0yRrfxUJmy3F\n-----END CERTIFICATE-----\n","key_checked":true}
Dec 6 03:09:43 localhost certmonger[37540]: "
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Issued certificate is "-----BEGIN CERTIFICATE-----
Dec 6 03:09:43 localhost certmonger[37540]: MIIFaTCCA9GgAwIBAgIBQzANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u
Dec 6 03:09:43 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4
Dec 6 03:09:43 localhost certmonger[37540]: MDk0M1oXDTI3MTIwNzA4MDk0M1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV
Dec 6 03:09:43 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI
Dec 6 03:09:43 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBANb3otBCo47PnJPudj/IpLOBmcR2858ccXfx
Dec 6 03:09:43 localhost certmonger[37540]: +80VZIvhJNc0nm5jVMmtC2foxV+/7qRkutceB7bzH/81nBtPmT2khOp+TDI0Ir3q
Dec 6 03:09:43 localhost certmonger[37540]: CgCfLEIfXg3THTPY2RpnqBR6n7bvkFgGiGvNiBQi9yQQarJrFMRWoFvnRyBklWOV
Dec 6 03:09:43 localhost certmonger[37540]: OeDQb5akatpNh21+VYkiGjUHJ3LjvhvdClfcl7BhQx2rgw131ig5m7Ghm/vt4IdY
Dec 6 03:09:43 localhost certmonger[37540]: wKhmEILqly9VsunLo9Dr0LnheBcJYwWhxCjss0qV8ODf32o8dHx5oULIx198CfmB
Dec 6 03:09:43 localhost certmonger[37540]: 2v8r3aSA6XKyeKlz2lZLWYsHWIIuMBSaf4Kljl7vG5gxMZS54nMCAwEAAaOCAfow
Dec 6 03:09:43 localhost certmonger[37540]: ggH2MB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB
Dec 6 03:09:43 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw
Dec 6 03:09:43 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw
Dec 6 03:09:43 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js
Dec 6 03:09:43 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD
Dec 6 03:09:43 localhost certmonger[37540]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFIr4YgUbOPRrcrqQ94dhJSCS
Dec 6 03:09:43 localhost certmonger[37540]: LJhqMIHTBgNVHREEgcswgciCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u
Dec 6 03:09:43 localhost certmonger[37540]: dGVzdKBJBgorBgEEAYI3FAIDoDsMOW92bl9jb250cm9sbGVyL25wMDAwNTU0ODc5
Dec 6 03:09:43 localhost certmonger[37540]: OC5pbnRlcm5hbGFwaS5vb28udGVzdEBPT08uVEVTVKBYBgYrBgEFAgKgTjBMoAob
Dec 6 03:09:43 localhost certmonger[37540]: CE9PTy5URVNUoT4wPKADAgEBoTUwMxsOb3ZuX2NvbnRyb2xsZXIbIW5wMDAwNTU0
Dec 6 03:09:43 localhost certmonger[37540]: ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDANBgkqhkiG9w0BAQsFAAOCAYEAFA2m
Dec 6 03:09:43 localhost certmonger[37540]: sBjmXFm0DpMiSvB99P7qXCIWfVFfkefwcGLT9qujO6sKfo+7XHpjC8C/uqUsFJka
Dec 6 03:09:43 localhost certmonger[37540]: UAXO06Ld3yh2o7RD2RjR3OMhKdv+qusy3Q8YQTldSMYy+sReQzu/a41y4qnXHQRV
Dec 6 03:09:43 localhost certmonger[37540]: 0Sg9KNX1bKJanZJTeso07PDJQjUR+qJFLSiek0uS3U1nG8qg6zu9o1MW4C1aM2/B
Dec 6 03:09:43 localhost certmonger[37540]: Ff4Li0KGfr9dgWhQ5YuRXfxN0lAU/oj2idK4an1c8XKar/P018b+qcX2Zs2r/gi8
Dec 6 03:09:43 localhost certmonger[37540]: NP3suTjPNVKtfSejEX7sdzk06yDVHjiHEc1iyDbc1HufyYTQwCLwpk2bsgj/77Q7
Dec 6 03:09:43 localhost certmonger[37540]: BMMmW3CXdo2ZUp8g5gxbRhmo8U4SgbVzF1MOc+sC5dJQq3lEnxr01vL5aJwGdyOD
Dec 6 03:09:43 localhost certmonger[37540]: uVLaEPz52JyTsXwI/zg2SVwP5xdLvaMSCbAAvJeZJnnSS0NeRcbjxTy2aI6olDIk
Dec 6 03:09:43 localhost certmonger[37540]: hG0a4XMtkiROd4Maz65udfLRA8h1yWCNrU3HuTRNNGBaROAl0yRrfxUJmy3F
Dec 6 03:09:43 localhost certmonger[37540]: -----END CERTIFICATE-----
Dec 6 03:09:43 localhost certmonger[37540]: ".
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Certificate issued (0 chain certificates, 0 roots).
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] No hooks set for pre-save command.
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943 Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943 Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943 Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943 Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943 Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943 Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943 Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943 Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] No hooks set for post-save command. Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943 Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943 Dec 6 03:09:43 localhost certmonger[42180]: Certificate in file "/etc/pki/tls/certs/ovn_controller.crt" issued by CA and saved. 
Dec 6 03:09:43 localhost certmonger[37540]: 2025-12-06 08:09:43 [37540] Wrote to /var/lib/certmonger/requests/20251206080943 Dec 6 03:09:44 localhost python3[42196]: ansible-setup Invoked with gather_subset=['min'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 6 03:09:45 localhost python3[42216]: ansible-ansible.legacy.dnf Invoked with name=['python3-pyasn1', 'python3-cryptography', 'python3-dbus'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 6 03:09:48 localhost python3[42233]: ansible-ansible.legacy.dnf Invoked with name=['certmonger'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 6 03:09:51 localhost python3[42250]: ansible-file Invoked with name=/etc/certmonger//pre-scripts owner=root group=root mode=0700 state=directory path=/etc/certmonger//pre-scripts recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:09:51 
localhost python3[42266]: ansible-file Invoked with name=/etc/certmonger//post-scripts owner=root group=root mode=0700 state=directory path=/etc/certmonger//post-scripts recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:09:52 localhost python3[42282]: ansible-ansible.legacy.systemd Invoked with name=certmonger state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:09:52 localhost python3[42300]: ansible-certificate_request Invoked with name=ovn_metadata dns=['np0005548798.internalapi.ooo.test'] principal=['ovn_metadata/np0005548798.internalapi.ooo.test@OOO.TEST'] directory=/etc/pki/tls key_size=2048 wait=True ca=ipa __header=##012# Ansible managed#012##012 provider_config_directory=/etc/certmonger provider=certmonger key_usage=['digitalSignature', 'keyEncipherment'] extended_key_usage=['id-kp-serverAuth', 'id-kp-clientAuth'] auto_renew=True ip=None email=None common_name=None country=None state=None locality=None organization=None organizational_unit=None contact_email=None owner=None group=None run_before=None run_after=None Dec 6 03:09:52 localhost certmonger[37540]: 2025-12-06 08:09:52 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:52 localhost certmonger[37540]: 2025-12-06 08:09:52 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:52 localhost certmonger[37540]: 2025-12-06 08:09:52 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:52 localhost certmonger[37540]: 2025-12-06 08:09:52 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:52 localhost certmonger[37540]: 2025-12-06 08:09:52 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:52 
localhost certmonger[37540]: 2025-12-06 08:09:52 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:52 localhost certmonger[37540]: 2025-12-06 08:09:52 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:52 localhost certmonger[37540]: 2025-12-06 08:09:52 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:52 localhost certmonger[37540]: 2025-12-06 08:09:52 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[42310]: 2025-12-06 08:09:53 [42310] Setting "CERTMONGER_REQ_SUBJECT" to "CN=np0005548798.internalapi.ooo.test" for child. Dec 6 03:09:53 localhost certmonger[42310]: 2025-12-06 08:09:53 [42310] Setting "CERTMONGER_REQ_HOSTNAME" to "np0005548798.internalapi.ooo.test Dec 6 03:09:53 localhost certmonger[42310]: " for child. 
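The ansible tasks above install python3-cryptography and python3-pyasn1 before invoking certificate_request; that same `cryptography` package can decode the PEM files certmonger writes under /etc/pki/tls/certs/. A minimal sketch of loading and inspecting such a file — since the host's real certificate is not reproducible here, a throwaway self-signed stand-in with the same CN is generated first:

```python
# Sketch: inspect a PEM certificate with the `cryptography` package installed
# by the dnf task above. The real files live under /etc/pki/tls/certs/ on the
# host; here a throwaway self-signed certificate stands in so the example is
# self-contained.
import datetime

from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name(
    [x509.NameAttribute(NameOID.COMMON_NAME, "np0005548798.internalapi.ooo.test")]
)
now = datetime.datetime.now(datetime.timezone.utc)
cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)  # self-signed stand-in; the real issuer is the IPA CA
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=731))  # ~2 years, like the cert above
    .sign(key, hashes.SHA256())
)

pem = cert.public_bytes(serialization.Encoding.PEM)
loaded = x509.load_pem_x509_certificate(pem)
print(loaded.subject.rfc4514_string())
```

On the deployed node the same `load_pem_x509_certificate` call would be fed the bytes of /etc/pki/tls/certs/ovn_controller.crt instead of the generated stand-in.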
Dec 6 03:09:53 localhost certmonger[42310]: 2025-12-06 08:09:53 [42310] Setting "CERTMONGER_REQ_PRINCIPAL" to "ovn_metadata/np0005548798.internalapi.ooo.test@OOO.TEST Dec 6 03:09:53 localhost certmonger[42310]: " for child. Dec 6 03:09:53 localhost certmonger[42310]: 2025-12-06 08:09:53 [42310] Setting "CERTMONGER_OPERATION" to "SUBMIT" for child. Dec 6 03:09:53 localhost certmonger[42310]: 2025-12-06 08:09:53 [42310] Setting "CERTMONGER_CSR" to "-----BEGIN CERTIFICATE REQUEST----- Dec 6 03:09:53 localhost certmonger[42310]: MIID4jCCAsoCAQAwLDEqMCgGA1UEAxMhbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBp Dec 6 03:09:53 localhost certmonger[42310]: Lm9vby50ZXN0MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAv7FfQ4OD Dec 6 03:09:53 localhost certmonger[42310]: +gwDXftIZrK7aLYOziGUlUbBejl5T3TNtCb433TqF1USXOaAozuDKJcS53RoCNtW Dec 6 03:09:53 localhost certmonger[42310]: ymLXqIu8rVPhI7A0x2WmjGO9Pa0OJQMjGrjjVZlNhw+kpiE3sTcdES3Lx5JK2Gis Dec 6 03:09:53 localhost certmonger[42310]: 7HZaN6Kb3pivClTdkmx1a/4IYgyXwxbOi1CDJuDG8iqTQYcnbCJ5QIzCoeHFzXG/ Dec 6 03:09:53 localhost certmonger[42310]: 6yANv4lLeKo0+Dx7vY4QlQrIPbgqMhVk0DS9y8jBPgXdxUAmDEBDk+TlhtaD7Tz6 Dec 6 03:09:53 localhost certmonger[42310]: IXdyWic+RhKhz9V+KFXsa+c6qFW3ULQxhClDjPfBgu+XqAO9nJJMs+kpypIY2Ia3 Dec 6 03:09:53 localhost certmonger[42310]: Cti1T8xM6UqEdQIDAQABoIIBbzArBgkqhkiG9w0BCRQxHh4cADIAMAAyADUAMQAy Dec 6 03:09:53 localhost certmonger[42310]: ADAANgAwADgAMAA5ADUAMjCCAT4GCSqGSIb3DQEJDjGCAS8wggErMAsGA1UdDwQE Dec 6 03:09:53 localhost certmonger[42310]: AwIFoDCBzwYDVR0RBIHHMIHEgiFucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29v Dec 6 03:09:53 localhost certmonger[42310]: LnRlc3SgRwYKKwYBBAGCNxQCA6A5DDdvdm5fbWV0YWRhdGEvbnAwMDA1NTQ4Nzk4 Dec 6 03:09:53 localhost certmonger[42310]: LmludGVybmFsYXBpLm9vby50ZXN0QE9PTy5URVNUoFYGBisGAQUCAqBMMEqgChsI Dec 6 03:09:53 localhost certmonger[42310]: T09PLlRFU1ShPDA6oAMCAQGhMzAxGwxvdm5fbWV0YWRhdGEbIW5wMDAwNTU0ODc5 Dec 6 03:09:53 localhost certmonger[42310]: 
OC5pbnRlcm5hbGFwaS5vb28udGVzdDAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYB Dec 6 03:09:53 localhost certmonger[42310]: BQUHAwIwDAYDVR0TAQH/BAIwADAdBgNVHQ4EFgQUrg6jAR67FWHQjwZ2sU0GQrIZ Dec 6 03:09:53 localhost certmonger[42310]: xoEwDQYJKoZIhvcNAQELBQADggEBAHbM0KjhujQrMNfIx5jjHsixaFT+v0/d8elq Dec 6 03:09:53 localhost certmonger[42310]: 8akYlq4CNxDpJ4fsdXVa3TklvxfXo6t5jl8RkqlWzCsK/8bPooevfNm9wD4iI6Wq Dec 6 03:09:53 localhost certmonger[42310]: cqsrcHIFAgL9KYK4kXSdvEe9xtvWKCH/g5OSA80F7Wovhfcz0nZhvqoyZWGaH87Z Dec 6 03:09:53 localhost certmonger[42310]: gkZ0RFGLTnGErgKqT61yZFP73TDQiQDcKiHLmt2KRwYp80QfSTDv3Bqcosyd+qXA Dec 6 03:09:53 localhost certmonger[42310]: TYV0L7u+UryTYGWfh4dMnjTsSawCVVgyPoiZYav/C68WWGwTWg/biDOxawvnAGkP Dec 6 03:09:53 localhost certmonger[42310]: nyM2T5U+URGixkIGXSUNJbul4XzeqctPQRoGnPyxWKtNw8LWF9g= Dec 6 03:09:53 localhost certmonger[42310]: -----END CERTIFICATE REQUEST----- Dec 6 03:09:53 localhost certmonger[42310]: " for child. Dec 6 03:09:53 localhost certmonger[42310]: 2025-12-06 08:09:53 [42310] Setting "CERTMONGER_SPKAC" to "MIICQDCCASgwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC/sV9Dg4P6DANd+0hmsrtotg7OIZSVRsF6OXlPdM20JvjfdOoXVRJc5oCjO4MolxLndGgI21bKYteoi7ytU+EjsDTHZaaMY709rQ4lAyMauONVmU2HD6SmITexNx0RLcvHkkrYaKzsdlo3opvemK8KVN2SbHVr/ghiDJfDFs6LUIMm4MbyKpNBhydsInlAjMKh4cXNcb/rIA2/iUt4qjT4PHu9jhCVCsg9uCoyFWTQNL3LyME+Bd3FQCYMQEOT5OWG1oPtPPohd3JaJz5GEqHP1X4oVexr5zqoVbdQtDGEKUOM98GC75eoA72ckkyz6SnKkhjYhrcK2LVPzEzpSoR1AgMBAAEWADANBgkqhkiG9w0BAQsFAAOCAQEAWa0tuAb11CscHz8MLsXRR7T4jEVtW5sZnNyIKAuM/cHfyhkvXuiUYc8QP7+6ZEH/7DUmoulV1qa/qKmrgoYxNQ+wEXC5K4cFGMH5pKauJxqBlwlO/QNrS82ZWN6f30NHyJiA01lPJKdguYJD+IWSMOhbsaHrB7/Nyi5s7P5QGxg5mRrzD5PYf0klKpFBV61fkBgsENgvGVkQdZxOkNAe6AmGlLCMkwy52ZmIhXhS4Cukk9E+35ry2T2IKYfM8I5flevd6wSzoDAt8uGcGzo7wFHx/YbiR15e/GpwcUdLDVHuFy96e851HaI/DvNmWhPrvr7fOCbx7WBNESPO2APK9g==" for child. 
Dec 6 03:09:53 localhost certmonger[42310]: 2025-12-06 08:09:53 [42310] Setting "CERTMONGER_SPKI" to "MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAv7FfQ4OD+gwDXftIZrK7aLYOziGUlUbBejl5T3TNtCb433TqF1USXOaAozuDKJcS53RoCNtWymLXqIu8rVPhI7A0x2WmjGO9Pa0OJQMjGrjjVZlNhw+kpiE3sTcdES3Lx5JK2Gis7HZaN6Kb3pivClTdkmx1a/4IYgyXwxbOi1CDJuDG8iqTQYcnbCJ5QIzCoeHFzXG/6yANv4lLeKo0+Dx7vY4QlQrIPbgqMhVk0DS9y8jBPgXdxUAmDEBDk+TlhtaD7Tz6IXdyWic+RhKhz9V+KFXsa+c6qFW3ULQxhClDjPfBgu+XqAO9nJJMs+kpypIY2Ia3Cti1T8xM6UqEdQIDAQAB" for child. Dec 6 03:09:53 localhost certmonger[42310]: 2025-12-06 08:09:53 [42310] Setting "CERTMONGER_LOCAL_CA_DIR" to "/var/lib/certmonger/local" for child. Dec 6 03:09:53 localhost certmonger[42310]: 2025-12-06 08:09:53 [42310] Setting "CERTMONGER_KEY_TYPE" to "RSA" for child. Dec 6 03:09:53 localhost certmonger[42310]: 2025-12-06 08:09:53 [42310] Setting "CERTMONGER_CA_NICKNAME" to "IPA" for child. Dec 6 03:09:53 localhost certmonger[42310]: 2025-12-06 08:09:53 [42310] Redirecting stdin to /dev/null, leaving stdout and stderr open for child "/usr/libexec/certmonger/ipa-submit". Dec 6 03:09:53 localhost certmonger[42310]: 2025-12-06 08:09:53 [42310] Running enrollment helper "/usr/libexec/certmonger/ipa-submit". Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[42310]: Submitting request to "https://ipa.ooo.test/ipa/json". 
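The CERTMONGER_CSR value handed to the ipa-submit helper above is a PKCS#10 request carrying the CN, a DNS SAN, keyUsage, and EKUs matching the certificate_request parameters logged earlier (key_size=2048, digitalSignature/keyEncipherment, id-kp-serverAuth/id-kp-clientAuth). A hedged sketch of an equivalent CSR built with python3-cryptography — the IPA-specific Kerberos-principal otherName present in the real CSR is omitted because it needs hand-encoded DER:

```python
# Sketch: a CSR comparable to the one certmonger submits above. Covers CN,
# the DNS SAN, keyUsage and EKU from the ansible certificate_request task;
# the Kerberos-principal otherName SAN is intentionally left out.
from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import ExtendedKeyUsageOID, NameOID

host = "np0005548798.internalapi.ooo.test"
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # key_size=2048 as requested
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, host)]))
    .add_extension(x509.SubjectAlternativeName([x509.DNSName(host)]), critical=False)
    .add_extension(
        x509.KeyUsage(
            digital_signature=True, key_encipherment=True, content_commitment=False,
            data_encipherment=False, key_agreement=False, key_cert_sign=False,
            crl_sign=False, encipher_only=False, decipher_only=False,
        ),
        critical=True,
    )
    .add_extension(
        x509.ExtendedKeyUsage(
            [ExtendedKeyUsageOID.SERVER_AUTH, ExtendedKeyUsageOID.CLIENT_AUTH]
        ),
        critical=False,
    )
    .sign(key, hashes.SHA256())
)
print(csr.is_signature_valid)
```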
Dec 6 03:09:53 localhost certmonger[42310]: Certificate: "MIIFZTCCA82gAwIBAgIBRjANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08uVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4MDk1M1oXDTI3MTIwNzA4MDk1M1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNVBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAL+xX0ODg/oMA137SGayu2i2Ds4hlJVGwXo5eU90zbQm+N906hdVElzmgKM7gyiXEud0aAjbVspi16iLvK1T4SOwNMdlpoxjvT2tDiUDIxq441WZTYcPpKYhN7E3HREty8eSSthorOx2Wjeim96YrwpU3ZJsdWv+CGIMl8MWzotQgybgxvIqk0GHJ2wieUCMwqHhxc1xv+sgDb+JS3iqNPg8e72OEJUKyD24KjIVZNA0vcvIwT4F3cVAJgxAQ5Pk5YbWg+08+iF3clonPkYSoc/VfihV7GvnOqhVt1C0MYQpQ4z3wYLvl6gDvZySTLPpKcqSGNiGtwrYtU/MTOlKhHUCAwEAAaOCAfYwggHyMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEBBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3NwMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwcwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3JsL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVDZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFAfbTMF3CjiVIQDPyk/7wrMm0AatMIHPBgNVHREEgccwgcSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdKBHBgorBgEEAYI3FAIDoDkMN292bl9tZXRhZGF0YS9ucDAwMDU1NDg3OTguaW50ZXJuYWxhcGkub29vLnRlc3RAT09PLlRFU1SgVgYGKwYBBQICoEwwSqAKGwhPT08uVEVTVKE8MDqgAwIBAaEzMDEbDG92bl9tZXRhZGF0YRshbnAwMDA1NTQ4Nzk4LmludGVybmFsYXBpLm9vby50ZXN0MA0GCSqGSIb3DQEBCwUAA4IBgQA6EVWv+w3/z/i5NzjrCOLldBvRIS2EbddMA93AAZ5zwDSIul2Es42qulnNi5IoXUZ4L1RzXt6mmcaSmNZm3ptUJYfnN/rDTHfCLI9tqdkSbw7jIR5+DzBbAjbGpvraNIyrImftpa1j0+dJEnt/J6X1gt8cZigCOsnGE21XBltT12m6AQNYiUDk4qnhCeNeZUR/cACTD6kZ8IkSgbCKQwGi5nos/FL8xKAf677yWLmfJJ5fSeqsA2WdsXcI6ffA6NBQqBRpIj83kfrZBPpLn0aDmMU2wEQfKlMEFIa47ue1RFE4cnTNS69U4R7ojiuOqpC6LDiMZBHibuvlhnOkYtdpkeg9Hoy8288deXkCUi8jrf3sKaj6wxzc2E1VKuv7cGtQA+2koxKIxPBR4Gudc9m2RzSfXfwYX9AKA68uLGwE3g1XMFuCz/6POBf65sL2aTB2mstvmeW/javTQqJi3ir2Z1QBaJra0WCg3qsd2bol2KtU+s0LFZX12MabBdNt9jY=" Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Certificate 
submission still ongoing. Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Certificate submission attempt complete. Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Child status = 0. Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Child output: Dec 6 03:09:53 localhost certmonger[37540]: "-----BEGIN CERTIFICATE----- Dec 6 03:09:53 localhost certmonger[37540]: MIIFZTCCA82gAwIBAgIBRjANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u Dec 6 03:09:53 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4 Dec 6 03:09:53 localhost certmonger[37540]: MDk1M1oXDTI3MTIwNzA4MDk1M1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV Dec 6 03:09:53 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI Dec 6 03:09:53 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBAL+xX0ODg/oMA137SGayu2i2Ds4hlJVGwXo5 Dec 6 03:09:53 localhost certmonger[37540]: eU90zbQm+N906hdVElzmgKM7gyiXEud0aAjbVspi16iLvK1T4SOwNMdlpoxjvT2t Dec 6 03:09:53 localhost certmonger[37540]: DiUDIxq441WZTYcPpKYhN7E3HREty8eSSthorOx2Wjeim96YrwpU3ZJsdWv+CGIM Dec 6 03:09:53 localhost certmonger[37540]: l8MWzotQgybgxvIqk0GHJ2wieUCMwqHhxc1xv+sgDb+JS3iqNPg8e72OEJUKyD24 Dec 6 03:09:53 localhost certmonger[37540]: KjIVZNA0vcvIwT4F3cVAJgxAQ5Pk5YbWg+08+iF3clonPkYSoc/VfihV7GvnOqhV Dec 6 03:09:53 localhost certmonger[37540]: t1C0MYQpQ4z3wYLvl6gDvZySTLPpKcqSGNiGtwrYtU/MTOlKhHUCAwEAAaOCAfYw Dec 6 03:09:53 localhost certmonger[37540]: ggHyMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB Dec 6 03:09:53 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw Dec 6 03:09:53 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw Dec 6 03:09:53 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js Dec 6 03:09:53 localhost certmonger[37540]: 
L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD Dec 6 03:09:53 localhost certmonger[37540]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFAfbTMF3CjiVIQDPyk/7wrMm Dec 6 03:09:53 localhost certmonger[37540]: 0AatMIHPBgNVHREEgccwgcSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u Dec 6 03:09:53 localhost certmonger[37540]: dGVzdKBHBgorBgEEAYI3FAIDoDkMN292bl9tZXRhZGF0YS9ucDAwMDU1NDg3OTgu Dec 6 03:09:53 localhost certmonger[37540]: aW50ZXJuYWxhcGkub29vLnRlc3RAT09PLlRFU1SgVgYGKwYBBQICoEwwSqAKGwhP Dec 6 03:09:53 localhost certmonger[37540]: T08uVEVTVKE8MDqgAwIBAaEzMDEbDG92bl9tZXRhZGF0YRshbnAwMDA1NTQ4Nzk4 Dec 6 03:09:53 localhost certmonger[37540]: LmludGVybmFsYXBpLm9vby50ZXN0MA0GCSqGSIb3DQEBCwUAA4IBgQA6EVWv+w3/ Dec 6 03:09:53 localhost certmonger[37540]: z/i5NzjrCOLldBvRIS2EbddMA93AAZ5zwDSIul2Es42qulnNi5IoXUZ4L1RzXt6m Dec 6 03:09:53 localhost certmonger[37540]: mcaSmNZm3ptUJYfnN/rDTHfCLI9tqdkSbw7jIR5+DzBbAjbGpvraNIyrImftpa1j Dec 6 03:09:53 localhost certmonger[37540]: 0+dJEnt/J6X1gt8cZigCOsnGE21XBltT12m6AQNYiUDk4qnhCeNeZUR/cACTD6kZ Dec 6 03:09:53 localhost certmonger[37540]: 8IkSgbCKQwGi5nos/FL8xKAf677yWLmfJJ5fSeqsA2WdsXcI6ffA6NBQqBRpIj83 Dec 6 03:09:53 localhost certmonger[37540]: kfrZBPpLn0aDmMU2wEQfKlMEFIa47ue1RFE4cnTNS69U4R7ojiuOqpC6LDiMZBHi Dec 6 03:09:53 localhost certmonger[37540]: buvlhnOkYtdpkeg9Hoy8288deXkCUi8jrf3sKaj6wxzc2E1VKuv7cGtQA+2koxKI Dec 6 03:09:53 localhost certmonger[37540]: xPBR4Gudc9m2RzSfXfwYX9AKA68uLGwE3g1XMFuCz/6POBf65sL2aTB2mstvmeW/ Dec 6 03:09:53 localhost certmonger[37540]: javTQqJi3ir2Z1QBaJra0WCg3qsd2bol2KtU+s0LFZX12MabBdNt9jY= Dec 6 03:09:53 localhost certmonger[37540]: -----END CERTIFICATE----- Dec 6 03:09:53 localhost certmonger[37540]: " Dec 6 03:09:53 localhost certmonger[42312]: 2025-12-06 08:09:53 [42312] Postprocessing output "-----BEGIN CERTIFICATE----- Dec 6 03:09:53 localhost certmonger[42312]: MIIFZTCCA82gAwIBAgIBRjANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u Dec 6 03:09:53 localhost certmonger[42312]: 
VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4 Dec 6 03:09:53 localhost certmonger[42312]: MDk1M1oXDTI3MTIwNzA4MDk1M1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV Dec 6 03:09:53 localhost certmonger[42312]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI Dec 6 03:09:53 localhost certmonger[42312]: hvcNAQEBBQADggEPADCCAQoCggEBAL+xX0ODg/oMA137SGayu2i2Ds4hlJVGwXo5 Dec 6 03:09:53 localhost certmonger[42312]: eU90zbQm+N906hdVElzmgKM7gyiXEud0aAjbVspi16iLvK1T4SOwNMdlpoxjvT2t Dec 6 03:09:53 localhost certmonger[42312]: DiUDIxq441WZTYcPpKYhN7E3HREty8eSSthorOx2Wjeim96YrwpU3ZJsdWv+CGIM Dec 6 03:09:53 localhost certmonger[42312]: l8MWzotQgybgxvIqk0GHJ2wieUCMwqHhxc1xv+sgDb+JS3iqNPg8e72OEJUKyD24 Dec 6 03:09:53 localhost certmonger[42312]: KjIVZNA0vcvIwT4F3cVAJgxAQ5Pk5YbWg+08+iF3clonPkYSoc/VfihV7GvnOqhV Dec 6 03:09:53 localhost certmonger[42312]: t1C0MYQpQ4z3wYLvl6gDvZySTLPpKcqSGNiGtwrYtU/MTOlKhHUCAwEAAaOCAfYw Dec 6 03:09:53 localhost certmonger[42312]: ggHyMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB Dec 6 03:09:53 localhost certmonger[42312]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw Dec 6 03:09:53 localhost certmonger[42312]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw Dec 6 03:09:53 localhost certmonger[42312]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js Dec 6 03:09:53 localhost certmonger[42312]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD Dec 6 03:09:53 localhost certmonger[42312]: ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFAfbTMF3CjiVIQDPyk/7wrMm Dec 6 03:09:53 localhost certmonger[42312]: 0AatMIHPBgNVHREEgccwgcSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u Dec 6 03:09:53 localhost certmonger[42312]: dGVzdKBHBgorBgEEAYI3FAIDoDkMN292bl9tZXRhZGF0YS9ucDAwMDU1NDg3OTgu Dec 6 03:09:53 localhost certmonger[42312]: aW50ZXJuYWxhcGkub29vLnRlc3RAT09PLlRFU1SgVgYGKwYBBQICoEwwSqAKGwhP Dec 6 03:09:53 localhost certmonger[42312]: 
T08uVEVTVKE8MDqgAwIBAaEzMDEbDG92bl9tZXRhZGF0YRshbnAwMDA1NTQ4Nzk4 Dec 6 03:09:53 localhost certmonger[42312]: LmludGVybmFsYXBpLm9vby50ZXN0MA0GCSqGSIb3DQEBCwUAA4IBgQA6EVWv+w3/ Dec 6 03:09:53 localhost certmonger[42312]: z/i5NzjrCOLldBvRIS2EbddMA93AAZ5zwDSIul2Es42qulnNi5IoXUZ4L1RzXt6m Dec 6 03:09:53 localhost certmonger[42312]: mcaSmNZm3ptUJYfnN/rDTHfCLI9tqdkSbw7jIR5+DzBbAjbGpvraNIyrImftpa1j Dec 6 03:09:53 localhost certmonger[42312]: 0+dJEnt/J6X1gt8cZigCOsnGE21XBltT12m6AQNYiUDk4qnhCeNeZUR/cACTD6kZ Dec 6 03:09:53 localhost certmonger[42312]: 8IkSgbCKQwGi5nos/FL8xKAf677yWLmfJJ5fSeqsA2WdsXcI6ffA6NBQqBRpIj83 Dec 6 03:09:53 localhost certmonger[42312]: kfrZBPpLn0aDmMU2wEQfKlMEFIa47ue1RFE4cnTNS69U4R7ojiuOqpC6LDiMZBHi Dec 6 03:09:53 localhost certmonger[42312]: buvlhnOkYtdpkeg9Hoy8288deXkCUi8jrf3sKaj6wxzc2E1VKuv7cGtQA+2koxKI Dec 6 03:09:53 localhost certmonger[42312]: xPBR4Gudc9m2RzSfXfwYX9AKA68uLGwE3g1XMFuCz/6POBf65sL2aTB2mstvmeW/ Dec 6 03:09:53 localhost certmonger[42312]: javTQqJi3ir2Z1QBaJra0WCg3qsd2bol2KtU+s0LFZX12MabBdNt9jY= Dec 6 03:09:53 localhost certmonger[42312]: -----END CERTIFICATE----- Dec 6 03:09:53 localhost certmonger[42312]: ". Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Certificate submission still ongoing. Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Certificate submission postprocessing complete. Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Child status = 0. 
Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Child output: Dec 6 03:09:53 localhost certmonger[37540]: "{"certificate":"-----BEGIN CERTIFICATE-----\nMIIFZTCCA82gAwIBAgIBRjANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u\nVEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4\nMDk1M1oXDTI3MTIwNzA4MDk1M1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV\nBAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI\nhvcNAQEBBQADggEPADCCAQoCggEBAL+xX0ODg/oMA137SGayu2i2Ds4hlJVGwXo5\neU90zbQm+N906hdVElzmgKM7gyiXEud0aAjbVspi16iLvK1T4SOwNMdlpoxjvT2t\nDiUDIxq441WZTYcPpKYhN7E3HREty8eSSthorOx2Wjeim96YrwpU3ZJsdWv+CGIM\nl8MWzotQgybgxvIqk0GHJ2wieUCMwqHhxc1xv+sgDb+JS3iqNPg8e72OEJUKyD24\nKjIVZNA0vcvIwT4F3cVAJgxAQ5Pk5YbWg+08+iF3clonPkYSoc/VfihV7GvnOqhV\nt1C0MYQpQ4z3wYLvl6gDvZySTLPpKcqSGNiGtwrYtU/MTOlKhHUCAwEAAaOCAfYw\nggHyMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB\nBC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw\nMA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw\ncwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js\nL01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD\nZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFAfbTMF3CjiVIQDPyk/7wrMm\n0AatMIHPBgNVHREEgccwgcSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u\ndGVzdKBHBgorBgEEAYI3FAIDoDkMN292bl9tZXRhZGF0YS9ucDAwMDU1NDg3OTgu\naW50ZXJuYWxhcGkub29vLnRlc3RAT09PLlRFU1SgVgYGKwYBBQICoEwwSqAKGwhP\nT08uVEVTVKE8MDqgAwIBAaEzMDEbDG92bl9tZXRhZGF0YRshbnAwMDA1NTQ4Nzk4\nLmludGVybmFsYXBpLm9vby50ZXN0MA0GCSqGSIb3DQEBCwUAA4IBgQA6EVWv+w3/\nz/i5NzjrCOLldBvRIS2EbddMA93AAZ5zwDSIul2Es42qulnNi5IoXUZ4L1RzXt6m\nmcaSmNZm3ptUJYfnN/rDTHfCLI9tqdkSbw7jIR5+DzBbAjbGpvraNIyrImftpa1j\n0+dJEnt/J6X1gt8cZigCOsnGE21XBltT12m6AQNYiUDk4qnhCeNeZUR/cACTD6kZ\n8IkSgbCKQwGi5nos/FL8xKAf677yWLmfJJ5fSeqsA2WdsXcI6ffA6NBQqBRpIj83\nkfrZBPpLn0aDmMU2wEQfKlMEFIa47ue1RFE4cnTNS69U4R7ojiuOqpC6LDiMZBHi\nbuvlhnOkYtdpkeg9Hoy8288deXkCUi8jrf3sKaj6wxzc2E1VKuv7cGtQA+2koxKI\nxPBR4Gudc9m2RzSfXfwYX9AKA68uLGwE3g1XMFuCz/
6POBf65sL2aTB2mstvmeW/\njavTQqJi3ir2Z1QBaJra0WCg3qsd2bol2KtU+s0LFZX12MabBdNt9jY=\n-----END CERTIFICATE-----\n","key_checked":true} Dec 6 03:09:53 localhost certmonger[37540]: " Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Issued certificate is "-----BEGIN CERTIFICATE----- Dec 6 03:09:53 localhost certmonger[37540]: MIIFZTCCA82gAwIBAgIBRjANBgkqhkiG9w0BAQsFADAzMREwDwYDVQQKDAhPT08u Dec 6 03:09:53 localhost certmonger[37540]: VEVTVDEeMBwGA1UEAwwVQ2VydGlmaWNhdGUgQXV0aG9yaXR5MB4XDTI1MTIwNjA4 Dec 6 03:09:53 localhost certmonger[37540]: MDk1M1oXDTI3MTIwNzA4MDk1M1owPzERMA8GA1UECgwIT09PLlRFU1QxKjAoBgNV Dec 6 03:09:53 localhost certmonger[37540]: BAMMIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28udGVzdDCCASIwDQYJKoZI Dec 6 03:09:53 localhost certmonger[37540]: hvcNAQEBBQADggEPADCCAQoCggEBAL+xX0ODg/oMA137SGayu2i2Ds4hlJVGwXo5 Dec 6 03:09:53 localhost certmonger[37540]: eU90zbQm+N906hdVElzmgKM7gyiXEud0aAjbVspi16iLvK1T4SOwNMdlpoxjvT2t Dec 6 03:09:53 localhost certmonger[37540]: DiUDIxq441WZTYcPpKYhN7E3HREty8eSSthorOx2Wjeim96YrwpU3ZJsdWv+CGIM Dec 6 03:09:53 localhost certmonger[37540]: l8MWzotQgybgxvIqk0GHJ2wieUCMwqHhxc1xv+sgDb+JS3iqNPg8e72OEJUKyD24 Dec 6 03:09:53 localhost certmonger[37540]: KjIVZNA0vcvIwT4F3cVAJgxAQ5Pk5YbWg+08+iF3clonPkYSoc/VfihV7GvnOqhV Dec 6 03:09:53 localhost certmonger[37540]: t1C0MYQpQ4z3wYLvl6gDvZySTLPpKcqSGNiGtwrYtU/MTOlKhHUCAwEAAaOCAfYw Dec 6 03:09:53 localhost certmonger[37540]: ggHyMB8GA1UdIwQYMBaAFIs3rIMoGWlNC4HmOkB57ac/y2b7MDoGCCsGAQUFBwEB Dec 6 03:09:53 localhost certmonger[37540]: BC4wLDAqBggrBgEFBQcwAYYeaHR0cDovL2lwYS1jYS5vb28udGVzdC9jYS9vY3Nw Dec 6 03:09:53 localhost certmonger[37540]: MA4GA1UdDwEB/wQEAwIE8DAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIw Dec 6 03:09:53 localhost certmonger[37540]: cwYDVR0fBGwwajBooDCgLoYsaHR0cDovL2lwYS1jYS5vb28udGVzdC9pcGEvY3Js Dec 6 03:09:53 localhost certmonger[37540]: L01hc3RlckNSTC5iaW6iNKQyMDAxDjAMBgNVBAoMBWlwYWNhMR4wHAYDVQQDDBVD Dec 6 03:09:53 localhost certmonger[37540]: 
ZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHQYDVR0OBBYEFAfbTMF3CjiVIQDPyk/7wrMm Dec 6 03:09:53 localhost certmonger[37540]: 0AatMIHPBgNVHREEgccwgcSCIW5wMDAwNTU0ODc5OC5pbnRlcm5hbGFwaS5vb28u Dec 6 03:09:53 localhost certmonger[37540]: dGVzdKBHBgorBgEEAYI3FAIDoDkMN292bl9tZXRhZGF0YS9ucDAwMDU1NDg3OTgu Dec 6 03:09:53 localhost certmonger[37540]: aW50ZXJuYWxhcGkub29vLnRlc3RAT09PLlRFU1SgVgYGKwYBBQICoEwwSqAKGwhP Dec 6 03:09:53 localhost certmonger[37540]: T08uVEVTVKE8MDqgAwIBAaEzMDEbDG92bl9tZXRhZGF0YRshbnAwMDA1NTQ4Nzk4 Dec 6 03:09:53 localhost certmonger[37540]: LmludGVybmFsYXBpLm9vby50ZXN0MA0GCSqGSIb3DQEBCwUAA4IBgQA6EVWv+w3/ Dec 6 03:09:53 localhost certmonger[37540]: z/i5NzjrCOLldBvRIS2EbddMA93AAZ5zwDSIul2Es42qulnNi5IoXUZ4L1RzXt6m Dec 6 03:09:53 localhost certmonger[37540]: mcaSmNZm3ptUJYfnN/rDTHfCLI9tqdkSbw7jIR5+DzBbAjbGpvraNIyrImftpa1j Dec 6 03:09:53 localhost certmonger[37540]: 0+dJEnt/J6X1gt8cZigCOsnGE21XBltT12m6AQNYiUDk4qnhCeNeZUR/cACTD6kZ Dec 6 03:09:53 localhost certmonger[37540]: 8IkSgbCKQwGi5nos/FL8xKAf677yWLmfJJ5fSeqsA2WdsXcI6ffA6NBQqBRpIj83 Dec 6 03:09:53 localhost certmonger[37540]: kfrZBPpLn0aDmMU2wEQfKlMEFIa47ue1RFE4cnTNS69U4R7ojiuOqpC6LDiMZBHi Dec 6 03:09:53 localhost certmonger[37540]: buvlhnOkYtdpkeg9Hoy8288deXkCUi8jrf3sKaj6wxzc2E1VKuv7cGtQA+2koxKI Dec 6 03:09:53 localhost certmonger[37540]: xPBR4Gudc9m2RzSfXfwYX9AKA68uLGwE3g1XMFuCz/6POBf65sL2aTB2mstvmeW/ Dec 6 03:09:53 localhost certmonger[37540]: javTQqJi3ir2Z1QBaJra0WCg3qsd2bol2KtU+s0LFZX12MabBdNt9jY= Dec 6 03:09:53 localhost certmonger[37540]: -----END CERTIFICATE----- Dec 6 03:09:53 localhost certmonger[37540]: ". Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Certificate issued (0 chain certificates, 0 roots). Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] No hooks set for pre-save command. 
Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] No hooks set for post-save command. Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:53 localhost certmonger[42316]: Certificate in file "/etc/pki/tls/certs/ovn_metadata.crt" issued by CA and saved. 
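With auto_renew=True in the request, certmonger keeps tracking the saved certificate and re-enrolls once a renewal window before notAfter is reached. The check reduces to a date comparison against the validity in the cert above (notBefore 2025-12-06 08:09:53Z, notAfter 2027-12-07 08:09:53Z); the 28-day window below is an illustrative assumption, not certmonger's actual configured value:

```python
# Sketch of the expiry check behind certmonger's auto_renew tracking.
# The 28-day window is an assumed illustration; certmonger's real trigger
# point is configuration-dependent.
import datetime

def needs_renewal(not_after, now, window=datetime.timedelta(days=28)):
    """True once `now` falls inside the renewal window before expiry."""
    return now >= not_after - window

# Validity taken from the ovn_metadata certificate issued above.
issued = datetime.datetime(2025, 12, 6, 8, 9, 53, tzinfo=datetime.timezone.utc)
expires = datetime.datetime(2027, 12, 7, 8, 9, 53, tzinfo=datetime.timezone.utc)

print(needs_renewal(expires, issued))                                 # freshly issued -> False
print(needs_renewal(expires, expires - datetime.timedelta(days=7)))   # inside window  -> True
```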
Dec 6 03:09:53 localhost certmonger[37540]: 2025-12-06 08:09:53 [37540] Wrote to /var/lib/certmonger/requests/20251206080952 Dec 6 03:09:54 localhost python3[42332]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:09:55 localhost ansible-async_wrapper.py[42504]: Invoked with 742139336412 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008595.3430808-110856-39420647260452/AnsiballZ_command.py _ Dec 6 03:09:55 localhost ansible-async_wrapper.py[42507]: Starting module and watcher Dec 6 03:09:55 localhost ansible-async_wrapper.py[42507]: Start watching 42508 (3600) Dec 6 03:09:55 localhost ansible-async_wrapper.py[42508]: Start module (42508) Dec 6 03:09:55 localhost ansible-async_wrapper.py[42504]: Return async_wrapper task started. Dec 6 03:09:56 localhost python3[42526]: ansible-ansible.legacy.async_status Invoked with jid=742139336412.42504 mode=status _async_dir=/tmp/.ansible_async Dec 6 03:10:00 localhost ansible-async_wrapper.py[42507]: 42508 still running (3600) Dec 6 03:10:01 localhost puppet-user[42528]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 6 03:10:01 localhost puppet-user[42528]: (file: /etc/puppet/hiera.yaml) Dec 6 03:10:01 localhost puppet-user[42528]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:10:01 localhost puppet-user[42528]: (file & line not available) Dec 6 03:10:01 localhost puppet-user[42528]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:10:01 localhost puppet-user[42528]: (file & line not available) Dec 6 03:10:01 localhost puppet-user[42528]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Dec 6 03:10:01 localhost puppet-user[42528]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Dec 6 03:10:01 localhost puppet-user[42528]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.12 seconds Dec 6 03:10:01 localhost puppet-user[42528]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully Dec 6 03:10:01 localhost puppet-user[42528]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created Dec 6 03:10:01 localhost puppet-user[42528]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully Dec 6 03:10:01 localhost puppet-user[42528]: Notice: Applied catalog in 0.05 seconds Dec 6 03:10:01 localhost puppet-user[42528]: Application: Dec 6 03:10:01 localhost puppet-user[42528]: Initial environment: production Dec 6 03:10:01 localhost puppet-user[42528]: Converged environment: production Dec 6 03:10:01 localhost puppet-user[42528]: Run mode: user Dec 6 03:10:01 localhost puppet-user[42528]: Changes: Dec 6 03:10:01 localhost puppet-user[42528]: Total: 3 Dec 6 03:10:01 localhost puppet-user[42528]: Events: Dec 6 03:10:01 localhost puppet-user[42528]: Success: 3 Dec 6 03:10:01 localhost puppet-user[42528]: Total: 3 Dec 6 03:10:01 localhost puppet-user[42528]: Resources: Dec 6 03:10:01 localhost puppet-user[42528]: Changed: 3 Dec 6 03:10:01 localhost puppet-user[42528]: Out of sync: 3 Dec 6 03:10:01 localhost puppet-user[42528]: Total: 10 Dec 6 03:10:01 localhost puppet-user[42528]: Time: Dec 6 03:10:01 localhost puppet-user[42528]: Schedule: 0.00 Dec 6 03:10:01 localhost puppet-user[42528]: File: 0.00 Dec 6 03:10:01 localhost 
puppet-user[42528]: Augeas: 0.02 Dec 6 03:10:01 localhost puppet-user[42528]: Exec: 0.02 Dec 6 03:10:01 localhost puppet-user[42528]: Transaction evaluation: 0.05 Dec 6 03:10:01 localhost puppet-user[42528]: Catalog application: 0.05 Dec 6 03:10:01 localhost puppet-user[42528]: Config retrieval: 0.16 Dec 6 03:10:01 localhost puppet-user[42528]: Last run: 1765008601 Dec 6 03:10:01 localhost puppet-user[42528]: Filebucket: 0.00 Dec 6 03:10:01 localhost puppet-user[42528]: Total: 0.05 Dec 6 03:10:01 localhost puppet-user[42528]: Version: Dec 6 03:10:01 localhost puppet-user[42528]: Config: 1765008601 Dec 6 03:10:01 localhost puppet-user[42528]: Puppet: 7.10.0 Dec 6 03:10:01 localhost ansible-async_wrapper.py[42508]: Module complete (42508) Dec 6 03:10:05 localhost ansible-async_wrapper.py[42507]: Done in kid B. Dec 6 03:10:06 localhost python3[42659]: ansible-ansible.legacy.async_status Invoked with jid=742139336412.42504 mode=status _async_dir=/tmp/.ansible_async Dec 6 03:10:07 localhost python3[42675]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 6 03:10:07 localhost python3[42691]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:10:08 localhost python3[42739]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:10:08 localhost python3[42782]: ansible-ansible.legacy.copy Invoked with 
dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008607.7485564-111415-212372155328484/source _original_basename=tmp4l3psiwy follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 6 03:10:08 localhost python3[42812]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:10 localhost python3[42915]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Dec 6 03:10:10 localhost python3[42934]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None 
attributes=None Dec 6 03:10:10 localhost python3[42950]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005548798 step=1 update_config_hash_only=False Dec 6 03:10:11 localhost python3[42966]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:11 localhost python3[42982]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Dec 6 03:10:12 localhost python3[42998]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 6 03:10:12 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:10:12 localhost python3[43025]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False Dec 6 03:10:13 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
Dec 6 03:10:13 localhost podman[43198]: 2025-12-06 08:10:13.29380705 +0000 UTC m=+0.070712184 container create 5b5dc0dff070052bedaf26b53e4ad4b12c4b3af7b9b4eefdd34b71602ef58b78 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_puppet_step1, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-18T23:44:13Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:10:13 localhost podman[43207]: 2025-12-06 08:10:13.315277777 +0000 UTC m=+0.081267041 container create 66a92579451e543a5a460abde3b8985456a4866ecb67c6fa0d3e98a43e88e7c9 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
container_name=container-puppet-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=) Dec 6 03:10:13 localhost systemd[1]: Created slice Virtual Machine and Container Slice. Dec 6 03:10:13 localhost systemd[1]: Started libpod-conmon-5b5dc0dff070052bedaf26b53e4ad4b12c4b3af7b9b4eefdd34b71602ef58b78.scope. 
Dec 6 03:10:13 localhost podman[43220]: 2025-12-06 08:10:13.33880711 +0000 UTC m=+0.086870191 container create f52d8f26cb94915ae2691dd184405f65c9d4115fc6ec111361cc8d910f7fe2b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_puppet_step1, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': 
"include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-nova_libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12) Dec 6 03:10:13 localhost systemd[1]: Started libpod-conmon-66a92579451e543a5a460abde3b8985456a4866ecb67c6fa0d3e98a43e88e7c9.scope. Dec 6 03:10:13 localhost podman[43198]: 2025-12-06 08:10:13.251424314 +0000 UTC m=+0.028329438 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Dec 6 03:10:13 localhost systemd[1]: Started libcrun container. Dec 6 03:10:13 localhost systemd[1]: Started libcrun container. 
Dec 6 03:10:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fbb5bd577b683508ca5a4bf3a3d7e7267a27d2ecf2e71900776c8f2f269256e/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dae190d40250f7df03793838b96be5e7fd6c282a4757e117b727e7855041c6b0/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fbb5bd577b683508ca5a4bf3a3d7e7267a27d2ecf2e71900776c8f2f269256e/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:13 localhost podman[43224]: 2025-12-06 08:10:13.364054297 +0000 UTC m=+0.110147365 container create 646e969fff8ba85a8249066976244d842392d9cd17bd1985b1a02ecb100e1d5e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, version=17.1.12, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, container_name=container-puppet-metrics_qdr, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true) Dec 6 03:10:13 localhost systemd[1]: Started libpod-conmon-f52d8f26cb94915ae2691dd184405f65c9d4115fc6ec111361cc8d910f7fe2b2.scope. 
Dec 6 03:10:13 localhost podman[43207]: 2025-12-06 08:10:13.283025255 +0000 UTC m=+0.049014549 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Dec 6 03:10:13 localhost podman[43207]: 2025-12-06 08:10:13.383305973 +0000 UTC m=+0.149295267 container init 66a92579451e543a5a460abde3b8985456a4866ecb67c6fa0d3e98a43e88e7c9 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, distribution-scope=public, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=container-puppet-collectd) Dec 6 03:10:13 localhost podman[43225]: 2025-12-06 08:10:13.283541272 +0000 UTC m=+0.026553051 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Dec 6 03:10:13 localhost podman[43220]: 2025-12-06 08:10:13.283172919 +0000 UTC m=+0.031236010 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:10:13 localhost podman[43198]: 2025-12-06 08:10:13.387506598 +0000 UTC m=+0.164411712 container init 5b5dc0dff070052bedaf26b53e4ad4b12c4b3af7b9b4eefdd34b71602ef58b78 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T23:44:13Z, 
name=rhosp17/openstack-iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=container-puppet-iscsid) Dec 6 03:10:13 localhost podman[43225]: 2025-12-06 08:10:13.389936536 +0000 UTC m=+0.132948285 container create bb9d5e3f6bfd2cf57b3bc87eff8a576fe638ef0278eceaf63bdd402c7f13421d (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, version=17.1.12, 
com.redhat.component=openstack-cron-container, container_name=container-puppet-crond, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:10:13 localhost podman[43224]: 2025-12-06 08:10:13.293049766 +0000 UTC m=+0.039142794 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Dec 6 03:10:13 localhost podman[43198]: 2025-12-06 08:10:13.39535421 +0000 UTC m=+0.172259344 container start 5b5dc0dff070052bedaf26b53e4ad4b12c4b3af7b9b4eefdd34b71602ef58b78 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_puppet_step1, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=container-puppet-iscsid, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, 
url=https://www.redhat.com) Dec 6 03:10:13 localhost podman[43198]: 2025-12-06 08:10:13.39724745 +0000 UTC m=+0.174152564 container attach 5b5dc0dff070052bedaf26b53e4ad4b12c4b3af7b9b4eefdd34b71602ef58b78 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, distribution-scope=public, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, container_name=container-puppet-iscsid, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4) Dec 6 03:10:13 localhost systemd[1]: Started libpod-conmon-646e969fff8ba85a8249066976244d842392d9cd17bd1985b1a02ecb100e1d5e.scope. Dec 6 03:10:13 localhost systemd[1]: Started libcrun container. Dec 6 03:10:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7603e0cea2115c7f1b9c23d551d73dbd4a38d1911aefa6f002a0812093908818/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:13 localhost systemd[1]: Started libcrun container. 
Dec 6 03:10:13 localhost podman[43220]: 2025-12-06 08:10:13.407944442 +0000 UTC m=+0.156007523 container init f52d8f26cb94915ae2691dd184405f65c9d4115fc6ec111361cc8d910f7fe2b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, build-date=2025-11-19T00:35:22Z, release=1761123044, url=https://www.redhat.com, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-nova_libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:10:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdaa822da5273b8b58f454c368ddec1badb6f06929e8c5917413151ec2935f51/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:13 localhost podman[43220]: 2025-12-06 08:10:13.416484785 +0000 UTC m=+0.164547876 container start f52d8f26cb94915ae2691dd184405f65c9d4115fc6ec111361cc8d910f7fe2b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, 
name=container-puppet-nova_libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=container-puppet-nova_libvirt, distribution-scope=public, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-nova-libvirt, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:10:13 localhost podman[43220]: 2025-12-06 08:10:13.416702962 +0000 UTC m=+0.164766053 container attach f52d8f26cb94915ae2691dd184405f65c9d4115fc6ec111361cc8d910f7fe2b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-nova_libvirt, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_puppet_step1, version=17.1.12, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 6 03:10:13 localhost podman[43207]: 2025-12-06 08:10:13.44257526 +0000 UTC m=+0.208564524 container start 66a92579451e543a5a460abde3b8985456a4866ecb67c6fa0d3e98a43e88e7c9 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, 
tcib_managed=true, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Dec 6 03:10:13 localhost podman[43207]: 2025-12-06 08:10:13.442999714 +0000 UTC m=+0.208989018 container attach 66a92579451e543a5a460abde3b8985456a4866ecb67c6fa0d3e98a43e88e7c9 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, version=17.1.12, container_name=container-puppet-collectd, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:10:13 localhost podman[43224]: 2025-12-06 08:10:13.476193425 +0000 UTC m=+0.222286443 container init 646e969fff8ba85a8249066976244d842392d9cd17bd1985b1a02ecb100e1d5e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=container-puppet-metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:10:13 localhost systemd[1]: Started libpod-conmon-bb9d5e3f6bfd2cf57b3bc87eff8a576fe638ef0278eceaf63bdd402c7f13421d.scope. Dec 6 03:10:13 localhost systemd[1]: tmp-crun.M7TS8s.mount: Deactivated successfully. Dec 6 03:10:13 localhost systemd[1]: Started libcrun container. 
Dec 6 03:10:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/21122d1ceef7fa397145a1e2df0de098a59ef7bc976e6dd7526001fbdedc477d/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:13 localhost podman[43224]: 2025-12-06 08:10:13.672928041 +0000 UTC m=+0.419021079 container start 646e969fff8ba85a8249066976244d842392d9cd17bd1985b1a02ecb100e1d5e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_puppet_step1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=container-puppet-metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include 
::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 6 03:10:13 localhost podman[43224]: 2025-12-06 08:10:13.673231051 +0000 UTC m=+0.419324139 container attach 646e969fff8ba85a8249066976244d842392d9cd17bd1985b1a02ecb100e1d5e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., container_name=container-puppet-metrics_qdr, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:10:13 localhost podman[43225]: 2025-12-06 08:10:13.702719064 +0000 UTC 
m=+0.445730833 container init bb9d5e3f6bfd2cf57b3bc87eff8a576fe638ef0278eceaf63bdd402c7f13421d (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true) Dec 6 03:10:13 localhost podman[43225]: 2025-12-06 08:10:13.709953486 +0000 UTC m=+0.452965265 container start bb9d5e3f6bfd2cf57b3bc87eff8a576fe638ef0278eceaf63bdd402c7f13421d (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20251118.1, container_name=container-puppet-crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc.) 
Dec 6 03:10:13 localhost podman[43225]: 2025-12-06 08:10:13.713108686 +0000 UTC m=+0.456120515 container attach bb9d5e3f6bfd2cf57b3bc87eff8a576fe638ef0278eceaf63bdd402c7f13421d (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, distribution-scope=public, name=rhosp17/openstack-cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, container_name=container-puppet-crond, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1) Dec 6 03:10:15 localhost ovs-vsctl[43438]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Dec 6 03:10:15 localhost podman[43108]: 2025-12-06 08:10:13.175606258 +0000 UTC m=+0.037432928 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Dec 6 03:10:15 localhost puppet-user[43321]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 6 03:10:15 localhost puppet-user[43321]: (file: /etc/puppet/hiera.yaml) Dec 6 03:10:15 localhost puppet-user[43321]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:10:15 localhost puppet-user[43321]: (file & line not available) Dec 6 03:10:15 localhost puppet-user[43319]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Dec 6 03:10:15 localhost puppet-user[43319]: (file: /etc/puppet/hiera.yaml) Dec 6 03:10:15 localhost puppet-user[43319]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:10:15 localhost puppet-user[43319]: (file & line not available) Dec 6 03:10:15 localhost puppet-user[43321]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:10:15 localhost puppet-user[43321]: (file & line not available) Dec 6 03:10:15 localhost puppet-user[43319]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:10:15 localhost puppet-user[43319]: (file & line not available) Dec 6 03:10:15 localhost puppet-user[43326]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 6 03:10:15 localhost puppet-user[43326]: (file: /etc/puppet/hiera.yaml) Dec 6 03:10:15 localhost puppet-user[43326]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:10:15 localhost puppet-user[43326]: (file & line not available) Dec 6 03:10:15 localhost puppet-user[43326]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:10:15 localhost puppet-user[43326]: (file & line not available) Dec 6 03:10:15 localhost puppet-user[43321]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.11 seconds Dec 6 03:10:15 localhost podman[43628]: 2025-12-06 08:10:15.318915387 +0000 UTC m=+0.081826769 container create b7ed8ec1275caffce048187356a5cc8a4583c3ab2c28265d13c8ca9402de3fdc (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, tcib_managed=true, io.openshift.expose-services=, container_name=container-puppet-ceilometer, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044, name=rhosp17/openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:59Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://www.redhat.com) Dec 6 03:10:15 localhost puppet-user[43321]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully Dec 6 03:10:15 localhost puppet-user[43321]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created Dec 6 03:10:15 localhost puppet-user[43321]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully Dec 6 03:10:15 localhost systemd[1]: Started libpod-conmon-b7ed8ec1275caffce048187356a5cc8a4583c3ab2c28265d13c8ca9402de3fdc.scope. Dec 6 03:10:15 localhost systemd[1]: Started libcrun container. 
Dec 6 03:10:15 localhost podman[43628]: 2025-12-06 08:10:15.271624843 +0000 UTC m=+0.034536225 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Dec 6 03:10:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bedaabc8d1e6b69e205ba04f54934e85f233d1071cfe4f6fa7419a243a5303a2/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:15 localhost podman[43628]: 2025-12-06 08:10:15.379523466 +0000 UTC m=+0.142434848 container init b7ed8ec1275caffce048187356a5cc8a4583c3ab2c28265d13c8ca9402de3fdc (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.openshift.expose-services=, container_name=container-puppet-ceilometer, build-date=2025-11-19T00:11:59Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include 
::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-central-container, config_id=tripleo_puppet_step1, name=rhosp17/openstack-ceilometer-central, maintainer=OpenStack TripleO Team, version=17.1.12) Dec 6 03:10:15 localhost podman[43628]: 2025-12-06 08:10:15.392568853 +0000 UTC m=+0.155480225 container start b7ed8ec1275caffce048187356a5cc8a4583c3ab2c28265d13c8ca9402de3fdc (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:11:59Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=container-puppet-ceilometer, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:10:15 localhost podman[43628]: 2025-12-06 08:10:15.392894904 +0000 UTC m=+0.155806306 container attach b7ed8ec1275caffce048187356a5cc8a4583c3ab2c28265d13c8ca9402de3fdc (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, vcs-type=git, container_name=container-puppet-ceilometer, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-central, io.openshift.expose-services=, config_id=tripleo_puppet_step1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:59Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, 
architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, maintainer=OpenStack TripleO Team, release=1761123044) Dec 6 03:10:15 localhost puppet-user[43326]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \ Dec 6 03:10:15 localhost puppet-user[43326]: in a future release. 
Use nova::cinder::os_region_name instead Dec 6 03:10:15 localhost puppet-user[43326]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \ Dec 6 03:10:15 localhost puppet-user[43326]: in a future release. Use nova::cinder::catalog_info instead Dec 6 03:10:15 localhost puppet-user[43319]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.31 seconds Dec 6 03:10:15 localhost puppet-user[43344]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 6 03:10:15 localhost puppet-user[43344]: (file: /etc/puppet/hiera.yaml) Dec 6 03:10:15 localhost puppet-user[43344]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:10:15 localhost puppet-user[43344]: (file & line not available) Dec 6 03:10:15 localhost puppet-user[43358]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 6 03:10:15 localhost puppet-user[43358]: (file: /etc/puppet/hiera.yaml) Dec 6 03:10:15 localhost puppet-user[43358]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:10:15 localhost puppet-user[43358]: (file & line not available) Dec 6 03:10:15 localhost puppet-user[43344]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:10:15 localhost puppet-user[43344]: (file & line not available) Dec 6 03:10:15 localhost puppet-user[43358]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:10:15 localhost puppet-user[43358]: (file & line not available) Dec 6 03:10:15 localhost puppet-user[43326]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. 
(file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41) Dec 6 03:10:15 localhost puppet-user[43358]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.07 seconds Dec 6 03:10:15 localhost puppet-user[43344]: Notice: Accepting previously invalid value for target type 'Integer' Dec 6 03:10:15 localhost puppet-user[43326]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5) Dec 6 03:10:15 localhost puppet-user[43326]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5) Dec 6 03:10:15 localhost puppet-user[43326]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5) Dec 6 03:10:15 localhost puppet-user[43344]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.13 seconds Dec 6 03:10:15 localhost puppet-user[43344]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root' Dec 6 03:10:15 localhost puppet-user[43344]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root' Dec 6 03:10:15 localhost puppet-user[43344]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755' Dec 6 03:10:15 localhost puppet-user[43326]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated Dec 6 03:10:15 localhost puppet-user[43358]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0' Dec 6 03:10:15 localhost 
puppet-user[43344]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1' Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root' Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root' Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640' Dec 6 03:10:15 localhost puppet-user[43344]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}936525d2183f84e0244921a902eb4786dae0d29061eb1701ac7d9a153787c3b8' Dec 6 03:10:15 localhost puppet-user[43344]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created Dec 6 03:10:15 localhost puppet-user[43344]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created Dec 6 03:10:15 localhost puppet-user[43326]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set. Dec 6 03:10:15 localhost puppet-user[43326]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. 
Use the same parameter in nova::glance Dec 6 03:10:15 localhost puppet-user[43344]: Notice: Applied catalog in 0.03 seconds Dec 6 03:10:15 localhost puppet-user[43344]: Application: Dec 6 03:10:15 localhost puppet-user[43344]: Initial environment: production Dec 6 03:10:15 localhost puppet-user[43344]: Converged environment: production Dec 6 03:10:15 localhost puppet-user[43344]: Run mode: user Dec 6 03:10:15 localhost puppet-user[43344]: Changes: Dec 6 03:10:15 localhost puppet-user[43344]: Total: 7 Dec 6 03:10:15 localhost puppet-user[43344]: Events: Dec 6 03:10:15 localhost puppet-user[43344]: Success: 7 Dec 6 03:10:15 localhost puppet-user[43344]: Total: 7 Dec 6 03:10:15 localhost puppet-user[43344]: Resources: Dec 6 03:10:15 localhost puppet-user[43344]: Skipped: 13 Dec 6 03:10:15 localhost puppet-user[43344]: Changed: 5 Dec 6 03:10:15 localhost puppet-user[43344]: Out of sync: 5 Dec 6 03:10:15 localhost puppet-user[43344]: Total: 20 Dec 6 03:10:15 localhost puppet-user[43344]: Time: Dec 6 03:10:15 localhost puppet-user[43344]: File: 0.01 Dec 6 03:10:15 localhost puppet-user[43344]: Transaction evaluation: 0.03 Dec 6 03:10:15 localhost puppet-user[43344]: Catalog application: 0.03 Dec 6 03:10:15 localhost puppet-user[43344]: Config retrieval: 0.17 Dec 6 03:10:15 localhost puppet-user[43344]: Last run: 1765008615 Dec 6 03:10:15 localhost puppet-user[43344]: Total: 0.03 Dec 6 03:10:15 localhost puppet-user[43344]: Version: Dec 6 03:10:15 localhost puppet-user[43344]: Config: 1765008615 Dec 6 03:10:15 localhost puppet-user[43344]: Puppet: 7.10.0 Dec 6 03:10:15 localhost puppet-user[43358]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created Dec 6 03:10:15 localhost puppet-user[43358]: Notice: Applied catalog in 0.10 seconds Dec 6 03:10:15 localhost puppet-user[43358]: Application: Dec 6 03:10:15 localhost puppet-user[43358]: Initial environment: production Dec 6 03:10:15 localhost puppet-user[43358]: Converged 
environment: production Dec 6 03:10:15 localhost puppet-user[43358]: Run mode: user Dec 6 03:10:15 localhost puppet-user[43358]: Changes: Dec 6 03:10:15 localhost puppet-user[43358]: Total: 2 Dec 6 03:10:15 localhost puppet-user[43358]: Events: Dec 6 03:10:15 localhost puppet-user[43358]: Success: 2 Dec 6 03:10:15 localhost puppet-user[43358]: Total: 2 Dec 6 03:10:15 localhost puppet-user[43358]: Resources: Dec 6 03:10:15 localhost puppet-user[43358]: Changed: 2 Dec 6 03:10:15 localhost puppet-user[43358]: Out of sync: 2 Dec 6 03:10:15 localhost puppet-user[43358]: Skipped: 7 Dec 6 03:10:15 localhost puppet-user[43358]: Total: 9 Dec 6 03:10:15 localhost puppet-user[43358]: Time: Dec 6 03:10:15 localhost puppet-user[43358]: Cron: 0.01 Dec 6 03:10:15 localhost puppet-user[43358]: File: 0.05 Dec 6 03:10:15 localhost puppet-user[43358]: Transaction evaluation: 0.09 Dec 6 03:10:15 localhost puppet-user[43358]: Config retrieval: 0.10 Dec 6 03:10:15 localhost puppet-user[43358]: Catalog application: 0.10 Dec 6 03:10:15 localhost puppet-user[43358]: Last run: 1765008615 Dec 6 03:10:15 localhost puppet-user[43358]: Total: 0.10 Dec 6 03:10:15 localhost puppet-user[43358]: Version: Dec 6 03:10:15 localhost puppet-user[43358]: Config: 1765008615 Dec 6 03:10:15 localhost puppet-user[43358]: Puppet: 7.10.0 Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root' Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root' Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750' Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: 
/Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed Dec 6 03:10:15 localhost puppet-user[43319]: Notice: 
/Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}dee3f10cb1ff461ac3f1e743a5ef3f06993398c6c829895de1dae7f242a64b39'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Dec 6 03:10:15 localhost puppet-user[43321]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Dec 6 03:10:15 localhost puppet-user[43321]: Notice: Applied catalog in 0.49 seconds
Dec 6 03:10:15 localhost puppet-user[43321]: Application:
Dec 6 03:10:15 localhost puppet-user[43321]: Initial environment: production
Dec 6 03:10:15 localhost puppet-user[43321]: Converged environment: production
Dec 6 03:10:15 localhost puppet-user[43321]: Run mode: user
Dec 6 03:10:15 localhost puppet-user[43321]: Changes:
Dec 6 03:10:15 localhost puppet-user[43321]: Total: 4
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Dec 6 03:10:15 localhost puppet-user[43321]: Events:
Dec 6 03:10:15 localhost puppet-user[43321]: Success: 4
Dec 6 03:10:15 localhost puppet-user[43321]: Total: 4
Dec 6 03:10:15 localhost puppet-user[43321]: Resources:
Dec 6 03:10:15 localhost puppet-user[43321]: Changed: 4
Dec 6 03:10:15 localhost puppet-user[43321]: Out of sync: 4
Dec 6 03:10:15 localhost puppet-user[43321]: Skipped: 8
Dec 6 03:10:15 localhost puppet-user[43321]: Total: 13
Dec 6 03:10:15 localhost puppet-user[43321]: Time:
Dec 6 03:10:15 localhost puppet-user[43321]: File: 0.00
Dec 6 03:10:15 localhost puppet-user[43321]: Exec: 0.05
Dec 6 03:10:15 localhost puppet-user[43321]: Config retrieval: 0.14
Dec 6 03:10:15 localhost puppet-user[43321]: Augeas: 0.43
Dec 6 03:10:15 localhost puppet-user[43321]: Transaction evaluation: 0.49
Dec 6 03:10:15 localhost puppet-user[43321]: Catalog application: 0.49
Dec 6 03:10:15 localhost puppet-user[43321]: Last run: 1765008615
Dec 6 03:10:15 localhost puppet-user[43321]: Total: 0.49
Dec 6 03:10:15 localhost puppet-user[43321]: Version:
Dec 6 03:10:15 localhost puppet-user[43321]: Config: 1765008615
Dec 6 03:10:15 localhost puppet-user[43321]: Puppet: 7.10.0
Dec 6 03:10:15 localhost puppet-user[43326]: Warning: Unknown variable: '::nova::compute::libvirt::manage_libvirt_services'. (file: /etc/puppet/modules/nova/manifests/migration/libvirt.pp, line: 314, column: 33)
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Dec 6 03:10:15 localhost puppet-user[43319]: Notice: Applied catalog in 0.24 seconds
Dec 6 03:10:15 localhost puppet-user[43319]: Application:
Dec 6 03:10:15 localhost puppet-user[43319]: Initial environment: production
Dec 6 03:10:15 localhost puppet-user[43319]: Converged environment: production
Dec 6 03:10:15 localhost puppet-user[43319]: Run mode: user
Dec 6 03:10:15 localhost puppet-user[43319]: Changes:
Dec 6 03:10:15 localhost puppet-user[43319]: Total: 42
Dec 6 03:10:15 localhost puppet-user[43319]: Events:
Dec 6 03:10:15 localhost puppet-user[43319]: Success: 42
Dec 6 03:10:15 localhost puppet-user[43319]: Total: 42
Dec 6 03:10:15 localhost puppet-user[43319]: Resources:
Dec 6 03:10:15 localhost puppet-user[43319]: Skipped: 13
Dec 6 03:10:15 localhost puppet-user[43319]: Changed: 37
Dec 6 03:10:15 localhost puppet-user[43319]: Out of sync: 37
Dec 6 03:10:15 localhost puppet-user[43319]: Total: 78
Dec 6 03:10:15 localhost puppet-user[43319]: Time:
Dec 6 03:10:15 localhost puppet-user[43319]: Concat fragment: 0.00
Dec 6 03:10:15 localhost puppet-user[43319]: File: 0.13
Dec 6 03:10:15 localhost puppet-user[43319]: Transaction evaluation: 0.23
Dec 6 03:10:15 localhost puppet-user[43319]: Catalog application: 0.24
Dec 6 03:10:15 localhost puppet-user[43319]: Config retrieval: 0.42
Dec 6 03:10:15 localhost puppet-user[43319]: Last run: 1765008615
Dec 6 03:10:15 localhost puppet-user[43319]: Concat file: 0.00
Dec 6 03:10:15 localhost puppet-user[43319]: Total: 0.24
Dec 6 03:10:15 localhost puppet-user[43319]: Version:
Dec 6 03:10:15 localhost puppet-user[43319]: Config: 1765008615
Dec 6 03:10:15 localhost puppet-user[43319]: Puppet: 7.10.0
Dec 6 03:10:16 localhost systemd[1]: libpod-646e969fff8ba85a8249066976244d842392d9cd17bd1985b1a02ecb100e1d5e.scope: Deactivated successfully.
Dec 6 03:10:16 localhost systemd[1]: libpod-646e969fff8ba85a8249066976244d842392d9cd17bd1985b1a02ecb100e1d5e.scope: Consumed 2.184s CPU time.
Dec 6 03:10:16 localhost podman[43224]: 2025-12-06 08:10:16.025400272 +0000 UTC m=+2.771493290 container died 646e969fff8ba85a8249066976244d842392d9cd17bd1985b1a02ecb100e1d5e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, container_name=container-puppet-metrics_qdr, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4)
Dec 6 03:10:16 localhost systemd[1]: libpod-bb9d5e3f6bfd2cf57b3bc87eff8a576fe638ef0278eceaf63bdd402c7f13421d.scope: Deactivated successfully.
Dec 6 03:10:16 localhost systemd[1]: libpod-bb9d5e3f6bfd2cf57b3bc87eff8a576fe638ef0278eceaf63bdd402c7f13421d.scope: Consumed 2.122s CPU time.
Dec 6 03:10:16 localhost systemd[1]: libpod-5b5dc0dff070052bedaf26b53e4ad4b12c4b3af7b9b4eefdd34b71602ef58b78.scope: Deactivated successfully.
Dec 6 03:10:16 localhost systemd[1]: libpod-5b5dc0dff070052bedaf26b53e4ad4b12c4b3af7b9b4eefdd34b71602ef58b78.scope: Consumed 2.513s CPU time.
Dec 6 03:10:16 localhost podman[43198]: 2025-12-06 08:10:16.123021055 +0000 UTC m=+2.899926169 container died 5b5dc0dff070052bedaf26b53e4ad4b12c4b3af7b9b4eefdd34b71602ef58b78 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, container_name=container-puppet-iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 6 03:10:16 localhost podman[43870]: 2025-12-06 08:10:16.14407986 +0000 UTC m=+0.106938873 container cleanup 646e969fff8ba85a8249066976244d842392d9cd17bd1985b1a02ecb100e1d5e (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-metrics_qdr, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Dec 6 03:10:16 localhost systemd[1]: libpod-conmon-646e969fff8ba85a8249066976244d842392d9cd17bd1985b1a02ecb100e1d5e.scope: Deactivated successfully.
Dec 6 03:10:16 localhost python3[43025]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548798 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::qdr#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 6 03:10:16 localhost podman[43225]: 2025-12-06 08:10:16.180647 +0000 UTC m=+2.923658839 container died bb9d5e3f6bfd2cf57b3bc87eff8a576fe638ef0278eceaf63bdd402c7f13421d (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, architecture=x86_64, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron)
Dec 6 03:10:16 localhost podman[43874]: 2025-12-06 08:10:16.257594241 +0000 UTC m=+0.215001880 container cleanup bb9d5e3f6bfd2cf57b3bc87eff8a576fe638ef0278eceaf63bdd402c7f13421d (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-crond, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 6 03:10:16 localhost systemd[1]: libpod-conmon-bb9d5e3f6bfd2cf57b3bc87eff8a576fe638ef0278eceaf63bdd402c7f13421d.scope: Deactivated successfully.
Dec 6 03:10:16 localhost podman[43910]: 2025-12-06 08:10:16.263418597 +0000 UTC m=+0.132371156 container cleanup 5b5dc0dff070052bedaf26b53e4ad4b12c4b3af7b9b4eefdd34b71602ef58b78 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, container_name=container-puppet-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 6 03:10:16 localhost systemd[1]: libpod-conmon-5b5dc0dff070052bedaf26b53e4ad4b12c4b3af7b9b4eefdd34b71602ef58b78.scope: Deactivated successfully.
Dec 6 03:10:16 localhost python3[43025]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548798 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::iscsid#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 6 03:10:16 localhost systemd[1]: libpod-66a92579451e543a5a460abde3b8985456a4866ecb67c6fa0d3e98a43e88e7c9.scope: Deactivated successfully.
Dec 6 03:10:16 localhost systemd[1]: libpod-66a92579451e543a5a460abde3b8985456a4866ecb67c6fa0d3e98a43e88e7c9.scope: Consumed 2.635s CPU time.
Dec 6 03:10:16 localhost podman[43207]: 2025-12-06 08:10:16.306551868 +0000 UTC m=+3.072541142 container died 66a92579451e543a5a460abde3b8985456a4866ecb67c6fa0d3e98a43e88e7c9 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, url=https://www.redhat.com, name=rhosp17/openstack-collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 6 03:10:16 localhost systemd[1]: tmp-crun.SlOFe0.mount: Deactivated successfully.
Dec 6 03:10:16 localhost systemd[1]: var-lib-containers-storage-overlay-21122d1ceef7fa397145a1e2df0de098a59ef7bc976e6dd7526001fbdedc477d-merged.mount: Deactivated successfully.
Dec 6 03:10:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb9d5e3f6bfd2cf57b3bc87eff8a576fe638ef0278eceaf63bdd402c7f13421d-userdata-shm.mount: Deactivated successfully.
Dec 6 03:10:16 localhost systemd[1]: var-lib-containers-storage-overlay-bdaa822da5273b8b58f454c368ddec1badb6f06929e8c5917413151ec2935f51-merged.mount: Deactivated successfully.
Dec 6 03:10:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-646e969fff8ba85a8249066976244d842392d9cd17bd1985b1a02ecb100e1d5e-userdata-shm.mount: Deactivated successfully.
Dec 6 03:10:16 localhost systemd[1]: var-lib-containers-storage-overlay-4fbb5bd577b683508ca5a4bf3a3d7e7267a27d2ecf2e71900776c8f2f269256e-merged.mount: Deactivated successfully.
Dec 6 03:10:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b5dc0dff070052bedaf26b53e4ad4b12c4b3af7b9b4eefdd34b71602ef58b78-userdata-shm.mount: Deactivated successfully.
Dec 6 03:10:16 localhost puppet-user[43326]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 1.15 seconds
Dec 6 03:10:16 localhost python3[43025]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548798 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 6 03:10:16 localhost systemd[1]: var-lib-containers-storage-overlay-dae190d40250f7df03793838b96be5e7fd6c282a4757e117b727e7855041c6b0-merged.mount: Deactivated successfully.
Dec 6 03:10:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-66a92579451e543a5a460abde3b8985456a4866ecb67c6fa0d3e98a43e88e7c9-userdata-shm.mount: Deactivated successfully.
Dec 6 03:10:16 localhost podman[43982]: 2025-12-06 08:10:16.437164807 +0000 UTC m=+0.121866780 container cleanup 66a92579451e543a5a460abde3b8985456a4866ecb67c6fa0d3e98a43e88e7c9 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=container-puppet-collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_puppet_step1, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-collectd) Dec 6 03:10:16 localhost systemd[1]: libpod-conmon-66a92579451e543a5a460abde3b8985456a4866ecb67c6fa0d3e98a43e88e7c9.scope: Deactivated successfully. 
Dec 6 03:10:16 localhost python3[43025]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548798 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Dec 6 03:10:16 localhost podman[44073]: 2025-12-06 08:10:16.618426617 +0000 UTC m=+0.125137395 container create 59a49fa217b8d99bad5e8fb2e4892669752d33330b93780fc26f2ae149e152f9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-rsyslog, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git) Dec 6 03:10:16 localhost systemd[1]: Started 
libpod-conmon-59a49fa217b8d99bad5e8fb2e4892669752d33330b93780fc26f2ae149e152f9.scope. Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}5583ffaf2092b6e93a366c0e0a5ae8bf085fd0323453fa6c1700001c8f312259' Dec 6 03:10:16 localhost systemd[1]: Started libcrun container. Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created Dec 6 03:10:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4981cf17a16268f72c4597c6d6e81a28a8813fbea2307a4606ef86987db2b2e/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe' Dec 6 03:10:16 localhost podman[44073]: 2025-12-06 08:10:16.676393012 +0000 UTC m=+0.183103800 container init 59a49fa217b8d99bad5e8fb2e4892669752d33330b93780fc26f2ae149e152f9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_puppet_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T22:49:49Z, 
io.buildah.version=1.41.4, version=17.1.12) Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully Dec 6 03:10:16 localhost podman[44073]: 2025-12-06 08:10:16.586739063 +0000 UTC m=+0.093449851 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Dec 6 03:10:16 localhost podman[44073]: 2025-12-06 08:10:16.688117197 +0000 UTC m=+0.194827995 container start 59a49fa217b8d99bad5e8fb2e4892669752d33330b93780fc26f2ae149e152f9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, container_name=container-puppet-rsyslog, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:10:16 localhost podman[44073]: 2025-12-06 08:10:16.688331314 +0000 UTC m=+0.195042092 container attach 59a49fa217b8d99bad5e8fb2e4892669752d33330b93780fc26f2ae149e152f9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, container_name=container-puppet-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, version=17.1.12, name=rhosp17/openstack-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=) Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/libvirt/auth.conf]/ensure: defined content as '{sha256}ce52d34be42524114fb098afc3c38a37190a8fffd86438fec0a9fa655bb094aa' Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}5a245fb66f34390b336618c227bbf82a47dc97aea1db4d1fea791ef5e596d271' Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created Dec 6 03:10:16 localhost podman[44129]: 2025-12-06 08:10:16.738897482 +0000 UTC m=+0.119937269 container create 2ca6b3fc29403d350b36387e6cdb3a964158a9c59e6490b7bc7ebb1263f07f76 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, 
release=1761123044) Dec 6 03:10:16 localhost podman[44129]: 2025-12-06 08:10:16.652798616 +0000 UTC m=+0.033838393 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created Dec 6 03:10:16 localhost systemd[1]: Started libpod-conmon-2ca6b3fc29403d350b36387e6cdb3a964158a9c59e6490b7bc7ebb1263f07f76.scope. Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created Dec 6 03:10:16 localhost systemd[1]: Started libcrun container. Dec 6 03:10:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c1bfe83dd7f6989dbf63d618213f5724e0c166ed4a1029a29bf4ae0f3edd705/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c1bfe83dd7f6989dbf63d618213f5724e0c166ed4a1029a29bf4ae0f3edd705/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created Dec 6 03:10:16 localhost podman[44129]: 2025-12-06 08:10:16.83794541 +0000 UTC m=+0.218985197 container init 2ca6b3fc29403d350b36387e6cdb3a964158a9c59e6490b7bc7ebb1263f07f76 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=container-puppet-ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1) Dec 6 03:10:16 localhost podman[44129]: 2025-12-06 08:10:16.844350126 +0000 UTC m=+0.225389933 container start 2ca6b3fc29403d350b36387e6cdb3a964158a9c59e6490b7bc7ebb1263f07f76 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, container_name=container-puppet-ovn_controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 6 03:10:16 localhost podman[44129]: 2025-12-06 08:10:16.844562593 +0000 UTC m=+0.225602410 container attach 2ca6b3fc29403d350b36387e6cdb3a964158a9c59e6490b7bc7ebb1263f07f76 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, 
io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=container-puppet-ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, 
io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_puppet_step1, name=rhosp17/openstack-ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created Dec 6 03:10:16 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created 
Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created Dec 6 03:10:17 localhost 
puppet-user[43326]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created Dec 6 03:10:17 localhost puppet-user[43775]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Dec 6 03:10:17 localhost puppet-user[43775]: (file: /etc/puppet/hiera.yaml) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:10:17 localhost puppet-user[43775]: (file & line not available) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:10:17 localhost puppet-user[43775]: (file & line not available) Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. 
(file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. 
(file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36) Dec 6 03:10:17 localhost puppet-user[43775]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26) Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created Dec 6 03:10:17 localhost puppet-user[43775]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.38 seconds Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: 
/Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created Dec 6 03:10:17 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created Dec 6 03:10:17 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created Dec 6 03:10:17 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created Dec 6 03:10:17 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created Dec 6 03:10:17 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created Dec 6 03:10:17 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created Dec 6 03:10:17 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created Dec 6 03:10:17 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: 
/Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: 
/Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: 
/Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: 
/Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_with_native_tls]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created Dec 
6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created Dec 6 03:10:18 localhost puppet-user[43775]: Notice: Applied catalog in 0.43 seconds Dec 6 03:10:18 localhost puppet-user[43775]: Application: Dec 6 03:10:18 localhost puppet-user[43775]: Initial environment: production Dec 6 03:10:18 localhost puppet-user[43775]: Converged environment: production Dec 6 03:10:18 localhost puppet-user[43775]: Run mode: user Dec 6 03:10:18 localhost puppet-user[43775]: Changes: Dec 6 03:10:18 localhost puppet-user[43775]: Total: 31 Dec 6 03:10:18 localhost puppet-user[43775]: Events: Dec 6 03:10:18 localhost puppet-user[43775]: Success: 31 Dec 6 03:10:18 localhost puppet-user[43775]: Total: 31 Dec 6 03:10:18 localhost puppet-user[43775]: Resources: Dec 6 03:10:18 localhost puppet-user[43775]: Skipped: 22 Dec 6 03:10:18 localhost puppet-user[43775]: Changed: 31 Dec 6 03:10:18 localhost puppet-user[43775]: Out of sync: 31 Dec 6 03:10:18 localhost puppet-user[43775]: Total: 150 Dec 6 03:10:18 localhost puppet-user[43775]: Time: Dec 6 03:10:18 localhost puppet-user[43775]: Ceilometer config: 0.37 Dec 6 03:10:18 localhost puppet-user[43775]: Transaction evaluation: 0.42 Dec 6 03:10:18 localhost puppet-user[43775]: Catalog application: 0.43 Dec 6 03:10:18 localhost puppet-user[43775]: Config retrieval: 0.46 Dec 6 03:10:18 localhost puppet-user[43775]: Last run: 1765008618 Dec 6 03:10:18 localhost puppet-user[43775]: Resources: 0.00 Dec 6 03:10:18 localhost puppet-user[43775]: Total: 0.43 Dec 6 03:10:18 localhost puppet-user[43775]: Version: Dec 6 03:10:18 localhost puppet-user[43775]: Config: 1765008617 Dec 6 03:10:18 localhost puppet-user[43775]: Puppet: 7.10.0 Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: 
/Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[auth_tls]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_addr]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Migration::Libvirt/File[/etc/sysconfig/libvirtd]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Migration::Libvirt/File_line[/etc/sysconfig/libvirtd libvirtd args]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: 
/Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created Dec 6 03:10:18 localhost puppet-user[44186]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 6 03:10:18 localhost puppet-user[44186]: (file: /etc/puppet/hiera.yaml) Dec 6 03:10:18 localhost puppet-user[44186]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:10:18 localhost puppet-user[44186]: (file & line not available) Dec 6 03:10:18 localhost puppet-user[44186]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:10:18 localhost puppet-user[44186]: (file & line not available) Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: 
/Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created Dec 6 03:10:18 localhost puppet-user[44164]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 6 03:10:18 localhost puppet-user[44164]: (file: /etc/puppet/hiera.yaml) Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created Dec 6 03:10:18 localhost puppet-user[44164]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:10:18 localhost puppet-user[44164]: (file & line not available) Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created Dec 6 
03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created Dec 6 03:10:18 localhost systemd[1]: libpod-b7ed8ec1275caffce048187356a5cc8a4583c3ab2c28265d13c8ca9402de3fdc.scope: Deactivated successfully. Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created Dec 6 03:10:18 localhost systemd[1]: libpod-b7ed8ec1275caffce048187356a5cc8a4583c3ab2c28265d13c8ca9402de3fdc.scope: Consumed 2.972s CPU time. Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: 
/Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created Dec 6 03:10:18 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created Dec 6 03:10:18 localhost puppet-user[44164]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:10:18 localhost puppet-user[44164]: (file & line not available) Dec 6 03:10:18 localhost podman[44430]: 2025-12-06 08:10:18.826817378 +0000 UTC m=+0.035652401 container died b7ed8ec1275caffce048187356a5cc8a4583c3ab2c28265d13c8ca9402de3fdc (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-ceilometer, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-central, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_puppet_step1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:11:59Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat 
OpenStack Platform 17.1 ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible) Dec 6 03:10:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7ed8ec1275caffce048187356a5cc8a4583c3ab2c28265d13c8ca9402de3fdc-userdata-shm.mount: Deactivated 
successfully. Dec 6 03:10:18 localhost systemd[1]: var-lib-containers-storage-overlay-bedaabc8d1e6b69e205ba04f54934e85f233d1071cfe4f6fa7419a243a5303a2-merged.mount: Deactivated successfully. Dec 6 03:10:18 localhost podman[44430]: 2025-12-06 08:10:18.871306812 +0000 UTC m=+0.080141795 container cleanup b7ed8ec1275caffce048187356a5cc8a4583c3ab2c28265d13c8ca9402de3fdc (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-central-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-central, release=1761123044, build-date=2025-11-19T00:11:59Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude 
tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_puppet_step1, container_name=container-puppet-ceilometer) Dec 6 03:10:18 localhost systemd[1]: libpod-conmon-b7ed8ec1275caffce048187356a5cc8a4583c3ab2c28265d13c8ca9402de3fdc.scope: Deactivated successfully. 
Dec 6 03:10:18 localhost python3[43025]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548798 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::ceilometer::agent::polling#012include tripleo::profile::base::ceilometer::agent::polling#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Dec 6 03:10:18 localhost puppet-user[44186]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.29 seconds Dec 6 03:10:18 localhost puppet-user[44164]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.25 seconds Dec 6 03:10:18 localhost ovs-vsctl[44490]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-remote=ssl:172.17.0.103:6642,ssl:172.17.0.104:6642,ssl:172.17.0.105:6642 Dec 6 03:10:18 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created Dec 6 03:10:18 localhost ovs-vsctl[44496]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve Dec 6 03:10:18 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created Dec 6 03:10:19 localhost ovs-vsctl[44498]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.106 Dec 6 03:10:19 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created Dec 6 03:10:19 localhost ovs-vsctl[44501]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005548798.ooo.test Dec 6 03:10:19 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005548798.novalocal' to 'np0005548798.ooo.test' Dec 6 03:10:19 localhost puppet-user[44164]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2' Dec 6 03:10:19 localhost puppet-user[44164]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b' Dec 6 03:10:19 localhost puppet-user[44164]: Notice: 
/Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}4b689b4d914ce7c8055af1f29ecd92a9c46d9a7dacd5bf37efce269f2fc54bc6' Dec 6 03:10:19 localhost ovs-vsctl[44503]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int Dec 6 03:10:19 localhost puppet-user[44164]: Notice: Applied catalog in 0.10 seconds Dec 6 03:10:19 localhost puppet-user[44164]: Application: Dec 6 03:10:19 localhost puppet-user[44164]: Initial environment: production Dec 6 03:10:19 localhost puppet-user[44164]: Converged environment: production Dec 6 03:10:19 localhost puppet-user[44164]: Run mode: user Dec 6 03:10:19 localhost puppet-user[44164]: Changes: Dec 6 03:10:19 localhost puppet-user[44164]: Total: 3 Dec 6 03:10:19 localhost puppet-user[44164]: Events: Dec 6 03:10:19 localhost puppet-user[44164]: Success: 3 Dec 6 03:10:19 localhost puppet-user[44164]: Total: 3 Dec 6 03:10:19 localhost puppet-user[44164]: Resources: Dec 6 03:10:19 localhost puppet-user[44164]: Skipped: 11 Dec 6 03:10:19 localhost puppet-user[44164]: Changed: 3 Dec 6 03:10:19 localhost puppet-user[44164]: Out of sync: 3 Dec 6 03:10:19 localhost puppet-user[44164]: Total: 25 Dec 6 03:10:19 localhost puppet-user[44164]: Time: Dec 6 03:10:19 localhost puppet-user[44164]: Concat file: 0.00 Dec 6 03:10:19 localhost puppet-user[44164]: Concat fragment: 0.00 Dec 6 03:10:19 localhost puppet-user[44164]: File: 0.01 Dec 6 03:10:19 localhost puppet-user[44164]: Transaction evaluation: 0.09 Dec 6 03:10:19 localhost puppet-user[44164]: Catalog application: 0.10 Dec 6 03:10:19 localhost puppet-user[44164]: Config retrieval: 0.29 Dec 6 03:10:19 localhost puppet-user[44164]: Last run: 1765008619 Dec 6 03:10:19 localhost puppet-user[44164]: Total: 0.10 Dec 6 03:10:19 
localhost puppet-user[44164]: Version: Dec 6 03:10:19 localhost puppet-user[44164]: Config: 1765008618 Dec 6 03:10:19 localhost puppet-user[44164]: Puppet: 7.10.0 Dec 6 03:10:19 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created Dec 6 03:10:19 localhost ovs-vsctl[44505]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000 Dec 6 03:10:19 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created Dec 6 03:10:19 localhost ovs-vsctl[44507]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60 Dec 6 03:10:19 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created Dec 6 03:10:19 localhost ovs-vsctl[44515]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true Dec 6 03:10:19 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created Dec 6 03:10:19 localhost ovs-vsctl[44517]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000 Dec 6 03:10:19 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created Dec 6 03:10:19 localhost ovs-vsctl[44522]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0 Dec 6 03:10:19 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created Dec 6 03:10:19 localhost ovs-vsctl[44524]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:04:8e:57 Dec 6 03:10:19 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created Dec 6 03:10:19 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully Dec 6 03:10:19 localhost ovs-vsctl[44526]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex Dec 6 03:10:19 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created Dec 6 03:10:19 localhost ovs-vsctl[44538]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false Dec 6 03:10:19 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created Dec 6 03:10:19 localhost ovs-vsctl[44544]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:garp-max-timeout-sec=0 Dec 6 03:10:19 localhost puppet-user[44186]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created Dec 6 03:10:19 localhost puppet-user[44186]: Notice: Applied catalog in 0.45 seconds Dec 6 03:10:19 localhost puppet-user[44186]: Application: Dec 6 03:10:19 localhost puppet-user[44186]: Initial environment: production Dec 6 03:10:19 localhost puppet-user[44186]: Converged environment: production Dec 6 03:10:19 localhost puppet-user[44186]: Run mode: user Dec 6 03:10:19 localhost puppet-user[44186]: Changes: Dec 6 03:10:19 localhost puppet-user[44186]: Total: 14 Dec 6 03:10:19 localhost puppet-user[44186]: Events: Dec 6 03:10:19 localhost puppet-user[44186]: Success: 14 Dec 6 03:10:19 localhost puppet-user[44186]: Total: 14 Dec 6 03:10:19 localhost puppet-user[44186]: Resources: Dec 6 03:10:19 localhost puppet-user[44186]: Skipped: 12 Dec 6 03:10:19 localhost puppet-user[44186]: Changed: 14 Dec 6 03:10:19 localhost puppet-user[44186]: Out of sync: 14 Dec 6 03:10:19 localhost puppet-user[44186]: Total: 29 Dec 6 03:10:19 localhost puppet-user[44186]: Time: Dec 6 03:10:19 localhost puppet-user[44186]: Exec: 0.01 Dec 6 03:10:19 localhost puppet-user[44186]: Config retrieval: 0.33 Dec 6 03:10:19 localhost puppet-user[44186]: Vs config: 0.39 Dec 6 03:10:19 localhost puppet-user[44186]: Transaction evaluation: 0.45 Dec 6 03:10:19 localhost puppet-user[44186]: Catalog application: 0.45 Dec 6 03:10:19 localhost puppet-user[44186]: Last run: 1765008619 Dec 6 03:10:19 localhost puppet-user[44186]: Total: 0.46 Dec 6 03:10:19 localhost puppet-user[44186]: Version: Dec 6 03:10:19 localhost puppet-user[44186]: Config: 1765008618 Dec 6 03:10:19 localhost puppet-user[44186]: Puppet: 7.10.0 Dec 6 03:10:19 localhost systemd[1]: libpod-59a49fa217b8d99bad5e8fb2e4892669752d33330b93780fc26f2ae149e152f9.scope: Deactivated successfully. 
Dec 6 03:10:19 localhost systemd[1]: libpod-59a49fa217b8d99bad5e8fb2e4892669752d33330b93780fc26f2ae149e152f9.scope: Consumed 2.618s CPU time. Dec 6 03:10:19 localhost podman[44073]: 2025-12-06 08:10:19.408618205 +0000 UTC m=+2.915328973 container died 59a49fa217b8d99bad5e8fb2e4892669752d33330b93780fc26f2ae149e152f9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, 
architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_puppet_step1, version=17.1.12, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:10:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59a49fa217b8d99bad5e8fb2e4892669752d33330b93780fc26f2ae149e152f9-userdata-shm.mount: Deactivated successfully. 
Dec 6 03:10:19 localhost podman[44560]: 2025-12-06 08:10:19.535668209 +0000 UTC m=+0.119037089 container cleanup 59a49fa217b8d99bad5e8fb2e4892669752d33330b93780fc26f2ae149e152f9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=container-puppet-rsyslog, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Dec 6 03:10:19 localhost systemd[1]: libpod-conmon-59a49fa217b8d99bad5e8fb2e4892669752d33330b93780fc26f2ae149e152f9.scope: Deactivated successfully. Dec 6 03:10:19 localhost python3[43025]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548798 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume 
/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Dec 6 03:10:19 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully Dec 6 03:10:19 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created Dec 6 03:10:19 localhost systemd[1]: var-lib-containers-storage-overlay-a4981cf17a16268f72c4597c6d6e81a28a8813fbea2307a4606ef86987db2b2e-merged.mount: Deactivated successfully. Dec 6 03:10:19 localhost systemd[1]: libpod-2ca6b3fc29403d350b36387e6cdb3a964158a9c59e6490b7bc7ebb1263f07f76.scope: Deactivated successfully. Dec 6 03:10:19 localhost systemd[1]: libpod-2ca6b3fc29403d350b36387e6cdb3a964158a9c59e6490b7bc7ebb1263f07f76.scope: Consumed 2.812s CPU time. Dec 6 03:10:19 localhost podman[44129]: 2025-12-06 08:10:19.846701071 +0000 UTC m=+3.227740888 container died 2ca6b3fc29403d350b36387e6cdb3a964158a9c59e6490b7bc7ebb1263f07f76 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, container_name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 6 03:10:19 localhost puppet-user[43326]: Notice: 
/Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created Dec 6 03:10:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2ca6b3fc29403d350b36387e6cdb3a964158a9c59e6490b7bc7ebb1263f07f76-userdata-shm.mount: Deactivated successfully. Dec 6 03:10:20 localhost systemd[1]: var-lib-containers-storage-overlay-3c1bfe83dd7f6989dbf63d618213f5724e0c166ed4a1029a29bf4ae0f3edd705-merged.mount: Deactivated successfully. 
Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created Dec 6 03:10:20 localhost podman[44643]: 2025-12-06 08:10:20.874528229 +0000 UTC m=+1.017436796 container cleanup 2ca6b3fc29403d350b36387e6cdb3a964158a9c59e6490b7bc7ebb1263f07f76 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO 
Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, managed_by=tripleo_ansible, container_name=container-puppet-ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 03:10:20 localhost systemd[1]: 
libpod-conmon-2ca6b3fc29403d350b36387e6cdb3a964158a9c59e6490b7bc7ebb1263f07f76.scope: Deactivated successfully. Dec 6 03:10:20 localhost python3[43025]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548798 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::agents::ovn#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Dec 6 03:10:20 localhost podman[44263]: 2025-12-06 08:10:17.657119272 +0000 UTC m=+0.031312592 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: 
/Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created Dec 6 03:10:20 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created Dec 6 03:10:21 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created Dec 6 03:10:21 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created Dec 6 03:10:21 localhost puppet-user[43326]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created Dec 6 03:10:21 localhost puppet-user[43326]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}54b6514260e1079fa542a5fb3f63ddc81098e05c7f8891bd4d656a6c4b4724a7' Dec 6 03:10:21 localhost puppet-user[43326]: Notice: Applied catalog in 4.50 seconds Dec 6 03:10:21 localhost puppet-user[43326]: Application: Dec 6 03:10:21 localhost puppet-user[43326]: Initial environment: production Dec 6 03:10:21 localhost puppet-user[43326]: Converged environment: production Dec 6 03:10:21 localhost puppet-user[43326]: Run mode: user Dec 6 03:10:21 localhost puppet-user[43326]: Changes: Dec 6 
03:10:21 localhost puppet-user[43326]: Total: 178 Dec 6 03:10:21 localhost puppet-user[43326]: Events: Dec 6 03:10:21 localhost puppet-user[43326]: Success: 178 Dec 6 03:10:21 localhost puppet-user[43326]: Total: 178 Dec 6 03:10:21 localhost puppet-user[43326]: Resources: Dec 6 03:10:21 localhost puppet-user[43326]: Changed: 178 Dec 6 03:10:21 localhost puppet-user[43326]: Out of sync: 178 Dec 6 03:10:21 localhost puppet-user[43326]: Skipped: 52 Dec 6 03:10:21 localhost puppet-user[43326]: Total: 474 Dec 6 03:10:21 localhost puppet-user[43326]: Time: Dec 6 03:10:21 localhost puppet-user[43326]: Resources: 0.00 Dec 6 03:10:21 localhost puppet-user[43326]: Concat fragment: 0.00 Dec 6 03:10:21 localhost puppet-user[43326]: Anchor: 0.00 Dec 6 03:10:21 localhost puppet-user[43326]: File line: 0.00 Dec 6 03:10:21 localhost puppet-user[43326]: Virtlogd config: 0.00 Dec 6 03:10:21 localhost puppet-user[43326]: Virtqemud config: 0.01 Dec 6 03:10:21 localhost puppet-user[43326]: Virtsecretd config: 0.01 Dec 6 03:10:21 localhost puppet-user[43326]: Virtstoraged config: 0.02 Dec 6 03:10:21 localhost puppet-user[43326]: Virtnodedevd config: 0.02 Dec 6 03:10:21 localhost puppet-user[43326]: Exec: 0.02 Dec 6 03:10:21 localhost puppet-user[43326]: File: 0.02 Dec 6 03:10:21 localhost puppet-user[43326]: Virtproxyd config: 0.03 Dec 6 03:10:21 localhost puppet-user[43326]: Augeas: 1.01 Dec 6 03:10:21 localhost puppet-user[43326]: Config retrieval: 1.39 Dec 6 03:10:21 localhost puppet-user[43326]: Last run: 1765008621 Dec 6 03:10:21 localhost puppet-user[43326]: Nova config: 3.13 Dec 6 03:10:21 localhost puppet-user[43326]: Transaction evaluation: 4.49 Dec 6 03:10:21 localhost puppet-user[43326]: Catalog application: 4.50 Dec 6 03:10:21 localhost puppet-user[43326]: Concat file: 0.00 Dec 6 03:10:21 localhost puppet-user[43326]: Total: 4.51 Dec 6 03:10:21 localhost puppet-user[43326]: Version: Dec 6 03:10:21 localhost puppet-user[43326]: Config: 1765008615 Dec 6 03:10:21 localhost 
puppet-user[43326]: Puppet: 7.10.0 Dec 6 03:10:21 localhost podman[44707]: 2025-12-06 08:10:21.129682813 +0000 UTC m=+0.085727384 container create d952395c7d816074dbc33467463bba7054fbe968285e54fb91f0f6a8b04e1c34 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-server, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-server-container, build-date=2025-11-19T00:23:27Z, container_name=container-puppet-neutron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64) Dec 6 03:10:21 localhost systemd[1]: Started libpod-conmon-d952395c7d816074dbc33467463bba7054fbe968285e54fb91f0f6a8b04e1c34.scope. Dec 6 03:10:21 localhost systemd[1]: Started libcrun container. 
Dec 6 03:10:21 localhost podman[44707]: 2025-12-06 08:10:21.080563171 +0000 UTC m=+0.036607792 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Dec 6 03:10:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa2c3e8c20dbc398894a992c8d0207ecad87158b45491e83ea3c6f6e44b17b0b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:21 localhost podman[44707]: 2025-12-06 08:10:21.191824252 +0000 UTC m=+0.147868823 container init d952395c7d816074dbc33467463bba7054fbe968285e54fb91f0f6a8b04e1c34 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, managed_by=tripleo_ansible, build-date=2025-11-19T00:23:27Z, batch=17.1_20251118.1, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp17/openstack-neutron-server, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.component=openstack-neutron-server-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, container_name=container-puppet-neutron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-server, config_id=tripleo_puppet_step1, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 
'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.buildah.version=1.41.4) Dec 6 03:10:21 localhost podman[44707]: 2025-12-06 08:10:21.204103984 +0000 UTC m=+0.160148565 container start d952395c7d816074dbc33467463bba7054fbe968285e54fb91f0f6a8b04e1c34 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 
'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-19T00:23:27Z, container_name=container-puppet-neutron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 
neutron-server, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-neutron-server-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_id=tripleo_puppet_step1) Dec 6 03:10:21 localhost podman[44707]: 2025-12-06 08:10:21.204434495 +0000 UTC m=+0.160479066 container attach d952395c7d816074dbc33467463bba7054fbe968285e54fb91f0f6a8b04e1c34 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:23:27Z, container_name=container-puppet-neutron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-server, release=1761123044, com.redhat.component=openstack-neutron-server-container, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server) Dec 6 03:10:21 localhost systemd[1]: libpod-f52d8f26cb94915ae2691dd184405f65c9d4115fc6ec111361cc8d910f7fe2b2.scope: Deactivated successfully. Dec 6 03:10:21 localhost systemd[1]: libpod-f52d8f26cb94915ae2691dd184405f65c9d4115fc6ec111361cc8d910f7fe2b2.scope: Consumed 8.313s CPU time. 
Dec 6 03:10:22 localhost podman[44773]: 2025-12-06 08:10:22.020430894 +0000 UTC m=+0.036213010 container died f52d8f26cb94915ae2691dd184405f65c9d4115fc6ec111361cc8d910f7fe2b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=container-puppet-nova_libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude 
tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 6 03:10:22 localhost podman[44773]: 2025-12-06 08:10:22.130663581 +0000 UTC m=+0.146445687 container cleanup f52d8f26cb94915ae2691dd184405f65c9d4115fc6ec111361cc8d910f7fe2b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, 
container_name=container-puppet-nova_libvirt, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git) Dec 6 03:10:22 localhost systemd[1]: var-lib-containers-storage-overlay-7603e0cea2115c7f1b9c23d551d73dbd4a38d1911aefa6f002a0812093908818-merged.mount: Deactivated successfully. Dec 6 03:10:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f52d8f26cb94915ae2691dd184405f65c9d4115fc6ec111361cc8d910f7fe2b2-userdata-shm.mount: Deactivated successfully. Dec 6 03:10:22 localhost systemd[1]: libpod-conmon-f52d8f26cb94915ae2691dd184405f65c9d4115fc6ec111361cc8d910f7fe2b2.scope: Deactivated successfully. 
Dec 6 03:10:22 localhost python3[43025]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548798 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages#012# TODO(emilien): figure how to deal with libvirt profile.#012# We'll probably treat it like we do with Neutron plugins.#012# Until then, just include it in the default nova-compute role.#012include tripleo::profile::base::nova::compute::libvirt#012#012include tripleo::profile::base::nova::libvirt#012#012include tripleo::profile::base::nova::compute::libvirt_guests#012#012include tripleo::profile::base::sshd#012include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt 
profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw 
--volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:10:22 localhost puppet-user[44737]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass Dec 6 03:10:23 localhost puppet-user[44737]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 6 03:10:23 localhost puppet-user[44737]: (file: /etc/puppet/hiera.yaml) Dec 6 03:10:23 localhost puppet-user[44737]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:10:23 localhost puppet-user[44737]: (file & line not available) Dec 6 03:10:23 localhost puppet-user[44737]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:10:23 localhost puppet-user[44737]: (file & line not available) Dec 6 03:10:23 localhost puppet-user[44737]: Warning: Unknown variable: 'dhcp_agents_per_net'. 
(file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37) Dec 6 03:10:23 localhost puppet-user[44737]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.62 seconds Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created Dec 6 03:10:23 localhost 
puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_private_key]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_certificate]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: 
/Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_ca_cert]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_private_key]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_certificate]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_ca_cert]/ensure: created Dec 6 03:10:23 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created Dec 6 03:10:24 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created Dec 6 03:10:24 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created Dec 6 03:10:24 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created Dec 6 03:10:24 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created Dec 6 03:10:24 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Dec 6 03:10:24 localhost puppet-user[44737]: Notice: 
/Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Dec 6 03:10:24 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created Dec 6 03:10:24 localhost puppet-user[44737]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created Dec 6 03:10:24 localhost puppet-user[44737]: Notice: Applied catalog in 0.46 seconds Dec 6 03:10:24 localhost puppet-user[44737]: Application: Dec 6 03:10:24 localhost puppet-user[44737]: Initial environment: production Dec 6 03:10:24 localhost puppet-user[44737]: Converged environment: production Dec 6 03:10:24 localhost puppet-user[44737]: Run mode: user Dec 6 03:10:24 localhost puppet-user[44737]: Changes: Dec 6 03:10:24 localhost puppet-user[44737]: Total: 39 Dec 6 03:10:24 localhost puppet-user[44737]: Events: Dec 6 03:10:24 localhost puppet-user[44737]: Success: 39 Dec 6 03:10:24 localhost puppet-user[44737]: Total: 39 Dec 6 03:10:24 localhost puppet-user[44737]: Resources: Dec 6 03:10:24 localhost puppet-user[44737]: Skipped: 21 Dec 6 03:10:24 localhost puppet-user[44737]: Changed: 39 Dec 6 03:10:24 localhost puppet-user[44737]: Out of sync: 39 Dec 6 03:10:24 localhost puppet-user[44737]: Total: 155 Dec 6 03:10:24 localhost puppet-user[44737]: Time: Dec 6 03:10:24 localhost puppet-user[44737]: Resources: 0.00 Dec 6 03:10:24 localhost puppet-user[44737]: Ovn metadata agent config: 0.02 Dec 6 03:10:24 localhost puppet-user[44737]: Neutron config: 0.38 Dec 6 03:10:24 localhost puppet-user[44737]: Transaction evaluation: 0.46 Dec 6 03:10:24 localhost puppet-user[44737]: Catalog application: 0.46 Dec 6 03:10:24 localhost puppet-user[44737]: Config retrieval: 0.69 Dec 6 03:10:24 localhost puppet-user[44737]: Last run: 1765008624 Dec 6 03:10:24 localhost puppet-user[44737]: Total: 0.46 Dec 6 03:10:24 localhost 
puppet-user[44737]: Version: Dec 6 03:10:24 localhost puppet-user[44737]: Config: 1765008623 Dec 6 03:10:24 localhost puppet-user[44737]: Puppet: 7.10.0 Dec 6 03:10:24 localhost systemd[1]: libpod-d952395c7d816074dbc33467463bba7054fbe968285e54fb91f0f6a8b04e1c34.scope: Deactivated successfully. Dec 6 03:10:24 localhost systemd[1]: libpod-d952395c7d816074dbc33467463bba7054fbe968285e54fb91f0f6a8b04e1c34.scope: Consumed 3.615s CPU time. Dec 6 03:10:24 localhost podman[44707]: 2025-12-06 08:10:24.83593646 +0000 UTC m=+3.791981071 container died d952395c7d816074dbc33467463bba7054fbe968285e54fb91f0f6a8b04e1c34 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, container_name=container-puppet-neutron, name=rhosp17/openstack-neutron-server, config_id=tripleo_puppet_step1, com.redhat.component=openstack-neutron-server-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2025-11-19T00:23:27Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server) Dec 6 03:10:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d952395c7d816074dbc33467463bba7054fbe968285e54fb91f0f6a8b04e1c34-userdata-shm.mount: Deactivated successfully. Dec 6 03:10:24 localhost systemd[1]: var-lib-containers-storage-overlay-fa2c3e8c20dbc398894a992c8d0207ecad87158b45491e83ea3c6f6e44b17b0b-merged.mount: Deactivated successfully. 
Dec 6 03:10:24 localhost podman[44920]: 2025-12-06 08:10:24.964143613 +0000 UTC m=+0.117594914 container cleanup d952395c7d816074dbc33467463bba7054fbe968285e54fb91f0f6a8b04e1c34 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, config_id=tripleo_puppet_step1, container_name=container-puppet-neutron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20251118.1, build-date=2025-11-19T00:23:27Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp17/openstack-neutron-server, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-neutron-server-container, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, release=1761123044) Dec 6 03:10:24 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:10:24 localhost systemd[1]: libpod-conmon-d952395c7d816074dbc33467463bba7054fbe968285e54fb91f0f6a8b04e1c34.scope: Deactivated successfully. 
Dec 6 03:10:24 localhost python3[43025]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548798 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::ovn_metadata#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548798', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Dec 6 03:10:25 localhost python3[44973]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:26 localhost python3[45005]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:10:27 localhost python3[45055]: ansible-ansible.legacy.stat 
Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:10:27 localhost python3[45098]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008627.0518715-112223-137703980473875/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:28 localhost python3[45160]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:10:28 localhost python3[45203]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008628.0207198-112223-15462904207783/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:29 localhost python3[45265]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:10:29 localhost python3[45308]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008629.0551848-112436-66633536700702/source 
dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:30 localhost python3[45370]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:10:31 localhost python3[45413]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008630.03482-112518-27116116556052/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:32 localhost python3[45443]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:10:32 localhost systemd[1]: Reloading. Dec 6 03:10:32 localhost systemd-rc-local-generator[45465]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:10:32 localhost systemd-sysv-generator[45471]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 03:10:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:10:32 localhost systemd[1]: Reloading. Dec 6 03:10:32 localhost systemd-rc-local-generator[45504]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:10:32 localhost systemd-sysv-generator[45508]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:10:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:10:32 localhost systemd[1]: Starting TripleO Container Shutdown... Dec 6 03:10:32 localhost systemd[1]: Finished TripleO Container Shutdown. Dec 6 03:10:33 localhost python3[45567]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:10:33 localhost python3[45610]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008633.0839367-112734-231659979818721/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:34 localhost python3[45672]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True 
checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:10:34 localhost python3[45715]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008634.1408184-112804-159544217858505/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:35 localhost python3[45745]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:10:35 localhost systemd[1]: Reloading. Dec 6 03:10:35 localhost systemd-sysv-generator[45777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:10:35 localhost systemd-rc-local-generator[45771]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:10:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:10:35 localhost systemd[1]: Reloading. Dec 6 03:10:35 localhost systemd-rc-local-generator[45815]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:10:35 localhost systemd-sysv-generator[45818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 03:10:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:10:35 localhost systemd[1]: Starting Create netns directory... Dec 6 03:10:35 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 6 03:10:35 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 6 03:10:35 localhost systemd[1]: Finished Create netns directory. Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: 1d175b5c6581de7cf9d966b234ba0e8a Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: 5923e560c9d95c3eb077adacead52760 Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 4e7ada4fab3991cc27fb5f75a09b7e0f Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: f81b1d391c9b63868054d7733e636be7 Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: f81b1d391c9b63868054d7733e636be7 Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: f81b1d391c9b63868054d7733e636be7 Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: f81b1d391c9b63868054d7733e636be7 Dec 6 03:10:36 localhost python3[45838]: 
ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: f81b1d391c9b63868054d7733e636be7 Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: f81b1d391c9b63868054d7733e636be7 Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 356f6a60d2bd0022f54ca41a3c5253ab Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 52b849225b4338d04445dda705a9a8bc Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 52b849225b4338d04445dda705a9a8bc Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: f81b1d391c9b63868054d7733e636be7 Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: 78ca993e795bb2768fe880e03926b595 Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7 Dec 6 03:10:36 localhost python3[45838]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: f81b1d391c9b63868054d7733e636be7 Dec 6 03:10:37 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:10:37 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
Dec 6 03:10:37 localhost python3[45893]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Dec 6 03:10:38 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:10:39 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 6 03:10:39 localhost podman[45930]: 2025-12-06 08:10:39.199129696 +0000 UTC m=+0.073197554 container create f7c38dbe1df8638300b38dd6f6c25b2bc52aa9d95b857ab3b0c70486985640a3 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr_init_logs, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:10:39 localhost systemd[1]: Started libpod-conmon-f7c38dbe1df8638300b38dd6f6c25b2bc52aa9d95b857ab3b0c70486985640a3.scope. Dec 6 03:10:39 localhost systemd[1]: Started libcrun container. Dec 6 03:10:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc6ccae8d1859158b3bbd7185cd50b8ac3a8ab8c86ff1ef056ca16ec9c2e0699/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:39 localhost podman[45930]: 2025-12-06 08:10:39.161643976 +0000 UTC m=+0.035711894 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Dec 6 03:10:39 localhost podman[45930]: 2025-12-06 08:10:39.262673838 +0000 UTC m=+0.136741706 container init f7c38dbe1df8638300b38dd6f6c25b2bc52aa9d95b857ab3b0c70486985640a3 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': 
['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr_init_logs, vcs-type=git, name=rhosp17/openstack-qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:10:39 localhost podman[45930]: 2025-12-06 08:10:39.27304028 +0000 UTC m=+0.147108168 container start f7c38dbe1df8638300b38dd6f6c25b2bc52aa9d95b857ab3b0c70486985640a3 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, container_name=metrics_qdr_init_logs, config_id=tripleo_step1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 
'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1) Dec 6 03:10:39 localhost podman[45930]: 2025-12-06 08:10:39.273465184 +0000 UTC m=+0.147533052 container attach f7c38dbe1df8638300b38dd6f6c25b2bc52aa9d95b857ab3b0c70486985640a3 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, distribution-scope=public, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=metrics_qdr_init_logs) Dec 6 03:10:39 localhost systemd[1]: libpod-f7c38dbe1df8638300b38dd6f6c25b2bc52aa9d95b857ab3b0c70486985640a3.scope: Deactivated successfully. Dec 6 03:10:39 localhost podman[45950]: 2025-12-06 08:10:39.340192329 +0000 UTC m=+0.048512623 container died f7c38dbe1df8638300b38dd6f6c25b2bc52aa9d95b857ab3b0c70486985640a3 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T22:49:46Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, container_name=metrics_qdr_init_logs) Dec 6 03:10:39 localhost podman[45950]: 2025-12-06 08:10:39.366257913 +0000 UTC m=+0.074578167 container cleanup f7c38dbe1df8638300b38dd6f6c25b2bc52aa9d95b857ab3b0c70486985640a3 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, 
container_name=metrics_qdr_init_logs, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4) Dec 6 03:10:39 localhost systemd[1]: libpod-conmon-f7c38dbe1df8638300b38dd6f6c25b2bc52aa9d95b857ab3b0c70486985640a3.scope: Deactivated successfully. Dec 6 03:10:39 localhost python3[45893]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd Dec 6 03:10:39 localhost podman[46027]: 2025-12-06 08:10:39.859165194 +0000 UTC m=+0.080985412 container create f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, vcs-type=git, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:10:39 localhost systemd[1]: Started libpod-conmon-f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.scope. Dec 6 03:10:39 localhost systemd[1]: Started libcrun container. Dec 6 03:10:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e783b457a0dbabf469167b8c7c7fc00f0087efa2180c519aab4a9fcb73c3a343/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e783b457a0dbabf469167b8c7c7fc00f0087efa2180c519aab4a9fcb73c3a343/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff) Dec 6 03:10:39 localhost podman[46027]: 2025-12-06 08:10:39.825060723 +0000 UTC m=+0.046880991 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Dec 6 03:10:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:10:39 localhost podman[46027]: 2025-12-06 08:10:39.944095041 +0000 UTC m=+0.165915279 container init f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:10:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:10:39 localhost podman[46027]: 2025-12-06 08:10:39.98153105 +0000 UTC m=+0.203351288 container start f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:10:39 localhost python3[45893]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1d175b5c6581de7cf9d966b234ba0e8a --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z --volume /etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro --volume /etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro 
registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Dec 6 03:10:40 localhost podman[46049]: 2025-12-06 08:10:40.073468742 +0000 UTC m=+0.081565831 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, container_name=metrics_qdr, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, maintainer=OpenStack TripleO Team) Dec 6 03:10:40 localhost podman[46049]: 2025-12-06 08:10:40.294056789 +0000 UTC m=+0.302153908 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, batch=17.1_20251118.1) Dec 6 03:10:40 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:10:40 localhost python3[46121]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:40 localhost python3[46137]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:10:41 localhost python3[46198]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008640.9383268-113358-199294224256329/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:41 localhost python3[46214]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 03:10:41 localhost systemd[1]: Reloading. Dec 6 03:10:42 localhost systemd-sysv-generator[46241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:10:42 localhost systemd-rc-local-generator[46238]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:10:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Dec 6 03:10:42 localhost python3[46265]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:10:42 localhost systemd[1]: Reloading. Dec 6 03:10:42 localhost systemd-sysv-generator[46298]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:10:42 localhost systemd-rc-local-generator[46294]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:10:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:10:43 localhost systemd[1]: Starting metrics_qdr container... Dec 6 03:10:43 localhost systemd[1]: Started metrics_qdr container. 
Dec 6 03:10:43 localhost python3[46346]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:45 localhost python3[46467]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005548798 step=1 update_config_hash_only=False Dec 6 03:10:45 localhost python3[46483]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:10:46 localhost python3[46499]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Dec 6 03:11:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:11:10 localhost podman[46502]: 2025-12-06 08:11:10.556353431 +0000 UTC m=+0.090009406 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', 
'/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, version=17.1.12, release=1761123044, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64) Dec 6 03:11:10 localhost podman[46502]: 2025-12-06 08:11:10.772352272 +0000 UTC m=+0.306008227 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, 
container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git) Dec 6 03:11:10 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:11:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:11:41 localhost podman[46531]: 2025-12-06 08:11:41.560239255 +0000 UTC m=+0.097678621 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:49:46Z) Dec 6 03:11:41 localhost podman[46531]: 2025-12-06 08:11:41.795328606 +0000 UTC m=+0.332767892 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_id=tripleo_step1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd) Dec 6 03:11:41 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: 
Deactivated successfully. Dec 6 03:12:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:12:12 localhost podman[46558]: 2025-12-06 08:12:12.550601349 +0000 UTC m=+0.084313493 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, release=1761123044, container_name=metrics_qdr, url=https://www.redhat.com) Dec 6 03:12:12 localhost podman[46558]: 2025-12-06 08:12:12.76322759 +0000 UTC m=+0.296939674 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 
03:12:12 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:12:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:12:43 localhost systemd[1]: tmp-crun.DoTK7w.mount: Deactivated successfully. Dec 6 03:12:43 localhost podman[46587]: 2025-12-06 08:12:43.55389941 +0000 UTC m=+0.085204840 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:12:43 localhost podman[46587]: 2025-12-06 08:12:43.727100481 +0000 UTC m=+0.258405881 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:12:43 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:13:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:13:14 localhost systemd[1]: tmp-crun.2PuDc2.mount: Deactivated successfully. Dec 6 03:13:14 localhost podman[46617]: 2025-12-06 08:13:14.524653982 +0000 UTC m=+0.062437372 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container) Dec 6 03:13:14 localhost podman[46617]: 2025-12-06 08:13:14.693003419 +0000 UTC m=+0.230786779 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, 
release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Dec 6 03:13:14 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:13:45 localhost podman[46647]: 2025-12-06 08:13:45.57298004 +0000 UTC m=+0.108494895 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 6 03:13:45 localhost podman[46647]: 2025-12-06 08:13:45.796328698 +0000 UTC m=+0.331843603 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, 
name=rhosp17/openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:13:45 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:14:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:14:16 localhost podman[46676]: 2025-12-06 08:14:16.528137921 +0000 UTC m=+0.066701461 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-type=git, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) 
Dec 6 03:14:16 localhost podman[46676]: 2025-12-06 08:14:16.702308768 +0000 UTC m=+0.240872128 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 6 03:14:16 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:14:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:14:47 localhost podman[46705]: 2025-12-06 08:14:47.539407099 +0000 UTC m=+0.074991413 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', 
'/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:14:47 localhost podman[46705]: 2025-12-06 08:14:47.728432497 +0000 UTC m=+0.264016811 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, version=17.1.12, config_id=tripleo_step1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:14:47 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:15:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:15:18 localhost podman[46735]: 2025-12-06 08:15:18.539894845 +0000 UTC m=+0.075971016 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, version=17.1.12, vcs-type=git, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, release=1761123044) Dec 6 03:15:18 localhost podman[46735]: 2025-12-06 08:15:18.727173374 +0000 UTC m=+0.263249535 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, name=rhosp17/openstack-qdrouterd, version=17.1.12, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, release=1761123044) Dec 6 03:15:18 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated 
successfully. Dec 6 03:15:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:15:49 localhost podman[46765]: 2025-12-06 08:15:49.549353539 +0000 UTC m=+0.081077802 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, tcib_managed=true, config_id=tripleo_step1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:15:49 localhost podman[46765]: 2025-12-06 08:15:49.759379468 +0000 UTC m=+0.291103731 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:15:49 localhost systemd[1]: 
f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:16:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:16:20 localhost podman[46795]: 2025-12-06 08:16:20.549965325 +0000 UTC m=+0.085080769 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:16:20 localhost podman[46795]: 2025-12-06 08:16:20.753376795 +0000 UTC m=+0.288492279 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, config_id=tripleo_step1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1) Dec 6 03:16:20 localhost 
systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:16:23 localhost python3[46872]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:16:23 localhost python3[46917]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008982.8797445-121004-191974703420554/source _original_basename=tmpbi8oyph9 follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:16:24 localhost python3[46979]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:16:25 localhost python3[47022]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008984.3611822-121198-68999728944393/source _original_basename=tmp2na5dj1g follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:16:25 localhost python3[47052]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None Dec 6 03:16:25 localhost 
systemd-journald[38691]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 77.2 (257 of 333 items), suggesting rotation. Dec 6 03:16:25 localhost systemd-journald[38691]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 6 03:16:25 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 03:16:25 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 03:16:25 localhost python3[47071]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:16:27 localhost ansible-async_wrapper.py[47243]: Invoked with 778831603138 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008986.9770484-121406-98409971077146/AnsiballZ_command.py _ Dec 6 03:16:27 localhost ansible-async_wrapper.py[47246]: Starting module and watcher Dec 6 03:16:27 localhost ansible-async_wrapper.py[47246]: Start watching 47247 (3600) Dec 6 03:16:27 localhost ansible-async_wrapper.py[47247]: Start module (47247) Dec 6 03:16:27 localhost ansible-async_wrapper.py[47243]: Return async_wrapper task started. Dec 6 03:16:27 localhost python3[47267]: ansible-ansible.legacy.async_status Invoked with jid=778831603138.47243 mode=status _async_dir=/tmp/.ansible_async Dec 6 03:16:32 localhost ansible-async_wrapper.py[47246]: 47247 still running (3600) Dec 6 03:16:32 localhost puppet-user[47266]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Dec 6 03:16:32 localhost puppet-user[47266]: (file: /etc/puppet/hiera.yaml) Dec 6 03:16:32 localhost puppet-user[47266]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:16:32 localhost puppet-user[47266]: (file & line not available) Dec 6 03:16:32 localhost puppet-user[47266]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:16:32 localhost puppet-user[47266]: (file & line not available) Dec 6 03:16:32 localhost puppet-user[47266]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Dec 6 03:16:33 localhost puppet-user[47266]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Dec 6 03:16:33 localhost puppet-user[47266]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.12 seconds Dec 6 03:16:33 localhost puppet-user[47266]: Notice: Applied catalog in 0.04 seconds Dec 6 03:16:33 localhost puppet-user[47266]: Application: Dec 6 03:16:33 localhost puppet-user[47266]: Initial environment: production Dec 6 03:16:33 localhost puppet-user[47266]: Converged environment: production Dec 6 03:16:33 localhost puppet-user[47266]: Run mode: user Dec 6 03:16:33 localhost puppet-user[47266]: Changes: Dec 6 03:16:33 localhost puppet-user[47266]: Events: Dec 6 03:16:33 localhost puppet-user[47266]: Resources: Dec 6 03:16:33 localhost puppet-user[47266]: Total: 10 Dec 6 03:16:33 localhost puppet-user[47266]: Time: Dec 6 03:16:33 localhost puppet-user[47266]: Schedule: 0.00 Dec 6 03:16:33 localhost puppet-user[47266]: File: 0.00 Dec 6 03:16:33 localhost puppet-user[47266]: Augeas: 0.01 Dec 6 03:16:33 localhost puppet-user[47266]: Exec: 0.01 Dec 6 03:16:33 localhost puppet-user[47266]: Transaction evaluation: 0.03 Dec 6 03:16:33 
localhost puppet-user[47266]: Catalog application: 0.04 Dec 6 03:16:33 localhost puppet-user[47266]: Config retrieval: 0.15 Dec 6 03:16:33 localhost puppet-user[47266]: Last run: 1765008993 Dec 6 03:16:33 localhost puppet-user[47266]: Filebucket: 0.00 Dec 6 03:16:33 localhost puppet-user[47266]: Total: 0.05 Dec 6 03:16:33 localhost puppet-user[47266]: Version: Dec 6 03:16:33 localhost puppet-user[47266]: Config: 1765008992 Dec 6 03:16:33 localhost puppet-user[47266]: Puppet: 7.10.0 Dec 6 03:16:33 localhost ansible-async_wrapper.py[47247]: Module complete (47247) Dec 6 03:16:37 localhost ansible-async_wrapper.py[47246]: Done in kid B. Dec 6 03:16:38 localhost python3[47398]: ansible-ansible.legacy.async_status Invoked with jid=778831603138.47243 mode=status _async_dir=/tmp/.ansible_async Dec 6 03:16:38 localhost python3[47414]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 6 03:16:39 localhost python3[47430]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:16:39 localhost python3[47480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:16:39 localhost python3[47498]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp1hqac7so recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 6 03:16:40 localhost python3[47528]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:16:41 localhost python3[47631]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Dec 6 03:16:42 localhost python3[47650]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:16:43 localhost python3[47682]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:16:43 localhost python3[47732]: 
ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:16:44 localhost python3[47750]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:16:44 localhost python3[47812]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:16:44 localhost python3[47830]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:16:45 localhost python3[47892]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:16:45 localhost python3[47910]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file 
path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:16:46 localhost python3[47972]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:16:46 localhost python3[47990]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:16:47 localhost python3[48020]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:16:47 localhost systemd[1]: Reloading. Dec 6 03:16:47 localhost systemd-rc-local-generator[48042]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:16:47 localhost systemd-sysv-generator[48047]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:16:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Dec 6 03:16:47 localhost python3[48107]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:16:48 localhost python3[48125]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:16:48 localhost python3[48187]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:16:49 localhost python3[48205]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:16:49 localhost python3[48235]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:16:49 localhost systemd[1]: Reloading. 
Dec 6 03:16:49 localhost systemd-sysv-generator[48263]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:16:49 localhost systemd-rc-local-generator[48258]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:16:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:16:49 localhost systemd[1]: Starting Create netns directory... Dec 6 03:16:49 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 6 03:16:49 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 6 03:16:49 localhost systemd[1]: Finished Create netns directory. Dec 6 03:16:50 localhost python3[48292]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Dec 6 03:16:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:16:51 localhost podman[48336]: 2025-12-06 08:16:51.555200495 +0000 UTC m=+0.087828286 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr) Dec 6 03:16:51 localhost podman[48336]: 2025-12-06 08:16:51.742225567 +0000 UTC m=+0.274853388 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:49:46Z, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:16:51 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:16:51 localhost python3[48381]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Dec 6 03:16:52 localhost podman[48456]: 2025-12-06 08:16:52.03436475 +0000 UTC m=+0.076782945 container create 6b7b9343137a5483053de1f7685709e24bf4a668b297684d0630a784dc6f5db7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, container_name=nova_virtqemud_init_logs, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step2, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 
nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 6 03:16:52 localhost podman[48462]: 2025-12-06 08:16:52.070775876 +0000 UTC m=+0.098263550 container create 8b6efb2dedcefdb9e8eb671c1e93eab618330fff0451f53befd0865de6e247b4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step2, batch=17.1_20251118.1, 
container_name=nova_compute_init_log, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Dec 6 03:16:52 localhost systemd[1]: Started libpod-conmon-6b7b9343137a5483053de1f7685709e24bf4a668b297684d0630a784dc6f5db7.scope. Dec 6 03:16:52 localhost podman[48456]: 2025-12-06 08:16:51.98941944 +0000 UTC m=+0.031837665 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:16:52 localhost systemd[1]: Started libcrun container. Dec 6 03:16:52 localhost systemd[1]: Started libpod-conmon-8b6efb2dedcefdb9e8eb671c1e93eab618330fff0451f53befd0865de6e247b4.scope. Dec 6 03:16:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03eafd0c985fd00ac0465379e7194dd52dd90ec7c0ceb023e7be1171171dc496/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Dec 6 03:16:52 localhost systemd[1]: Started libcrun container. Dec 6 03:16:52 localhost podman[48456]: 2025-12-06 08:16:52.104772177 +0000 UTC m=+0.147190362 container init 6b7b9343137a5483053de1f7685709e24bf4a668b297684d0630a784dc6f5db7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 
'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step2, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtqemud_init_logs, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 6 03:16:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79368befea3ff2e81adcb0e0c13630267794aa8ed0c8a0948014dcb8ce0c10ed/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Dec 6 03:16:52 localhost podman[48456]: 2025-12-06 08:16:52.111321699 +0000 UTC m=+0.153739894 container start 6b7b9343137a5483053de1f7685709e24bf4a668b297684d0630a784dc6f5db7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step2, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, container_name=nova_virtqemud_init_logs, architecture=x86_64, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, vendor=Red Hat, Inc.) 
Dec 6 03:16:52 localhost python3[48381]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1765007760 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm Dec 6 03:16:52 localhost podman[48462]: 2025-12-06 08:16:52.117863691 +0000 UTC m=+0.145351365 container init 8b6efb2dedcefdb9e8eb671c1e93eab618330fff0451f53befd0865de6e247b4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, container_name=nova_compute_init_log, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:16:52 localhost systemd[1]: libpod-6b7b9343137a5483053de1f7685709e24bf4a668b297684d0630a784dc6f5db7.scope: Deactivated successfully. 
Dec 6 03:16:52 localhost podman[48462]: 2025-12-06 08:16:52.022278056 +0000 UTC m=+0.049765790 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 6 03:16:52 localhost podman[48462]: 2025-12-06 08:16:52.127918862 +0000 UTC m=+0.155406526 container start 8b6efb2dedcefdb9e8eb671c1e93eab618330fff0451f53befd0865de6e247b4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step2, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, 
architecture=x86_64) Dec 6 03:16:52 localhost systemd[1]: libpod-8b6efb2dedcefdb9e8eb671c1e93eab618330fff0451f53befd0865de6e247b4.scope: Deactivated successfully. Dec 6 03:16:52 localhost python3[48381]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1765007760 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova Dec 6 03:16:52 localhost podman[48491]: 2025-12-06 08:16:52.164943157 +0000 UTC m=+0.040608066 container died 6b7b9343137a5483053de1f7685709e24bf4a668b297684d0630a784dc6f5db7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step2, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, architecture=x86_64, container_name=nova_virtqemud_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 
'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Dec 6 03:16:52 localhost podman[48500]: 2025-12-06 08:16:52.212353733 +0000 UTC m=+0.073534954 container cleanup 6b7b9343137a5483053de1f7685709e24bf4a668b297684d0630a784dc6f5db7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtqemud_init_logs, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, version=17.1.12, 
maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step2, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z) Dec 6 03:16:52 localhost systemd[1]: libpod-conmon-6b7b9343137a5483053de1f7685709e24bf4a668b297684d0630a784dc6f5db7.scope: Deactivated successfully. 
Dec 6 03:16:52 localhost podman[48513]: 2025-12-06 08:16:52.255033872 +0000 UTC m=+0.106737331 container died 8b6efb2dedcefdb9e8eb671c1e93eab618330fff0451f53befd0865de6e247b4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step2, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute_init_log, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:16:52 localhost podman[48513]: 2025-12-06 08:16:52.271862483 +0000 UTC m=+0.123565912 container cleanup 
8b6efb2dedcefdb9e8eb671c1e93eab618330fff0451f53befd0865de6e247b4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step2, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com) Dec 6 03:16:52 localhost systemd[1]: libpod-conmon-8b6efb2dedcefdb9e8eb671c1e93eab618330fff0451f53befd0865de6e247b4.scope: Deactivated successfully. 
Dec 6 03:16:52 localhost systemd[1]: var-lib-containers-storage-overlay-79368befea3ff2e81adcb0e0c13630267794aa8ed0c8a0948014dcb8ce0c10ed-merged.mount: Deactivated successfully. Dec 6 03:16:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b6efb2dedcefdb9e8eb671c1e93eab618330fff0451f53befd0865de6e247b4-userdata-shm.mount: Deactivated successfully. Dec 6 03:16:52 localhost systemd[1]: var-lib-containers-storage-overlay-03eafd0c985fd00ac0465379e7194dd52dd90ec7c0ceb023e7be1171171dc496-merged.mount: Deactivated successfully. Dec 6 03:16:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b7b9343137a5483053de1f7685709e24bf4a668b297684d0630a784dc6f5db7-userdata-shm.mount: Deactivated successfully. Dec 6 03:16:52 localhost podman[48642]: 2025-12-06 08:16:52.690452455 +0000 UTC m=+0.061822902 container create cf88c6ef813ac2226e9de91004301b374dfc5d4cf3b9e733737ef145b91fece8 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, container_name=create_haproxy_wrapper, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true) Dec 6 03:16:52 localhost systemd[1]: Started libpod-conmon-cf88c6ef813ac2226e9de91004301b374dfc5d4cf3b9e733737ef145b91fece8.scope. 
Dec 6 03:16:52 localhost podman[48643]: 2025-12-06 08:16:52.732244287 +0000 UTC m=+0.094234505 container create e2a6eeda6bfdd616cbf6ff6ce0632f5dd05c309764b1024268dbac76dad02475 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, version=17.1.12, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., config_id=tripleo_step2, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=create_virtlogd_wrapper, managed_by=tripleo_ansible) Dec 6 03:16:52 localhost systemd[1]: Started libcrun container. Dec 6 03:16:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9befc48c332654d0358552aae4c030fd23abbcc85f80b32354d85e375439ce4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 03:16:52 localhost systemd[1]: Started libpod-conmon-e2a6eeda6bfdd616cbf6ff6ce0632f5dd05c309764b1024268dbac76dad02475.scope. 
Dec 6 03:16:52 localhost podman[48642]: 2025-12-06 08:16:52.659626672 +0000 UTC m=+0.030997139 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Dec 6 03:16:52 localhost podman[48642]: 2025-12-06 08:16:52.765004739 +0000 UTC m=+0.136375226 container init cf88c6ef813ac2226e9de91004301b374dfc5d4cf3b9e733737ef145b91fece8 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=create_haproxy_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step2, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.) Dec 6 03:16:52 localhost systemd[1]: Started libcrun container. Dec 6 03:16:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edc3ecbc12a44c3c63d7e19fa2e89c15d1af79c98f2973226367ec81eae57350/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff) Dec 6 03:16:52 localhost podman[48642]: 2025-12-06 08:16:52.77955323 +0000 UTC m=+0.150923707 container start cf88c6ef813ac2226e9de91004301b374dfc5d4cf3b9e733737ef145b91fece8 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vcs-type=git, container_name=create_haproxy_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Dec 6 
03:16:52 localhost podman[48642]: 2025-12-06 08:16:52.779924261 +0000 UTC m=+0.151294738 container attach cf88c6ef813ac2226e9de91004301b374dfc5d4cf3b9e733737ef145b91fece8 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, container_name=create_haproxy_wrapper, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public) Dec 6 03:16:52 localhost podman[48643]: 2025-12-06 08:16:52.68447143 +0000 UTC m=+0.046461678 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:16:52 localhost podman[48643]: 2025-12-06 08:16:52.786665959 +0000 UTC m=+0.148656187 container init e2a6eeda6bfdd616cbf6ff6ce0632f5dd05c309764b1024268dbac76dad02475 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=create_virtlogd_wrapper, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step2, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git) Dec 6 03:16:52 localhost podman[48643]: 2025-12-06 08:16:52.796933507 +0000 UTC m=+0.158923695 container start e2a6eeda6bfdd616cbf6ff6ce0632f5dd05c309764b1024268dbac76dad02475 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, release=1761123044, config_id=tripleo_step2, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=create_virtlogd_wrapper) Dec 6 03:16:52 localhost podman[48643]: 2025-12-06 08:16:52.797442763 +0000 UTC m=+0.159432941 container attach e2a6eeda6bfdd616cbf6ff6ce0632f5dd05c309764b1024268dbac76dad02475 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, 
managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step2, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, container_name=create_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Dec 6 03:16:54 localhost ovs-vsctl[48745]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Dec 6 03:16:54 localhost systemd[1]: libpod-e2a6eeda6bfdd616cbf6ff6ce0632f5dd05c309764b1024268dbac76dad02475.scope: Deactivated successfully. Dec 6 03:16:54 localhost systemd[1]: libpod-e2a6eeda6bfdd616cbf6ff6ce0632f5dd05c309764b1024268dbac76dad02475.scope: Consumed 2.019s CPU time. 
Dec 6 03:16:54 localhost podman[48643]: 2025-12-06 08:16:54.842830081 +0000 UTC m=+2.204820279 container died e2a6eeda6bfdd616cbf6ff6ce0632f5dd05c309764b1024268dbac76dad02475 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, architecture=x86_64, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step2, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, container_name=create_virtlogd_wrapper, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12) Dec 6 03:16:54 localhost systemd[1]: tmp-crun.pQBPvk.mount: Deactivated successfully. Dec 6 03:16:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e2a6eeda6bfdd616cbf6ff6ce0632f5dd05c309764b1024268dbac76dad02475-userdata-shm.mount: Deactivated successfully. Dec 6 03:16:54 localhost podman[48894]: 2025-12-06 08:16:54.933209475 +0000 UTC m=+0.077883838 container cleanup e2a6eeda6bfdd616cbf6ff6ce0632f5dd05c309764b1024268dbac76dad02475 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=create_virtlogd_wrapper, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, 
tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, distribution-scope=public) Dec 6 03:16:54 localhost systemd[1]: libpod-conmon-e2a6eeda6bfdd616cbf6ff6ce0632f5dd05c309764b1024268dbac76dad02475.scope: Deactivated successfully. 
Dec 6 03:16:54 localhost python3[48381]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1765007760 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper Dec 6 03:16:55 localhost systemd[1]: libpod-cf88c6ef813ac2226e9de91004301b374dfc5d4cf3b9e733737ef145b91fece8.scope: Deactivated successfully. Dec 6 03:16:55 localhost systemd[1]: libpod-cf88c6ef813ac2226e9de91004301b374dfc5d4cf3b9e733737ef145b91fece8.scope: Consumed 2.134s CPU time. Dec 6 03:16:55 localhost podman[48642]: 2025-12-06 08:16:55.815992868 +0000 UTC m=+3.187363365 container died cf88c6ef813ac2226e9de91004301b374dfc5d4cf3b9e733737ef145b91fece8 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step2, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=create_haproxy_wrapper, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 6 03:16:55 localhost podman[48932]: 2025-12-06 08:16:55.881454053 +0000 UTC m=+0.054953680 container cleanup cf88c6ef813ac2226e9de91004301b374dfc5d4cf3b9e733737ef145b91fece8 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.k8s.description=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, 
distribution-scope=public, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=create_haproxy_wrapper, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step2, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64) Dec 6 03:16:55 localhost systemd[1]: libpod-conmon-cf88c6ef813ac2226e9de91004301b374dfc5d4cf3b9e733737ef145b91fece8.scope: Deactivated successfully. Dec 6 03:16:55 localhost python3[48381]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver 
k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers Dec 6 03:16:55 localhost systemd[1]: var-lib-containers-storage-overlay-edc3ecbc12a44c3c63d7e19fa2e89c15d1af79c98f2973226367ec81eae57350-merged.mount: Deactivated successfully. Dec 6 03:16:55 localhost systemd[1]: var-lib-containers-storage-overlay-a9befc48c332654d0358552aae4c030fd23abbcc85f80b32354d85e375439ce4-merged.mount: Deactivated successfully. Dec 6 03:16:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cf88c6ef813ac2226e9de91004301b374dfc5d4cf3b9e733737ef145b91fece8-userdata-shm.mount: Deactivated successfully. 
Dec 6 03:16:56 localhost python3[48986]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:16:58 localhost python3[49107]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005548798 step=2 update_config_hash_only=False Dec 6 03:16:58 localhost python3[49123]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:16:59 localhost python3[49139]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True Dec 6 03:17:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:17:22 localhost podman[49140]: 2025-12-06 08:17:22.555673137 +0000 UTC m=+0.081936135 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:17:22 localhost podman[49140]: 2025-12-06 08:17:22.748021593 +0000 UTC m=+0.274284641 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step1, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1) Dec 6 03:17:22 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:17:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:17:53 localhost podman[49170]: 2025-12-06 08:17:53.549716839 +0000 UTC m=+0.080189644 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr) Dec 6 03:17:53 localhost podman[49170]: 2025-12-06 08:17:53.755776474 +0000 UTC m=+0.286249289 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, url=https://www.redhat.com, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:17:53 localhost systemd[1]: 
f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:18:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:18:24 localhost podman[49199]: 2025-12-06 08:18:24.539262409 +0000 UTC m=+0.075007285 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, release=1761123044, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1) Dec 6 03:18:24 localhost podman[49199]: 2025-12-06 08:18:24.723226271 +0000 UTC m=+0.258971147 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step1, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, 
url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}) 
Dec 6 03:18:24 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:18:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:18:55 localhost podman[49229]: 2025-12-06 08:18:55.519522997 +0000 UTC m=+0.058809198 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044) Dec 6 03:18:55 localhost podman[49229]: 2025-12-06 08:18:55.726470405 +0000 UTC m=+0.265756646 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 
17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, distribution-scope=public, 
version=17.1.12, config_id=tripleo_step1, tcib_managed=true) Dec 6 03:18:55 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:19:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:19:26 localhost podman[49258]: 2025-12-06 08:19:26.527381186 +0000 UTC m=+0.063072391 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, version=17.1.12) Dec 6 03:19:26 localhost podman[49258]: 2025-12-06 08:19:26.721272774 +0000 UTC m=+0.256963979 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=metrics_qdr, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 6 03:19:26 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully.
Dec 6 03:19:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.
Dec 6 03:19:57 localhost systemd[1]: tmp-crun.uAt2ID.mount: Deactivated successfully.
Dec 6 03:19:57 localhost podman[49288]: 2025-12-06 08:19:57.537677664 +0000 UTC m=+0.074945722 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, ...)
Dec 6 03:19:57 localhost podman[49288]: 2025-12-06 08:19:57.721803161 +0000 UTC m=+0.259071269 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, ...)
Dec 6 03:19:57 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully.
Dec 6 03:20:02 localhost sshd[49315]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 03:20:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.
Dec 6 03:20:28 localhost podman[49316]: 2025-12-06 08:20:28.543815582 +0000 UTC m=+0.078702481 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, ...)
Dec 6 03:20:28 localhost podman[49316]: 2025-12-06 08:20:28.741630228 +0000 UTC m=+0.276517087 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, ...)
Dec 6 03:20:28 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully.
Dec 6 03:20:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.
Dec 6 03:20:59 localhost podman[49345]: 2025-12-06 08:20:59.546704332 +0000 UTC m=+0.081099069 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, ...)
Dec 6 03:20:59 localhost podman[49345]: 2025-12-06 08:20:59.742267557 +0000 UTC m=+0.276662294 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, ...)
Dec 6 03:20:59 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully.
Dec 6 03:21:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.
Dec 6 03:21:30 localhost podman[49374]: 2025-12-06 08:21:30.536819765 +0000 UTC m=+0.070608588 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, ...)
Dec 6 03:21:30 localhost podman[49374]: 2025-12-06 08:21:30.728010296 +0000 UTC m=+0.261799099 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, ...)
Dec 6 03:21:30 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully.
Dec 6 03:21:41 localhost python3[49450]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:21:41 localhost python3[49495]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009300.750644-128663-235730123469243/source _original_basename=tmpi0bkkc7b follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:21:42 localhost python3[49525]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True
Dec 6 03:21:44 localhost ansible-async_wrapper.py[49697]: Invoked with 15477680567 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009303.5363028-128977-220753887446854/AnsiballZ_command.py _
Dec 6 03:21:44 localhost ansible-async_wrapper.py[49700]: Starting module and watcher
Dec 6 03:21:44 localhost ansible-async_wrapper.py[49700]: Start watching 49701 (3600)
Dec 6 03:21:44 localhost 
ansible-async_wrapper.py[49701]: Start module (49701) Dec 6 03:21:44 localhost ansible-async_wrapper.py[49697]: Return async_wrapper task started. Dec 6 03:21:44 localhost python3[49721]: ansible-ansible.legacy.async_status Invoked with jid=15477680567.49697 mode=status _async_dir=/tmp/.ansible_async Dec 6 03:21:49 localhost ansible-async_wrapper.py[49700]: 49701 still running (3600) Dec 6 03:21:50 localhost puppet-user[49720]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 6 03:21:50 localhost puppet-user[49720]: (file: /etc/puppet/hiera.yaml) Dec 6 03:21:50 localhost puppet-user[49720]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:21:50 localhost puppet-user[49720]: (file & line not available) Dec 6 03:21:50 localhost puppet-user[49720]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:21:50 localhost puppet-user[49720]: (file & line not available) Dec 6 03:21:50 localhost puppet-user[49720]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Dec 6 03:21:50 localhost puppet-user[49720]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Dec 6 03:21:50 localhost puppet-user[49720]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.12 seconds Dec 6 03:21:50 localhost puppet-user[49720]: Notice: Applied catalog in 0.04 seconds Dec 6 03:21:50 localhost puppet-user[49720]: Application: Dec 6 03:21:50 localhost puppet-user[49720]: Initial environment: production Dec 6 03:21:50 localhost puppet-user[49720]: Converged environment: production Dec 6 03:21:50 localhost puppet-user[49720]: Run mode: user Dec 6 03:21:50 localhost puppet-user[49720]: Changes: Dec 6 03:21:50 localhost puppet-user[49720]: Events: Dec 6 03:21:50 localhost puppet-user[49720]: Resources: Dec 6 03:21:50 localhost puppet-user[49720]: Total: 10 Dec 6 03:21:50 localhost puppet-user[49720]: Time: Dec 6 03:21:50 localhost puppet-user[49720]: Schedule: 0.00 Dec 6 03:21:50 localhost puppet-user[49720]: File: 0.00 Dec 6 03:21:50 localhost puppet-user[49720]: Exec: 0.01 Dec 6 03:21:50 localhost puppet-user[49720]: Augeas: 0.01 Dec 6 03:21:50 localhost puppet-user[49720]: Transaction evaluation: 0.03 Dec 6 03:21:50 localhost puppet-user[49720]: Catalog application: 0.04 Dec 6 03:21:50 localhost puppet-user[49720]: Config retrieval: 0.16 Dec 6 03:21:50 localhost puppet-user[49720]: Last run: 1765009310 Dec 6 03:21:50 localhost puppet-user[49720]: Filebucket: 0.00 Dec 6 03:21:50 localhost puppet-user[49720]: Total: 0.05 Dec 6 03:21:50 localhost puppet-user[49720]: Version: Dec 6 03:21:50 localhost puppet-user[49720]: Config: 1765009310 Dec 6 03:21:50 localhost puppet-user[49720]: Puppet: 7.10.0 Dec 6 03:21:50 localhost ansible-async_wrapper.py[49701]: Module complete (49701) Dec 6 03:21:54 localhost ansible-async_wrapper.py[49700]: Done in kid B. 
Dec 6 03:21:54 localhost python3[49851]: ansible-ansible.legacy.async_status Invoked with jid=15477680567.49697 mode=status _async_dir=/tmp/.ansible_async Dec 6 03:21:55 localhost python3[49867]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 6 03:21:55 localhost python3[49883]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:21:56 localhost python3[49933]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:21:56 localhost python3[49951]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpoyg7b76n recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 6 03:21:57 localhost python3[49981]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Dec 6 03:21:58 localhost python3[50084]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Dec 6 03:21:59 localhost python3[50103]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:00 localhost python3[50136]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:22:00 localhost python3[50186]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:22:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:22:01 localhost podman[50204]: 2025-12-06 08:22:01.094328591 +0000 UTC m=+0.082796516 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, ...)
Dec 6 03:22:01 localhost python3[50205]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:22:01 localhost podman[50204]: 2025-12-06 08:22:01.288064178 +0000 UTC m=+0.276532113 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64) Dec 6 03:22:01 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:22:01 localhost python3[50294]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:22:01 localhost python3[50312]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:02 localhost python3[50374]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:22:02 localhost python3[50392]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:03 localhost 
python3[50454]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:22:03 localhost python3[50472]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:04 localhost python3[50502]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:22:04 localhost systemd[1]: Reloading. Dec 6 03:22:04 localhost systemd-rc-local-generator[50528]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:04 localhost systemd-sysv-generator[50532]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 03:22:04 localhost python3[50588]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:22:05 localhost python3[50607]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:05 localhost python3[50669]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:22:06 localhost python3[50687]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:06 localhost python3[50717]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:22:06 localhost systemd[1]: Reloading. Dec 6 03:22:06 localhost systemd-sysv-generator[50744]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:06 localhost systemd-rc-local-generator[50741]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:06 localhost systemd[1]: Starting Create netns directory... Dec 6 03:22:06 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 6 03:22:06 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 6 03:22:06 localhost systemd[1]: Finished Create netns directory. Dec 6 03:22:07 localhost python3[50776]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Dec 6 03:22:09 localhost python3[50833]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Dec 6 03:22:09 localhost podman[50994]: 2025-12-06 08:22:09.722505501 +0000 UTC m=+0.064303115 container create 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step3, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:51:28Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:22:09 localhost podman[50982]: 2025-12-06 08:22:09.744615656 +0000 UTC m=+0.090533427 container create 09ef40b56dc93d2bf1d926fc7e1857e8c5e4fe6be1f95baccc2c779ce470ebfc (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_statedir_owner, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, 
architecture=x86_64, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 6 03:22:09 localhost podman[51006]: 2025-12-06 08:22:09.7615033 +0000 UTC m=+0.085180021 container create ed895ed494801c746b03d08fbf86975dd09d64f2a82fa0a78de81b1951c76324 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ceilometer_init_log, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
build-date=2025-11-19T00:12:45Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container) Dec 6 03:22:09 localhost systemd[1]: Started libpod-conmon-01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.scope. Dec 6 03:22:09 localhost systemd[1]: Started libpod-conmon-09ef40b56dc93d2bf1d926fc7e1857e8c5e4fe6be1f95baccc2c779ce470ebfc.scope. Dec 6 03:22:09 localhost podman[51004]: 2025-12-06 08:22:09.776549866 +0000 UTC m=+0.110732343 container create 8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, container_name=nova_virtlogd_wrapper, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', 
'/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible) Dec 6 03:22:09 localhost systemd[1]: Started libcrun container. 
Dec 6 03:22:09 localhost podman[50994]: 2025-12-06 08:22:09.680274872 +0000 UTC m=+0.022072486 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Dec 6 03:22:09 localhost systemd[1]: Started libcrun container. Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/913fb92e7d358376526afb98428ee303b126d5d41e5eaa0788a4d91b167c9322/merged/scripts supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49b8b70cca1b8688ffb76c76c36861788ac84dfc866a61a08c8fee4374da9beb/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/913fb92e7d358376526afb98428ee303b126d5d41e5eaa0788a4d91b167c9322/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49b8b70cca1b8688ffb76c76c36861788ac84dfc866a61a08c8fee4374da9beb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49b8b70cca1b8688ffb76c76c36861788ac84dfc866a61a08c8fee4374da9beb/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost systemd[1]: Started libpod-conmon-ed895ed494801c746b03d08fbf86975dd09d64f2a82fa0a78de81b1951c76324.scope. 
Dec 6 03:22:09 localhost podman[51037]: 2025-12-06 08:22:09.796064061 +0000 UTC m=+0.095691597 container create 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, vcs-type=git, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, build-date=2025-11-18T22:49:49Z, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:22:09 localhost systemd[1]: Started libcrun container. Dec 6 03:22:09 localhost podman[50982]: 2025-12-06 08:22:09.697566888 +0000 UTC m=+0.043484659 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 6 03:22:09 localhost podman[51004]: 2025-12-06 08:22:09.700672934 +0000 UTC m=+0.034855441 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ce8c2c866bc20ee02a745aae8c22be484c8b579141c525f40381f8b1c563bd1/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost podman[51006]: 2025-12-06 08:22:09.805324729 +0000 UTC m=+0.129001410 container init ed895ed494801c746b03d08fbf86975dd09d64f2a82fa0a78de81b1951c76324 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, distribution-scope=public) Dec 6 03:22:09 localhost podman[51006]: 2025-12-06 08:22:09.712357076 +0000 UTC m=+0.036033807 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Dec 6 03:22:09 localhost systemd[1]: Started libpod-conmon-8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3.scope. 
Dec 6 03:22:09 localhost podman[51006]: 2025-12-06 08:22:09.816912008 +0000 UTC m=+0.140588689 container start ed895ed494801c746b03d08fbf86975dd09d64f2a82fa0a78de81b1951c76324 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_init_log, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true) Dec 6 03:22:09 localhost systemd[1]: libpod-ed895ed494801c746b03d08fbf86975dd09d64f2a82fa0a78de81b1951c76324.scope: Deactivated successfully. 
Dec 6 03:22:09 localhost python3[50833]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer Dec 6 03:22:09 localhost systemd[1]: Started libcrun container. Dec 6 03:22:09 localhost systemd[1]: Started libpod-conmon-614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca.scope. 
Dec 6 03:22:09 localhost podman[51037]: 2025-12-06 08:22:09.738552109 +0000 UTC m=+0.038179635 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9161727dcb4f67fce7b939133d0105fd4670b62d026b503696c1aa11636dba26/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9161727dcb4f67fce7b939133d0105fd4670b62d026b503696c1aa11636dba26/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9161727dcb4f67fce7b939133d0105fd4670b62d026b503696c1aa11636dba26/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9161727dcb4f67fce7b939133d0105fd4670b62d026b503696c1aa11636dba26/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9161727dcb4f67fce7b939133d0105fd4670b62d026b503696c1aa11636dba26/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9161727dcb4f67fce7b939133d0105fd4670b62d026b503696c1aa11636dba26/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9161727dcb4f67fce7b939133d0105fd4670b62d026b503696c1aa11636dba26/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost podman[51004]: 2025-12-06 08:22:09.844816023 +0000 UTC m=+0.178998510 container init 
8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtlogd_wrapper, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt) Dec 6 03:22:09 localhost podman[51004]: 2025-12-06 08:22:09.853511463 +0000 UTC m=+0.187693950 container start 8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=nova_virtlogd_wrapper, 
io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt) Dec 6 03:22:09 localhost systemd[1]: Started libcrun container. Dec 6 03:22:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:09 localhost python3[50833]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=f81b1d391c9b63868054d7733e636be7 --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume 
/etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro --volume /etc/pki/libvirt:/etc/pki/libvirt:ro --volume /etc/pki/qemu:/etc/pki/qemu:ro --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:22:09 localhost podman[50994]: 2025-12-06 08:22:09.860919502 +0000 UTC m=+0.202717116 container init 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public) Dec 6 03:22:09 localhost podman[51037]: 2025-12-06 08:22:09.870629083 +0000 UTC m=+0.170256609 container init 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca 
(image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-rsyslog-container, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=rsyslog, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:22:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:22:09 localhost podman[50982]: 2025-12-06 08:22:09.890445328 +0000 UTC m=+0.236363069 container init 09ef40b56dc93d2bf1d926fc7e1857e8c5e4fe6be1f95baccc2c779ce470ebfc (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_statedir_owner, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 
'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:22:09 localhost systemd-logind[760]: Existing logind session ID 15 used by new audit session, ignoring. Dec 6 03:22:09 localhost podman[51037]: 2025-12-06 08:22:09.893879804 +0000 UTC m=+0.193507340 container start 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:49Z, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, release=1761123044, container_name=rsyslog, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Dec 6 03:22:09 localhost systemd[1]: Created slice User Slice of UID 0. 
Dec 6 03:22:09 localhost podman[50994]: 2025-12-06 08:22:09.899985443 +0000 UTC m=+0.241783047 container start 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, distribution-scope=public, container_name=collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:22:09 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Dec 6 03:22:09 localhost python3[50833]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=356f6a60d2bd0022f54ca41a3c5253ab --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Dec 6 03:22:09 localhost systemd-logind[760]: Existing logind session ID 15 used by new audit session, ignoring. 
Dec 6 03:22:09 localhost python3[50833]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=5923e560c9d95c3eb077adacead52760 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro 
--volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Dec 6 03:22:09 localhost systemd[1]: Finished User Runtime Directory /run/user/0. 
Dec 6 03:22:09 localhost podman[50982]: 2025-12-06 08:22:09.91342191 +0000 UTC m=+0.259339651 container start 09ef40b56dc93d2bf1d926fc7e1857e8c5e4fe6be1f95baccc2c779ce470ebfc (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=nova_statedir_owner, name=rhosp17/openstack-nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Dec 6 03:22:09 localhost podman[50982]: 2025-12-06 08:22:09.913568144 +0000 UTC m=+0.259485875 container attach 09ef40b56dc93d2bf1d926fc7e1857e8c5e4fe6be1f95baccc2c779ce470ebfc (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, container_name=nova_statedir_owner, 
batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:22:09 localhost systemd[1]: Starting User Manager for UID 0... Dec 6 03:22:09 localhost podman[51082]: 2025-12-06 08:22:09.943508203 +0000 UTC m=+0.111801657 container died ed895ed494801c746b03d08fbf86975dd09d64f2a82fa0a78de81b1951c76324 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step3, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, 
container_name=ceilometer_init_log, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:22:09 localhost systemd[1]: libpod-614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca.scope: Deactivated successfully. Dec 6 03:22:09 localhost systemd[1]: libpod-09ef40b56dc93d2bf1d926fc7e1857e8c5e4fe6be1f95baccc2c779ce470ebfc.scope: Deactivated successfully. Dec 6 03:22:09 localhost podman[50982]: 2025-12-06 08:22:09.980782188 +0000 UTC m=+0.326699939 container died 09ef40b56dc93d2bf1d926fc7e1857e8c5e4fe6be1f95baccc2c779ce470ebfc (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=nova_statedir_owner, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 
'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 6 03:22:10 localhost systemd[51137]: Queued start job for default target Main User Target.
Dec 6 03:22:10 localhost systemd[51137]: Created slice User Application Slice.
Dec 6 03:22:10 localhost systemd[51137]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 6 03:22:10 localhost systemd[51137]: Started Daily Cleanup of User's Temporary Directories.
Dec 6 03:22:10 localhost systemd[51137]: Reached target Paths.
Dec 6 03:22:10 localhost systemd[51137]: Reached target Timers.
Dec 6 03:22:10 localhost systemd[51137]: Starting D-Bus User Message Bus Socket...
Dec 6 03:22:10 localhost systemd[51137]: Starting Create User's Volatile Files and Directories...
Dec 6 03:22:10 localhost systemd[51137]: Finished Create User's Volatile Files and Directories.
Dec 6 03:22:10 localhost podman[51119]: 2025-12-06 08:22:10.042517272 +0000 UTC m=+0.135456730 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vcs-type=git, managed_by=tripleo_ansible) Dec 6 03:22:10 localhost systemd[51137]: Listening on D-Bus User Message Bus Socket. Dec 6 03:22:10 localhost systemd[51137]: Reached target Sockets. Dec 6 03:22:10 localhost systemd[51137]: Reached target Basic System. Dec 6 03:22:10 localhost systemd[51137]: Reached target Main User Target. Dec 6 03:22:10 localhost systemd[51137]: Startup finished in 94ms. Dec 6 03:22:10 localhost systemd[1]: Started User Manager for UID 0. 
Dec 6 03:22:10 localhost podman[51190]: 2025-12-06 08:22:10.05790577 +0000 UTC m=+0.064540132 container cleanup 09ef40b56dc93d2bf1d926fc7e1857e8c5e4fe6be1f95baccc2c779ce470ebfc (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step3, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, container_name=nova_statedir_owner, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, 
batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 6 03:22:10 localhost systemd[1]: Started Session c1 of User root. Dec 6 03:22:10 localhost systemd[1]: Started Session c2 of User root. Dec 6 03:22:10 localhost systemd[1]: libpod-conmon-09ef40b56dc93d2bf1d926fc7e1857e8c5e4fe6be1f95baccc2c779ce470ebfc.scope: Deactivated successfully. Dec 6 03:22:10 localhost podman[51178]: 2025-12-06 08:22:10.082528803 +0000 UTC m=+0.099308760 container died 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, managed_by=tripleo_ansible, container_name=rsyslog, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=) Dec 6 03:22:10 localhost python3[50833]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1765007760 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py Dec 6 03:22:10 localhost systemd[1]: session-c1.scope: Deactivated successfully. Dec 6 03:22:10 localhost systemd[1]: session-c2.scope: Deactivated successfully. Dec 6 03:22:10 localhost podman[51119]: 2025-12-06 08:22:10.136459385 +0000 UTC m=+0.229398853 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-collectd, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12) Dec 6 03:22:10 localhost podman[51119]: unhealthy Dec 6 03:22:10 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:22:10 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Failed with result 'exit-code'. 
Dec 6 03:22:10 localhost podman[51081]: 2025-12-06 08:22:10.148608951 +0000 UTC m=+0.318780494 container cleanup ed895ed494801c746b03d08fbf86975dd09d64f2a82fa0a78de81b1951c76324 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step3, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ceilometer_init_log, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:22:10 localhost systemd[1]: libpod-conmon-ed895ed494801c746b03d08fbf86975dd09d64f2a82fa0a78de81b1951c76324.scope: Deactivated successfully. 
Dec 6 03:22:10 localhost podman[51178]: 2025-12-06 08:22:10.155767784 +0000 UTC m=+0.172547771 container cleanup 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, build-date=2025-11-18T22:49:49Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-rsyslog, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:22:10 localhost systemd[1]: libpod-conmon-614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca.scope: Deactivated successfully. Dec 6 03:22:10 localhost podman[51376]: 2025-12-06 08:22:10.508601523 +0000 UTC m=+0.083240172 container create 45487e39c92e1400b369c663be6072374515d7a57a0da87095a834cd0e4fd196 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
release=1761123044, vcs-type=git, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 6 03:22:10 localhost systemd[1]: Started libpod-conmon-45487e39c92e1400b369c663be6072374515d7a57a0da87095a834cd0e4fd196.scope. Dec 6 03:22:10 localhost podman[51390]: 2025-12-06 08:22:10.555200107 +0000 UTC m=+0.063427797 container create 91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}) Dec 6 03:22:10 localhost systemd[1]: Started libcrun container. 
Dec 6 03:22:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0eb3924bf90abe79532c5d63ba0c64dcad199823811b739cedec44dad043767/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0eb3924bf90abe79532c5d63ba0c64dcad199823811b739cedec44dad043767/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0eb3924bf90abe79532c5d63ba0c64dcad199823811b739cedec44dad043767/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0eb3924bf90abe79532c5d63ba0c64dcad199823811b739cedec44dad043767/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:10 localhost systemd[1]: Started libpod-conmon-91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9.scope.
Dec 6 03:22:10 localhost podman[51376]: 2025-12-06 08:22:10.581498143 +0000 UTC m=+0.156136792 container init 45487e39c92e1400b369c663be6072374515d7a57a0da87095a834cd0e4fd196 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:22:10 localhost podman[51376]: 2025-12-06 08:22:10.485588329 +0000 UTC m=+0.060227008 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:22:10 localhost podman[51376]: 2025-12-06 08:22:10.589897273 +0000 UTC m=+0.164535922 container start 45487e39c92e1400b369c663be6072374515d7a57a0da87095a834cd0e4fd196 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:22:10 localhost systemd[1]: Started libcrun container. 
Dec 6 03:22:10 localhost podman[51390]: 2025-12-06 08:22:10.52366521 +0000 UTC m=+0.031892960 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:22:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997e2c3550cc2daefd91a60e0708d42688d15d64ba54f50063ff752fcfae8f2e/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997e2c3550cc2daefd91a60e0708d42688d15d64ba54f50063ff752fcfae8f2e/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997e2c3550cc2daefd91a60e0708d42688d15d64ba54f50063ff752fcfae8f2e/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997e2c3550cc2daefd91a60e0708d42688d15d64ba54f50063ff752fcfae8f2e/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997e2c3550cc2daefd91a60e0708d42688d15d64ba54f50063ff752fcfae8f2e/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997e2c3550cc2daefd91a60e0708d42688d15d64ba54f50063ff752fcfae8f2e/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997e2c3550cc2daefd91a60e0708d42688d15d64ba54f50063ff752fcfae8f2e/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:10 localhost podman[51390]: 2025-12-06 08:22:10.629683167 +0000 UTC m=+0.137910867 container init 
91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, config_id=tripleo_step3, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', 
'/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, container_name=nova_virtsecretd, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Dec 6 03:22:10 localhost podman[51390]: 2025-12-06 08:22:10.639930105 +0000 UTC m=+0.148157795 container start 91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.expose-services=, 
maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtsecretd, version=17.1.12, architecture=x86_64) Dec 6 03:22:10 localhost python3[50833]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=f81b1d391c9b63868054d7733e636be7 --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log 
--volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro --volume /etc/pki/libvirt:/etc/pki/libvirt:ro --volume /etc/pki/qemu:/etc/pki/qemu:ro --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:22:10 localhost systemd-logind[760]: Existing logind session ID 15 used by new audit session, ignoring. Dec 6 03:22:10 localhost systemd[1]: Started Session c3 of User root. Dec 6 03:22:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed895ed494801c746b03d08fbf86975dd09d64f2a82fa0a78de81b1951c76324-userdata-shm.mount: Deactivated successfully. Dec 6 03:22:10 localhost systemd[1]: var-lib-containers-storage-overlay-49b8b70cca1b8688ffb76c76c36861788ac84dfc866a61a08c8fee4374da9beb-merged.mount: Deactivated successfully. Dec 6 03:22:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-09ef40b56dc93d2bf1d926fc7e1857e8c5e4fe6be1f95baccc2c779ce470ebfc-userdata-shm.mount: Deactivated successfully. 
Dec 6 03:22:10 localhost systemd[1]: session-c3.scope: Deactivated successfully. Dec 6 03:22:11 localhost podman[51533]: 2025-12-06 08:22:11.132418192 +0000 UTC m=+0.091777465 container create 6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) 
Dec 6 03:22:11 localhost podman[51543]: 2025-12-06 08:22:11.171086701 +0000 UTC m=+0.100517867 container create 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:22:11 localhost podman[51533]: 2025-12-06 08:22:11.085460767 +0000 UTC m=+0.044820030 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:22:11 localhost systemd[1]: Started libpod-conmon-6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7.scope. Dec 6 03:22:11 localhost systemd[1]: Started libpod-conmon-3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.scope. Dec 6 03:22:11 localhost systemd[1]: Started libcrun container. Dec 6 03:22:11 localhost systemd[1]: Started libcrun container. 
Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc845f2ee7a7ff971f466641845e17e836d3ed05d5e871254c9077d17f4dbfb0/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b0fc934b7e8ebe947ae2f5b8850ae682bc514513297f85ec62276c836d92b6b/merged/etc/target supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b0fc934b7e8ebe947ae2f5b8850ae682bc514513297f85ec62276c836d92b6b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc845f2ee7a7ff971f466641845e17e836d3ed05d5e871254c9077d17f4dbfb0/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc845f2ee7a7ff971f466641845e17e836d3ed05d5e871254c9077d17f4dbfb0/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc845f2ee7a7ff971f466641845e17e836d3ed05d5e871254c9077d17f4dbfb0/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc845f2ee7a7ff971f466641845e17e836d3ed05d5e871254c9077d17f4dbfb0/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc845f2ee7a7ff971f466641845e17e836d3ed05d5e871254c9077d17f4dbfb0/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/cc845f2ee7a7ff971f466641845e17e836d3ed05d5e871254c9077d17f4dbfb0/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost podman[51543]: 2025-12-06 08:22:11.126462618 +0000 UTC m=+0.055893814 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Dec 6 03:22:11 localhost podman[51533]: 2025-12-06 08:22:11.230461142 +0000 UTC m=+0.189820405 container init 6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtnodedevd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 
'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com) Dec 6 03:22:11 localhost podman[51533]: 2025-12-06 08:22:11.241790843 +0000 UTC m=+0.201150106 container start 
6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, distribution-scope=public, container_name=nova_virtnodedevd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc.) Dec 6 03:22:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:22:11 localhost podman[51543]: 2025-12-06 08:22:11.247766819 +0000 UTC m=+0.177198015 container init 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, container_name=iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, version=17.1.12, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc.) Dec 6 03:22:11 localhost python3[50833]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=f81b1d391c9b63868054d7733e636be7 --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume 
/sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro --volume /etc/pki/libvirt:/etc/pki/libvirt:ro --volume /etc/pki/qemu:/etc/pki/qemu:ro --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:22:11 localhost systemd-logind[760]: Existing logind session ID 15 used by new audit session, ignoring. Dec 6 03:22:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:22:11 localhost podman[51543]: 2025-12-06 08:22:11.295798528 +0000 UTC m=+0.225229694 container start 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, container_name=iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vendor=Red Hat, Inc.) Dec 6 03:22:11 localhost python3[50833]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4e7ada4fab3991cc27fb5f75a09b7e0f --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Dec 6 03:22:11 localhost systemd-logind[760]: Existing logind session ID 15 used by new audit session, ignoring. Dec 6 03:22:11 localhost systemd[1]: Started Session c4 of User root. Dec 6 03:22:11 localhost systemd[1]: Started Session c5 of User root. Dec 6 03:22:11 localhost systemd[1]: session-c4.scope: Deactivated successfully. Dec 6 03:22:11 localhost systemd[1]: session-c5.scope: Deactivated successfully. 
Dec 6 03:22:11 localhost podman[51583]: 2025-12-06 08:22:11.407957546 +0000 UTC m=+0.099031752 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1) Dec 6 03:22:11 localhost kernel: Loading iSCSI transport class v2.0-870. Dec 6 03:22:11 localhost podman[51583]: 2025-12-06 08:22:11.493218839 +0000 UTC m=+0.184293065 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, batch=17.1_20251118.1) Dec 6 03:22:11 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. 
Dec 6 03:22:11 localhost podman[51714]: 2025-12-06 08:22:11.917016477 +0000 UTC m=+0.094313544 container create ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=nova_virtstoraged, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 6 03:22:11 localhost systemd[1]: Started libpod-conmon-ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7.scope. Dec 6 03:22:11 localhost podman[51714]: 2025-12-06 08:22:11.872976893 +0000 UTC m=+0.050274020 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:22:11 localhost systemd[1]: Started libcrun container. 
Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a9b5e8002093793c8fb3c19ee661b8475f5d6d1fe6e543df8be7ee8dc3553fb/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a9b5e8002093793c8fb3c19ee661b8475f5d6d1fe6e543df8be7ee8dc3553fb/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a9b5e8002093793c8fb3c19ee661b8475f5d6d1fe6e543df8be7ee8dc3553fb/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a9b5e8002093793c8fb3c19ee661b8475f5d6d1fe6e543df8be7ee8dc3553fb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a9b5e8002093793c8fb3c19ee661b8475f5d6d1fe6e543df8be7ee8dc3553fb/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a9b5e8002093793c8fb3c19ee661b8475f5d6d1fe6e543df8be7ee8dc3553fb/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a9b5e8002093793c8fb3c19ee661b8475f5d6d1fe6e543df8be7ee8dc3553fb/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:11 localhost podman[51714]: 2025-12-06 08:22:11.998736502 +0000 UTC m=+0.176033589 container init ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-libvirt, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, container_name=nova_virtstoraged, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt) Dec 6 03:22:12 localhost podman[51714]: 2025-12-06 08:22:12.009790614 +0000 UTC m=+0.187087701 container start ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, 
com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtstoraged) Dec 6 03:22:12 localhost python3[50833]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=f81b1d391c9b63868054d7733e636be7 --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume 
/etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro --volume /etc/pki/libvirt:/etc/pki/libvirt:ro --volume /etc/pki/qemu:/etc/pki/qemu:ro --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:22:12 localhost systemd-logind[760]: Existing logind session ID 15 used by new audit session, ignoring. Dec 6 03:22:12 localhost systemd[1]: Started Session c6 of User root. Dec 6 03:22:12 localhost systemd[1]: session-c6.scope: Deactivated successfully. 
Dec 6 03:22:12 localhost podman[51818]: 2025-12-06 08:22:12.44510298 +0000 UTC m=+0.078404502 container create aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, container_name=nova_virtqemud, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Dec 6 03:22:12 localhost systemd[1]: Started libpod-conmon-aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b.scope.
Dec 6 03:22:12 localhost systemd[1]: Started libcrun container.
Dec 6 03:22:12 localhost podman[51818]: 2025-12-06 08:22:12.39867931 +0000 UTC m=+0.031980892 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8fe62f5071a990292beb53ff72998daf48ca62b4a6fc97fe2b3e5d151c0e41e/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8fe62f5071a990292beb53ff72998daf48ca62b4a6fc97fe2b3e5d151c0e41e/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8fe62f5071a990292beb53ff72998daf48ca62b4a6fc97fe2b3e5d151c0e41e/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8fe62f5071a990292beb53ff72998daf48ca62b4a6fc97fe2b3e5d151c0e41e/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8fe62f5071a990292beb53ff72998daf48ca62b4a6fc97fe2b3e5d151c0e41e/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8fe62f5071a990292beb53ff72998daf48ca62b4a6fc97fe2b3e5d151c0e41e/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8fe62f5071a990292beb53ff72998daf48ca62b4a6fc97fe2b3e5d151c0e41e/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at
/var/lib/containers/storage/overlay/d8fe62f5071a990292beb53ff72998daf48ca62b4a6fc97fe2b3e5d151c0e41e/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost podman[51818]: 2025-12-06 08:22:12.508154174 +0000 UTC m=+0.141455666 container init aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12)
Dec 6 03:22:12 localhost podman[51818]: 2025-12-06 08:22:12.520875479 +0000 UTC m=+0.154176971 container start aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, managed_by=tripleo_ansible, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Dec 6 03:22:12 localhost python3[50833]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=f81b1d391c9b63868054d7733e636be7 --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 
'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume 
/etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro --volume /etc/pki/libvirt:/etc/pki/libvirt:ro --volume /etc/pki/qemu:/etc/pki/qemu:ro --volume /var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 6 03:22:12 localhost systemd-logind[760]: Existing logind session ID 15 used by new audit session, ignoring.
Dec 6 03:22:12 localhost systemd[1]: Started Session c7 of User root.
Dec 6 03:22:12 localhost systemd[1]: session-c7.scope: Deactivated successfully.
Dec 6 03:22:12 localhost podman[51921]: 2025-12-06 08:22:12.909917571 +0000 UTC m=+0.062052615 container create 15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step3, version=17.1.12, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, container_name=nova_virtproxyd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 6 03:22:12 localhost systemd[1]: Started libpod-conmon-15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c.scope.
Dec 6 03:22:12 localhost systemd[1]: tmp-crun.6M7Eks.mount: Deactivated successfully.
Dec 6 03:22:12 localhost systemd[1]: Started libcrun container.
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda45022fd096c249e88499c11ec7e672819ceb78de5334d85a7da4061f3fd35/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda45022fd096c249e88499c11ec7e672819ceb78de5334d85a7da4061f3fd35/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda45022fd096c249e88499c11ec7e672819ceb78de5334d85a7da4061f3fd35/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda45022fd096c249e88499c11ec7e672819ceb78de5334d85a7da4061f3fd35/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda45022fd096c249e88499c11ec7e672819ceb78de5334d85a7da4061f3fd35/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda45022fd096c249e88499c11ec7e672819ceb78de5334d85a7da4061f3fd35/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda45022fd096c249e88499c11ec7e672819ceb78de5334d85a7da4061f3fd35/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 6 03:22:12 localhost podman[51921]: 2025-12-06 08:22:12.974161303 +0000 UTC m=+0.126296347 container init 15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, tcib_managed=true, distribution-scope=public, io.k8s.description=Red 
Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=nova_virtproxyd, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 6 03:22:12 localhost podman[51921]: 2025-12-06 08:22:12.878164536 +0000 UTC m=+0.030299590 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 6 03:22:12 localhost podman[51921]: 2025-12-06 08:22:12.983184542 +0000 UTC m=+0.135319586 container start 15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, config_id=tripleo_step3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, 
url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, container_name=nova_virtproxyd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public)
Dec 6 03:22:12 localhost python3[50833]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=f81b1d391c9b63868054d7733e636be7 --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro --volume /etc/pki/libvirt:/etc/pki/libvirt:ro --volume /etc/pki/qemu:/etc/pki/qemu:ro --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 6 03:22:13 localhost systemd-logind[760]: Existing logind session ID 15 used by new audit session, ignoring. Dec 6 03:22:13 localhost systemd[1]: Started Session c8 of User root. Dec 6 03:22:13 localhost systemd[1]: session-c8.scope: Deactivated successfully. 
Dec 6 03:22:13 localhost python3[52004]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:13 localhost python3[52020]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:14 localhost python3[52036]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:14 localhost python3[52052]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:14 localhost python3[52068]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:14 localhost python3[52084]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:15 localhost python3[52100]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:15 localhost python3[52116]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:15 localhost python3[52132]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None 
mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:16 localhost python3[52148]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:22:16 localhost python3[52165]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:22:16 localhost python3[52181]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:22:16 localhost python3[52197]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:22:17 localhost python3[52213]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:22:17 localhost python3[52229]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:22:17 localhost python3[52245]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:22:17 localhost python3[52261]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 
03:22:18 localhost python3[52277]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:22:18 localhost python3[52338]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009338.1933966-130901-104629685224710/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:19 localhost python3[52367]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009338.1933966-130901-104629685224710/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:19 localhost python3[52396]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009338.1933966-130901-104629685224710/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:20 localhost python3[52425]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009338.1933966-130901-104629685224710/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root 
group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:20 localhost python3[52454]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009338.1933966-130901-104629685224710/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:21 localhost python3[52483]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009338.1933966-130901-104629685224710/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:21 localhost python3[52512]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009338.1933966-130901-104629685224710/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:22 localhost python3[52541]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009338.1933966-130901-104629685224710/source 
dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:23 localhost python3[52570]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009338.1933966-130901-104629685224710/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:23 localhost systemd[1]: Stopping User Manager for UID 0... Dec 6 03:22:23 localhost systemd[51137]: Activating special unit Exit the Session... Dec 6 03:22:23 localhost systemd[51137]: Stopped target Main User Target. Dec 6 03:22:23 localhost systemd[51137]: Stopped target Basic System. Dec 6 03:22:23 localhost systemd[51137]: Stopped target Paths. Dec 6 03:22:23 localhost systemd[51137]: Stopped target Sockets. Dec 6 03:22:23 localhost systemd[51137]: Stopped target Timers. Dec 6 03:22:23 localhost systemd[51137]: Stopped Daily Cleanup of User's Temporary Directories. Dec 6 03:22:23 localhost systemd[51137]: Closed D-Bus User Message Bus Socket. Dec 6 03:22:23 localhost systemd[51137]: Stopped Create User's Volatile Files and Directories. Dec 6 03:22:23 localhost systemd[51137]: Removed slice User Application Slice. Dec 6 03:22:23 localhost systemd[51137]: Reached target Shutdown. Dec 6 03:22:23 localhost systemd[51137]: Finished Exit the Session. Dec 6 03:22:23 localhost systemd[51137]: Reached target Exit the Session. Dec 6 03:22:23 localhost systemd[1]: user@0.service: Deactivated successfully. 
Dec 6 03:22:23 localhost systemd[1]: Stopped User Manager for UID 0. Dec 6 03:22:23 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Dec 6 03:22:23 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Dec 6 03:22:23 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Dec 6 03:22:23 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Dec 6 03:22:23 localhost systemd[1]: Removed slice User Slice of UID 0. Dec 6 03:22:23 localhost python3[52586]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 03:22:23 localhost systemd[1]: Reloading. Dec 6 03:22:23 localhost systemd-rc-local-generator[52613]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:23 localhost systemd-sysv-generator[52616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:24 localhost python3[52640]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:22:24 localhost systemd[1]: Reloading. Dec 6 03:22:24 localhost systemd-rc-local-generator[52667]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:24 localhost systemd-sysv-generator[52670]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 03:22:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:24 localhost systemd[1]: Starting collectd container... Dec 6 03:22:24 localhost systemd[1]: Started collectd container. Dec 6 03:22:25 localhost python3[52706]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:22:25 localhost systemd[1]: Reloading. Dec 6 03:22:25 localhost systemd-rc-local-generator[52734]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:25 localhost systemd-sysv-generator[52739]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:25 localhost systemd[1]: Starting iscsid container... Dec 6 03:22:25 localhost systemd[1]: Started iscsid container. Dec 6 03:22:26 localhost python3[52772]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:22:26 localhost systemd[1]: Reloading. Dec 6 03:22:26 localhost systemd-sysv-generator[52804]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 03:22:26 localhost systemd-rc-local-generator[52798]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:26 localhost systemd[1]: Starting nova_virtlogd_wrapper container... Dec 6 03:22:26 localhost systemd[1]: Started nova_virtlogd_wrapper container. Dec 6 03:22:27 localhost python3[52839]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:22:28 localhost systemd[1]: Reloading. Dec 6 03:22:28 localhost systemd-rc-local-generator[52863]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:28 localhost systemd-sysv-generator[52867]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:28 localhost systemd[1]: Starting nova_virtnodedevd container... Dec 6 03:22:28 localhost tripleo-start-podman-container[52878]: Creating additional drop-in dependency for "nova_virtnodedevd" (6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7) Dec 6 03:22:28 localhost systemd[1]: Reloading. Dec 6 03:22:28 localhost systemd-rc-local-generator[52935]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:28 localhost systemd-sysv-generator[52938]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:28 localhost systemd[1]: Started nova_virtnodedevd container. Dec 6 03:22:29 localhost python3[52961]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:22:29 localhost systemd[1]: Reloading. Dec 6 03:22:29 localhost systemd-rc-local-generator[52988]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:29 localhost systemd-sysv-generator[52993]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:29 localhost systemd[1]: Starting nova_virtproxyd container... Dec 6 03:22:30 localhost tripleo-start-podman-container[53001]: Creating additional drop-in dependency for "nova_virtproxyd" (15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c) Dec 6 03:22:30 localhost systemd[1]: Reloading. Dec 6 03:22:30 localhost systemd-rc-local-generator[53059]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:30 localhost systemd-sysv-generator[53062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 03:22:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:30 localhost systemd[1]: Started nova_virtproxyd container. Dec 6 03:22:30 localhost python3[53086]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:22:30 localhost systemd[1]: Reloading. Dec 6 03:22:31 localhost systemd-rc-local-generator[53110]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:31 localhost systemd-sysv-generator[53114]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:22:31 localhost systemd[1]: Starting nova_virtqemud container... Dec 6 03:22:31 localhost tripleo-start-podman-container[53127]: Creating additional drop-in dependency for "nova_virtqemud" (aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b) Dec 6 03:22:31 localhost systemd[1]: Reloading. 
Dec 6 03:22:31 localhost podman[53126]: 2025-12-06 08:22:31.748708272 +0000 UTC m=+0.187284408 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr) Dec 6 03:22:31 localhost systemd-rc-local-generator[53207]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:31 localhost systemd-sysv-generator[53212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:31 localhost systemd[1]: Started nova_virtqemud container. 
Dec 6 03:22:31 localhost podman[53126]: 2025-12-06 08:22:31.993574843 +0000 UTC m=+0.432150979 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=metrics_qdr, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step1) Dec 6 03:22:32 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:22:32 localhost python3[53236]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:22:32 localhost systemd[1]: Reloading. Dec 6 03:22:32 localhost systemd-sysv-generator[53263]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:32 localhost systemd-rc-local-generator[53260]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:32 localhost systemd[1]: Starting nova_virtsecretd container... 
Dec 6 03:22:33 localhost tripleo-start-podman-container[53275]: Creating additional drop-in dependency for "nova_virtsecretd" (91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9) Dec 6 03:22:33 localhost systemd[1]: Reloading. Dec 6 03:22:33 localhost systemd-sysv-generator[53332]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:33 localhost systemd-rc-local-generator[53327]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:33 localhost systemd[1]: Started nova_virtsecretd container. Dec 6 03:22:33 localhost python3[53359]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:22:34 localhost systemd[1]: Reloading. Dec 6 03:22:34 localhost systemd-rc-local-generator[53386]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:34 localhost systemd-sysv-generator[53389]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:34 localhost systemd[1]: Starting nova_virtstoraged container... 
Dec 6 03:22:34 localhost tripleo-start-podman-container[53399]: Creating additional drop-in dependency for "nova_virtstoraged" (ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7) Dec 6 03:22:34 localhost systemd[1]: Reloading. Dec 6 03:22:34 localhost systemd-rc-local-generator[53457]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:34 localhost systemd-sysv-generator[53460]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:34 localhost systemd[1]: Started nova_virtstoraged container. Dec 6 03:22:35 localhost python3[53484]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:22:35 localhost systemd[1]: Reloading. Dec 6 03:22:35 localhost systemd-rc-local-generator[53511]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:22:35 localhost systemd-sysv-generator[53517]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:22:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:22:35 localhost systemd[1]: Starting rsyslog container... Dec 6 03:22:35 localhost systemd[1]: Started libcrun container. 
Dec 6 03:22:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:35 localhost podman[53524]: 2025-12-06 08:22:35.803124701 +0000 UTC m=+0.129644700 container init 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, architecture=x86_64) Dec 6 03:22:35 localhost podman[53524]: 2025-12-06 08:22:35.81375602 +0000 UTC m=+0.140276029 container start 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:22:35 localhost podman[53524]: rsyslog Dec 6 03:22:35 localhost systemd[1]: Started rsyslog container. Dec 6 03:22:35 localhost systemd[1]: libpod-614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca.scope: Deactivated successfully. 
Dec 6 03:22:35 localhost podman[53557]: 2025-12-06 08:22:35.990248702 +0000 UTC m=+0.064416858 container died 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-rsyslog-container, name=rhosp17/openstack-rsyslog, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Dec 6 03:22:36 localhost podman[53557]: 2025-12-06 08:22:36.016411203 +0000 UTC m=+0.090579309 container cleanup 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:49Z, distribution-scope=public) Dec 6 03:22:36 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:22:36 localhost podman[53573]: 2025-12-06 08:22:36.077939381 +0000 UTC m=+0.039448824 container cleanup 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-rsyslog, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog) 
Dec 6 03:22:36 localhost podman[53573]: rsyslog Dec 6 03:22:36 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Dec 6 03:22:36 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1. Dec 6 03:22:36 localhost systemd[1]: Stopped rsyslog container. Dec 6 03:22:36 localhost systemd[1]: Starting rsyslog container... Dec 6 03:22:36 localhost python3[53600]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:36 localhost systemd[1]: Started libcrun container. Dec 6 03:22:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:36 localhost podman[53601]: 2025-12-06 08:22:36.328484508 +0000 UTC m=+0.114127279 container init 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, config_id=tripleo_step3, io.k8s.description=Red Hat 
OpenStack Platform 17.1 rsyslog, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible) Dec 
6 03:22:36 localhost podman[53601]: 2025-12-06 08:22:36.334355991 +0000 UTC m=+0.119998752 container start 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:49:49Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog) Dec 6 03:22:36 localhost podman[53601]: rsyslog Dec 6 03:22:36 localhost systemd[1]: Started rsyslog container. Dec 6 03:22:36 localhost systemd[1]: libpod-614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca.scope: Deactivated successfully. Dec 6 03:22:36 localhost podman[53624]: 2025-12-06 08:22:36.488637574 +0000 UTC m=+0.052583862 container died 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, version=17.1.12, container_name=rsyslog, build-date=2025-11-18T22:49:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp17/openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git) Dec 6 03:22:36 localhost podman[53624]: 2025-12-06 08:22:36.538099877 +0000 UTC m=+0.102046145 container cleanup 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, release=1761123044, io.buildah.version=1.41.4, 
config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:49:49Z, vendor=Red Hat, Inc., name=rhosp17/openstack-rsyslog) Dec 6 03:22:36 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:22:36 localhost podman[53638]: 2025-12-06 08:22:36.616184288 +0000 UTC m=+0.049762594 container cleanup 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, build-date=2025-11-18T22:49:49Z, container_name=rsyslog, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com) Dec 6 03:22:36 localhost podman[53638]: rsyslog Dec 6 03:22:36 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Dec 6 03:22:36 localhost systemd[1]: var-lib-containers-storage-overlay-e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897-merged.mount: Deactivated successfully. Dec 6 03:22:36 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca-userdata-shm.mount: Deactivated successfully. Dec 6 03:22:36 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2. Dec 6 03:22:36 localhost systemd[1]: Stopped rsyslog container. Dec 6 03:22:36 localhost systemd[1]: Starting rsyslog container... Dec 6 03:22:36 localhost systemd[1]: Started libcrun container. 
Dec 6 03:22:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:36 localhost podman[53679]: 2025-12-06 08:22:36.887731947 +0000 UTC m=+0.124409778 container init 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, release=1761123044, com.redhat.component=openstack-rsyslog-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) 
Dec 6 03:22:36 localhost podman[53679]: 2025-12-06 08:22:36.900053478 +0000 UTC m=+0.136731319 container start 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.openshift.expose-services=, container_name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc.) Dec 6 03:22:36 localhost podman[53679]: rsyslog Dec 6 03:22:36 localhost systemd[1]: Started rsyslog container. Dec 6 03:22:36 localhost systemd[1]: libpod-614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca.scope: Deactivated successfully. Dec 6 03:22:37 localhost podman[53717]: 2025-12-06 08:22:37.066502779 +0000 UTC m=+0.056530653 container died 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.buildah.version=1.41.4, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step3, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) 
Dec 6 03:22:37 localhost podman[53717]: 2025-12-06 08:22:37.095297412 +0000 UTC m=+0.085325256 container cleanup 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, release=1761123044, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, 
build-date=2025-11-18T22:49:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-rsyslog-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog) Dec 6 03:22:37 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:22:37 localhost podman[53745]: 2025-12-06 08:22:37.199314577 +0000 UTC m=+0.061279392 container cleanup 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, tcib_managed=true, container_name=rsyslog, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, managed_by=tripleo_ansible) Dec 6 03:22:37 localhost podman[53745]: rsyslog Dec 6 03:22:37 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Dec 6 03:22:37 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3. Dec 6 03:22:37 localhost systemd[1]: Stopped rsyslog container. Dec 6 03:22:37 localhost systemd[1]: Starting rsyslog container... Dec 6 03:22:37 localhost systemd[1]: Started libcrun container. 
Dec 6 03:22:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:37 localhost podman[53788]: 2025-12-06 08:22:37.590644309 +0000 UTC m=+0.125572724 container init 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, url=https://www.redhat.com, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:49Z, vendor=Red Hat, Inc., name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Dec 6 03:22:37 localhost podman[53788]: 2025-12-06 08:22:37.602201767 +0000 UTC m=+0.137130192 container start 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Dec 6 03:22:37 localhost podman[53788]: rsyslog Dec 6 03:22:37 localhost systemd[1]: Started rsyslog container. 
Dec 6 03:22:37 localhost systemd[1]: libpod-614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca.scope: Deactivated successfully. Dec 6 03:22:37 localhost podman[53829]: 2025-12-06 08:22:37.781553878 +0000 UTC m=+0.056747700 container died 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, vcs-type=git, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.12, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, managed_by=tripleo_ansible, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:22:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca-userdata-shm.mount: Deactivated successfully. Dec 6 03:22:37 localhost systemd[1]: var-lib-containers-storage-overlay-e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897-merged.mount: Deactivated successfully. 
Dec 6 03:22:37 localhost podman[53829]: 2025-12-06 08:22:37.809131603 +0000 UTC m=+0.084325385 container cleanup 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp17/openstack-rsyslog, container_name=rsyslog, io.k8s.description=Red Hat 
OpenStack Platform 17.1 rsyslog, version=17.1.12, distribution-scope=public, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1) Dec 6 03:22:37 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:22:37 localhost podman[53854]: 2025-12-06 08:22:37.894328314 +0000 UTC m=+0.052510488 container cleanup 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-rsyslog, version=17.1.12) Dec 6 03:22:37 localhost podman[53854]: rsyslog Dec 6 03:22:37 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Dec 6 03:22:37 localhost python3[53852]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005548798 step=3 update_config_hash_only=False Dec 6 03:22:38 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4. 
Dec 6 03:22:38 localhost systemd[1]: Stopped rsyslog container. Dec 6 03:22:38 localhost systemd[1]: Starting rsyslog container... Dec 6 03:22:38 localhost systemd[1]: Started libcrun container. Dec 6 03:22:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 6 03:22:38 localhost podman[53866]: 2025-12-06 08:22:38.353443148 +0000 UTC m=+0.125420609 container init 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp17/openstack-rsyslog, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Dec 6 03:22:38 localhost podman[53866]: 2025-12-06 08:22:38.370171167 +0000 UTC m=+0.142148618 container start 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, architecture=x86_64, container_name=rsyslog, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:22:38 localhost 
podman[53866]: rsyslog Dec 6 03:22:38 localhost systemd[1]: Started rsyslog container. Dec 6 03:22:38 localhost python3[53894]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:22:38 localhost systemd[1]: libpod-614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca.scope: Deactivated successfully. Dec 6 03:22:38 localhost podman[53904]: 2025-12-06 08:22:38.554829142 +0000 UTC m=+0.063875762 container died 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, distribution-scope=public, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible) Dec 6 03:22:38 localhost podman[53904]: 2025-12-06 08:22:38.578588059 +0000 UTC m=+0.087634619 container cleanup 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 rsyslog) Dec 6 03:22:38 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, 
code=exited, status=1/FAILURE Dec 6 03:22:38 localhost podman[53918]: 2025-12-06 08:22:38.694720249 +0000 UTC m=+0.077436272 container cleanup 614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, vcs-type=git, release=1761123044, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-rsyslog, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '356f6a60d2bd0022f54ca41a3c5253ab'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 
17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, version=17.1.12, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, architecture=x86_64) Dec 6 03:22:38 localhost podman[53918]: rsyslog Dec 6 03:22:38 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Dec 6 03:22:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-614f24f72a8a48319fe912c643cfd9b547d760bf6734ca3d54d51c62104e25ca-userdata-shm.mount: Deactivated successfully. Dec 6 03:22:38 localhost python3[53946]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True Dec 6 03:22:38 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5. Dec 6 03:22:38 localhost systemd[1]: Stopped rsyslog container. Dec 6 03:22:38 localhost systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly. Dec 6 03:22:38 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Dec 6 03:22:38 localhost systemd[1]: Failed to start rsyslog container. Dec 6 03:22:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:22:40 localhost podman[53947]: 2025-12-06 08:22:40.563882308 +0000 UTC m=+0.096364738 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, batch=17.1_20251118.1, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-type=git, build-date=2025-11-18T22:51:28Z, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:22:40 localhost podman[53947]: 2025-12-06 08:22:40.578185841 +0000 UTC m=+0.110668271 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, tcib_managed=true) Dec 6 03:22:40 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:22:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:22:42 localhost podman[53969]: 2025-12-06 08:22:42.549001074 +0000 UTC m=+0.083239321 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true) Dec 6 03:22:42 localhost podman[53969]: 2025-12-06 08:22:42.588465767 +0000 UTC m=+0.122704004 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-iscsid-container) Dec 6 03:22:42 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:23:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:23:02 localhost podman[53988]: 2025-12-06 08:23:02.536607646 +0000 UTC m=+0.071660789 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, name=rhosp17/openstack-qdrouterd, version=17.1.12, tcib_managed=true, config_id=tripleo_step1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:23:02 localhost podman[53988]: 2025-12-06 08:23:02.752297583 +0000 UTC m=+0.287350666 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.openshift.expose-services=) Dec 6 03:23:02 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:23:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:23:11 localhost podman[54016]: 2025-12-06 08:23:11.553504655 +0000 UTC m=+0.088161295 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.openshift.expose-services=, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, release=1761123044, container_name=collectd, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:23:11 localhost podman[54016]: 2025-12-06 08:23:11.562111389 +0000 UTC m=+0.096768019 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Dec 6 03:23:11 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:23:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:23:13 localhost podman[54035]: 2025-12-06 08:23:13.544533972 +0000 UTC m=+0.079864670 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.12) Dec 6 03:23:13 localhost podman[54035]: 2025-12-06 08:23:13.55423927 +0000 UTC m=+0.089570028 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-iscsid, container_name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:23:13 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:23:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:23:33 localhost podman[54055]: 2025-12-06 08:23:33.542819163 +0000 UTC m=+0.078993384 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:23:33 localhost podman[54055]: 2025-12-06 08:23:33.762152031 +0000 UTC m=+0.298326252 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, tcib_managed=true, config_id=tripleo_step1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:23:33 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:23:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:23:42 localhost podman[54084]: 2025-12-06 08:23:42.540727888 +0000 UTC m=+0.076482896 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:23:42 localhost podman[54084]: 2025-12-06 08:23:42.579128267 +0000 UTC m=+0.114883235 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public) Dec 6 03:23:42 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:23:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:23:44 localhost systemd[1]: tmp-crun.RFZQZc.mount: Deactivated successfully. Dec 6 03:23:44 localhost podman[54104]: 2025-12-06 08:23:44.548033295 +0000 UTC m=+0.083381818 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:23:44 localhost podman[54104]: 2025-12-06 08:23:44.560283692 +0000 UTC m=+0.095632175 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:44:13Z, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, container_name=iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4) Dec 6 03:23:44 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:24:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:24:04 localhost podman[54123]: 2025-12-06 08:24:04.548526972 +0000 UTC m=+0.079403156 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', 
'/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible) Dec 6 03:24:04 localhost podman[54123]: 2025-12-06 08:24:04.737291933 +0000 UTC m=+0.268168127 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, container_name=metrics_qdr, architecture=x86_64) Dec 6 03:24:04 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:24:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:24:13 localhost podman[54153]: 2025-12-06 08:24:13.54717538 +0000 UTC m=+0.082757589 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=) Dec 6 03:24:13 localhost podman[54153]: 2025-12-06 08:24:13.561294743 +0000 UTC m=+0.096876952 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) Dec 6 03:24:13 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:24:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:24:15 localhost podman[54173]: 2025-12-06 08:24:15.55096001 +0000 UTC m=+0.081622015 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible) Dec 6 03:24:15 localhost podman[54173]: 2025-12-06 08:24:15.563273828 +0000 UTC m=+0.093935893 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, release=1761123044, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.) Dec 6 03:24:15 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:24:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:24:35 localhost podman[54192]: 2025-12-06 08:24:35.534533547 +0000 UTC m=+0.069142983 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, architecture=x86_64, build-date=2025-11-18T22:49:46Z, vcs-type=git, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', 
'/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:24:35 localhost podman[54192]: 2025-12-06 08:24:35.726480474 +0000 UTC m=+0.261089890 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Dec 6 03:24:35 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:24:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:24:44 localhost podman[54220]: 2025-12-06 08:24:44.545142692 +0000 UTC m=+0.080229601 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:24:44 localhost podman[54220]: 2025-12-06 08:24:44.579927479 +0000 UTC m=+0.115014408 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public) Dec 6 03:24:44 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: 
Deactivated successfully. Dec 6 03:24:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:24:46 localhost podman[54240]: 2025-12-06 08:24:46.548554259 +0000 UTC m=+0.079610433 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Dec 6 03:24:46 localhost podman[54240]: 2025-12-06 08:24:46.58442734 +0000 UTC m=+0.115483514 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container) Dec 6 03:24:46 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:25:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:25:06 localhost systemd[1]: tmp-crun.P4eDep.mount: Deactivated successfully. 
Dec 6 03:25:06 localhost podman[54259]: 2025-12-06 08:25:06.560042137 +0000 UTC m=+0.095952685 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd) Dec 6 03:25:06 localhost podman[54259]: 2025-12-06 08:25:06.758393032 +0000 UTC m=+0.294303580 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team) Dec 6 03:25:06 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:25:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:25:15 localhost podman[54289]: 2025-12-06 08:25:15.54193436 +0000 UTC m=+0.077449987 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-collectd-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12) Dec 6 03:25:15 localhost podman[54289]: 2025-12-06 08:25:15.549879584 +0000 UTC m=+0.085395391 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:25:15 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:25:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:25:17 localhost systemd[1]: tmp-crun.pNEPxt.mount: Deactivated successfully. Dec 6 03:25:17 localhost podman[54310]: 2025-12-06 08:25:17.545559154 +0000 UTC m=+0.079112887 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1) Dec 6 03:25:17 localhost podman[54310]: 2025-12-06 08:25:17.581417325 +0000 UTC m=+0.114971028 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:25:17 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:25:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:25:37 localhost systemd[1]: tmp-crun.PsBcKN.mount: Deactivated successfully. 
Dec 6 03:25:37 localhost podman[54329]: 2025-12-06 08:25:37.535901019 +0000 UTC m=+0.071526195 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, config_id=tripleo_step1, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:25:37 localhost podman[54329]: 2025-12-06 08:25:37.708753992 +0000 UTC m=+0.244379228 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, release=1761123044, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com) Dec 6 03:25:37 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:25:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:25:46 localhost systemd[1]: tmp-crun.5DmJBp.mount: Deactivated successfully. Dec 6 03:25:46 localhost podman[54358]: 2025-12-06 08:25:46.537534009 +0000 UTC m=+0.067866482 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:25:46 localhost podman[54358]: 2025-12-06 08:25:46.547259848 +0000 UTC m=+0.077592291 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, release=1761123044, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:25:46 localhost systemd[1]: 
01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:25:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:25:48 localhost podman[54377]: 2025-12-06 08:25:48.547444937 +0000 UTC m=+0.078625723 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com) Dec 6 03:25:48 localhost podman[54377]: 2025-12-06 08:25:48.554441981 +0000 UTC m=+0.085622747 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 6 03:25:48 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:26:01 localhost sshd[54396]: main: sshd: ssh-rsa algorithm is disabled Dec 6 03:26:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:26:08 localhost podman[54401]: 2025-12-06 08:26:08.551345418 +0000 UTC m=+0.077409286 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044, container_name=metrics_qdr, architecture=x86_64) Dec 6 03:26:08 localhost podman[54401]: 2025-12-06 08:26:08.740751698 +0000 UTC m=+0.266815516 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:26:08 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:26:14 localhost python3[54479]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:26:15 localhost python3[54524]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009574.3261008-138085-119726776526987/source _original_basename=tmp94pm2hqn follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:26:15 localhost python3[54586]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:26:16 localhost python3[54629]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009575.5284739-138283-166918185907970/source _original_basename=tmpiht17zmk follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:26:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:26:16 localhost podman[54692]: 2025-12-06 08:26:16.798143902 +0000 UTC m=+0.087433963 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1) Dec 6 03:26:16 localhost podman[54692]: 2025-12-06 08:26:16.812178462 +0000 UTC m=+0.101468543 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp17/openstack-collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git) Dec 6 03:26:16 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:26:16 localhost python3[54691]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:26:17 localhost python3[54754]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009576.4979072-138387-137415027289840/source _original_basename=tmp1w65eulj follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:26:17 localhost python3[54816]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:26:18 localhost python3[54859]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009577.5347884-138497-253361343015799/source _original_basename=tmpjiopl7mw follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:26:18 localhost python3[54889]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None Dec 6 03:26:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:26:18 localhost systemd[1]: Reloading. Dec 6 03:26:18 localhost systemd-sysv-generator[54925]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:26:18 localhost systemd-rc-local-generator[54921]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:26:18 localhost podman[54891]: 2025-12-06 08:26:18.92217462 +0000 UTC m=+0.075693073 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12) Dec 6 03:26:18 localhost podman[54891]: 2025-12-06 08:26:18.958379841 +0000 UTC m=+0.111898304 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:26:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Dec 6 03:26:19 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:26:19 localhost systemd[1]: Reloading. Dec 6 03:26:19 localhost systemd-sysv-generator[54973]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:26:19 localhost systemd-rc-local-generator[54968]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:26:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:26:19 localhost python3[54998]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:26:19 localhost systemd[1]: Reloading. Dec 6 03:26:19 localhost systemd-rc-local-generator[55019]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:26:19 localhost systemd-sysv-generator[55022]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:26:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:26:20 localhost systemd[1]: Reloading. Dec 6 03:26:20 localhost systemd-rc-local-generator[55061]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 03:26:20 localhost systemd-sysv-generator[55066]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:26:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:26:20 localhost systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m. Dec 6 03:26:20 localhost python3[55090]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:26:20 localhost systemd[1]: Reloading. Dec 6 03:26:20 localhost systemd-sysv-generator[55117]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:26:20 localhost systemd-rc-local-generator[55114]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:26:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 03:26:21 localhost python3[55174]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:26:22 localhost python3[55217]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009581.2343566-138726-21942132034442/source _original_basename=tmp_952sucp follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:26:22 localhost python3[55247]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:26:22 localhost systemd[1]: Reloading. Dec 6 03:26:22 localhost systemd-sysv-generator[55275]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:26:22 localhost systemd-rc-local-generator[55272]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:26:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:26:22 localhost systemd[1]: Reached target tripleo_nova_libvirt.target. 
Dec 6 03:26:23 localhost python3[55301]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:26:24 localhost ansible-async_wrapper.py[55473]: Invoked with 150027821643 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009584.2380853-138920-73645315325226/AnsiballZ_command.py _ Dec 6 03:26:24 localhost ansible-async_wrapper.py[55476]: Starting module and watcher Dec 6 03:26:24 localhost ansible-async_wrapper.py[55476]: Start watching 55477 (3600) Dec 6 03:26:24 localhost ansible-async_wrapper.py[55477]: Start module (55477) Dec 6 03:26:24 localhost ansible-async_wrapper.py[55473]: Return async_wrapper task started. Dec 6 03:26:25 localhost python3[55494]: ansible-ansible.legacy.async_status Invoked with jid=150027821643.55473 mode=status _async_dir=/tmp/.ansible_async Dec 6 03:26:29 localhost ansible-async_wrapper.py[55476]: 55477 still running (3600) Dec 6 03:26:30 localhost puppet-user[55497]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 6 03:26:30 localhost puppet-user[55497]: (file: /etc/puppet/hiera.yaml) Dec 6 03:26:30 localhost puppet-user[55497]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:26:30 localhost puppet-user[55497]: (file & line not available) Dec 6 03:26:30 localhost puppet-user[55497]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:26:30 localhost puppet-user[55497]: (file & line not available) Dec 6 03:26:30 localhost puppet-user[55497]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Dec 6 03:26:30 localhost puppet-user[55497]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 5] Dec 6 03:26:30 localhost puppet-user[55497]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 6 03:26:30 localhost puppet-user[55497]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 6 03:26:30 localhost puppet-user[55497]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 5] Dec 6 03:26:30 localhost puppet-user[55497]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 6 03:26:30 localhost puppet-user[55497]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 6 03:26:30 localhost puppet-user[55497]: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 5] Dec 6 03:26:30 localhost puppet-user[55497]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 6 03:26:30 localhost puppet-user[55497]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 6 03:26:30 localhost puppet-user[55497]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. 
at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 5] Dec 6 03:26:30 localhost puppet-user[55497]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 6 03:26:30 localhost puppet-user[55497]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 6 03:26:30 localhost puppet-user[55497]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 5] Dec 6 03:26:30 localhost puppet-user[55497]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 6 03:26:30 localhost puppet-user[55497]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 6 03:26:30 localhost puppet-user[55497]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 5] Dec 6 03:26:30 localhost puppet-user[55497]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 6 03:26:30 localhost puppet-user[55497]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Dec 6 03:26:30 localhost puppet-user[55497]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.20 seconds Dec 6 03:26:34 localhost ansible-async_wrapper.py[55476]: 55477 still running (3595) Dec 6 03:26:35 localhost python3[55637]: ansible-ansible.legacy.async_status Invoked with jid=150027821643.55473 mode=status _async_dir=/tmp/.ansible_async Dec 6 03:26:38 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. 
Dec 6 03:26:38 localhost systemd[1]: Starting man-db-cache-update.service... Dec 6 03:26:38 localhost systemd[1]: Reloading. Dec 6 03:26:38 localhost systemd-rc-local-generator[55779]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:26:38 localhost systemd-sysv-generator[55785]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:26:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:26:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:26:38 localhost systemd[1]: Queuing reload/restart jobs for marked units… Dec 6 03:26:39 localhost podman[56133]: 2025-12-06 08:26:39.109027381 +0000 UTC m=+0.207343702 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc.) 
Dec 6 03:26:39 localhost podman[56133]: 2025-12-06 08:26:39.296236634 +0000 UTC m=+0.394552935 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=metrics_qdr, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:26:39 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:26:39 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Dec 6 03:26:39 localhost systemd[1]: Finished man-db-cache-update.service. Dec 6 03:26:39 localhost systemd[1]: run-rf409697737a646cc826af0492039bce3.service: Deactivated successfully. 
Dec 6 03:26:39 localhost ansible-async_wrapper.py[55476]: 55477 still running (3590)
Dec 6 03:26:40 localhost puppet-user[55497]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Dec 6 03:26:40 localhost puppet-user[55497]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}afaf3754204a3bebe65c0ad4174fa85e7c1db8a7f0d709fcbaf3e8d03f792a01'
Dec 6 03:26:40 localhost puppet-user[55497]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Dec 6 03:26:40 localhost puppet-user[55497]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Dec 6 03:26:40 localhost puppet-user[55497]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Dec 6 03:26:40 localhost puppet-user[55497]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Dec 6 03:26:44 localhost ansible-async_wrapper.py[55476]: 55477 still running (3585)
Dec 6 03:26:45 localhost puppet-user[55497]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Dec 6 03:26:45 localhost systemd[1]: Reloading.
Dec 6 03:26:45 localhost systemd-sysv-generator[56872]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:26:45 localhost systemd-rc-local-generator[56868]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:26:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:26:45 localhost systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Dec 6 03:26:45 localhost snmpd[56894]: Can't find directory of RPM packages
Dec 6 03:26:45 localhost snmpd[56894]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Dec 6 03:26:45 localhost systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Dec 6 03:26:45 localhost python3[56893]: ansible-ansible.legacy.async_status Invoked with jid=150027821643.55473 mode=status _async_dir=/tmp/.ansible_async
Dec 6 03:26:45 localhost systemd[1]: Reloading.
Dec 6 03:26:46 localhost systemd-rc-local-generator[56920]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:26:46 localhost systemd-sysv-generator[56924]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:26:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:26:46 localhost systemd[1]: Reloading.
Dec 6 03:26:46 localhost systemd-rc-local-generator[56956]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:26:46 localhost systemd-sysv-generator[56959]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:26:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:26:46 localhost puppet-user[55497]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Dec 6 03:26:46 localhost puppet-user[55497]: Notice: Applied catalog in 15.86 seconds
Dec 6 03:26:46 localhost puppet-user[55497]: Application:
Dec 6 03:26:46 localhost puppet-user[55497]: Initial environment: production
Dec 6 03:26:46 localhost puppet-user[55497]: Converged environment: production
Dec 6 03:26:46 localhost puppet-user[55497]: Run mode: user
Dec 6 03:26:46 localhost puppet-user[55497]: Changes:
Dec 6 03:26:46 localhost puppet-user[55497]: Total: 8
Dec 6 03:26:46 localhost puppet-user[55497]: Events:
Dec 6 03:26:46 localhost puppet-user[55497]: Success: 8
Dec 6 03:26:46 localhost puppet-user[55497]: Total: 8
Dec 6 03:26:46 localhost puppet-user[55497]: Resources:
Dec 6 03:26:46 localhost puppet-user[55497]: Restarted: 1
Dec 6 03:26:46 localhost puppet-user[55497]: Changed: 8
Dec 6 03:26:46 localhost puppet-user[55497]: Out of sync: 8
Dec 6 03:26:46 localhost puppet-user[55497]: Total: 19
Dec 6 03:26:46 localhost puppet-user[55497]: Time:
Dec 6 03:26:46 localhost puppet-user[55497]: Schedule: 0.00
Dec 6 03:26:46 localhost puppet-user[55497]: Filebucket: 0.00
Dec 6 03:26:46 localhost puppet-user[55497]: Augeas: 0.01
Dec 6 03:26:46 localhost puppet-user[55497]: File: 0.10
Dec 6 03:26:46 localhost puppet-user[55497]: Config retrieval: 0.25
Dec 6 03:26:46 localhost puppet-user[55497]: Service: 1.15
Dec 6 03:26:46 localhost puppet-user[55497]: Transaction evaluation: 15.84
Dec 6 03:26:46 localhost puppet-user[55497]: Catalog application: 15.86
Dec 6 03:26:46 localhost puppet-user[55497]: Last run: 1765009606
Dec 6 03:26:46 localhost puppet-user[55497]: Exec: 5.06
Dec 6 03:26:46 localhost puppet-user[55497]: Package: 9.34
Dec 6 03:26:46 localhost puppet-user[55497]: Total: 15.86
Dec 6 03:26:46 localhost puppet-user[55497]: Version:
Dec 6 03:26:46 localhost puppet-user[55497]: Config: 1765009590
Dec 6 03:26:46 localhost puppet-user[55497]: Puppet: 7.10.0
Dec 6 03:26:46 localhost ansible-async_wrapper.py[55477]: Module complete (55477)
Dec 6 03:26:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.
Dec 6 03:26:47 localhost podman[56967]: 2025-12-06 08:26:47.551002634 +0000 UTC m=+0.083254076 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, tcib_managed=true, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image':
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) 
Dec 6 03:26:47 localhost podman[56967]: 2025-12-06 08:26:47.591375691 +0000 UTC m=+0.123627123 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, 
Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Dec 6 03:26:47 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully.
Dec 6 03:26:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.
Dec 6 03:26:49 localhost podman[56986]: 2025-12-06 08:26:49.543936914 +0000 UTC m=+0.076966079 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, release=1761123044, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public)
Dec 6 03:26:49 localhost podman[56986]: 2025-12-06 08:26:49.558115887 +0000 UTC m=+0.091145052 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.)
Dec 6 03:26:49 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully.
Dec 6 03:26:49 localhost ansible-async_wrapper.py[55476]: Done in kid B.
Dec 6 03:26:56 localhost python3[57020]: ansible-ansible.legacy.async_status Invoked with jid=150027821643.55473 mode=status _async_dir=/tmp/.ansible_async
Dec 6 03:26:56 localhost python3[57036]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 6 03:26:57 localhost python3[57052]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 6 03:26:57 localhost python3[57102]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:26:58 localhost python3[57120]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp66osjov1 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 6 03:26:58 localhost python3[57150]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:26:59 localhost python3[57253]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 6 03:27:00 localhost python3[57272]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:27:01 localhost python3[57304]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 6 03:27:01 localhost python3[57354]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:27:02 localhost python3[57372]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:27:02 localhost python3[57434]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:27:03 localhost python3[57452]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:27:03 localhost python3[57514]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:27:04 localhost python3[57532]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:27:04 localhost python3[57594]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:27:04 localhost python3[57612]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:27:05 localhost python3[57642]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 03:27:05 localhost systemd[1]: Reloading.
Dec 6 03:27:05 localhost systemd-rc-local-generator[57666]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:27:05 localhost systemd-sysv-generator[57672]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:27:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:27:06 localhost python3[57728]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:27:06 localhost python3[57746]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:27:07 localhost python3[57808]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 6 03:27:07 localhost python3[57826]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 03:27:07 localhost python3[57856]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 03:27:08 localhost systemd[1]: Reloading.
Dec 6 03:27:08 localhost systemd-rc-local-generator[57879]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 03:27:08 localhost systemd-sysv-generator[57882]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 03:27:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 03:27:08 localhost systemd[1]: Starting Create netns directory...
Dec 6 03:27:08 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 6 03:27:08 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 6 03:27:08 localhost systemd[1]: Finished Create netns directory.
Dec 6 03:27:08 localhost python3[57913]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 6 03:27:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.
Dec 6 03:27:09 localhost podman[57943]: 2025-12-06 08:27:09.606499439 +0000 UTC m=+0.124878372 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z)
Dec 6 03:27:09 localhost podman[57943]: 2025-12-06 08:27:09.807300366 +0000 UTC m=+0.325679279 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro',
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1) Dec 6 03:27:09 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:27:10 localhost python3[57998]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Dec 6 03:27:11 localhost podman[58145]: 2025-12-06 08:27:11.286752676 +0000 UTC m=+0.084737611 container create 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044) Dec 6 03:27:11 localhost podman[58161]: 2025-12-06 08:27:11.320050514 +0000 UTC m=+0.099866973 container create b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.) Dec 6 03:27:11 localhost systemd[1]: Started libpod-conmon-23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.scope. Dec 6 03:27:11 localhost systemd[1]: Started libpod-conmon-b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.scope. 
Dec 6 03:27:11 localhost podman[58145]: 2025-12-06 08:27:11.253805659 +0000 UTC m=+0.051790614 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Dec 6 03:27:11 localhost podman[58161]: 2025-12-06 08:27:11.252991664 +0000 UTC m=+0.032808133 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Dec 6 03:27:11 localhost podman[58166]: 2025-12-06 08:27:11.360740391 +0000 UTC m=+0.128845546 container create 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z) Dec 6 03:27:11 localhost podman[58163]: 2025-12-06 08:27:11.2653808 +0000 UTC m=+0.041078730 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 6 03:27:11 localhost systemd[1]: Started libcrun container. Dec 6 03:27:11 localhost systemd[1]: Started libcrun container. 
Dec 6 03:27:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2961fd61607b985660b4106c2f39c6dd4b09a3526a1728e9022e8b160d048172/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f828ad68b1c8f42bc27f6cd0d2c93f980f96fb70dac87338f926a291c2167ed/merged/var/log/containers supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:11 localhost podman[58166]: 2025-12-06 08:27:11.281397789 +0000 UTC m=+0.049502954 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Dec 6 03:27:11 localhost systemd[1]: Started libpod-conmon-4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.scope. Dec 6 03:27:11 localhost systemd[1]: Started libcrun container. Dec 6 03:27:11 localhost podman[58196]: 2025-12-06 08:27:11.308473223 +0000 UTC m=+0.045341384 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Dec 6 03:27:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. 
Dec 6 03:27:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/979b7277537d5ac546debf05b8fcf4666622be00eb21163fa54e2b404edc7fc7/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:11 localhost podman[58161]: 2025-12-06 08:27:11.410707639 +0000 UTC m=+0.190524098 container init b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z) Dec 6 03:27:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. 
Dec 6 03:27:11 localhost podman[58161]: 2025-12-06 08:27:11.441052994 +0000 UTC m=+0.220869453 container start b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 6 03:27:11 localhost python3[57998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=52b849225b4338d04445dda705a9a8bc --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Dec 6 03:27:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. 
Dec 6 03:27:11 localhost podman[58145]: 2025-12-06 08:27:11.470037497 +0000 UTC m=+0.268022552 container init 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, vcs-type=git, url=https://www.redhat.com, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:27:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:27:11 localhost podman[58196]: 2025-12-06 08:27:11.473770424 +0000 UTC m=+0.210638555 container create b78c3fa31dbb037cf9144a6b55b2dc34daf0992a599f5b5f3853dc2843a4aab7 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, container_name=configure_cms_options, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, release=1761123044, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:27:11 localhost podman[58166]: 2025-12-06 08:27:11.475140036 +0000 UTC m=+0.243245231 container init 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, 
vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:27:11 localhost podman[58163]: 2025-12-06 08:27:11.49065698 +0000 UTC m=+0.266354870 container create d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com) Dec 6 03:27:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. 
Dec 6 03:27:11 localhost podman[58145]: 2025-12-06 08:27:11.519401706 +0000 UTC m=+0.317386681 container start 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=logrotate_crond, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z) Dec 6 03:27:11 localhost systemd[1]: Started libpod-conmon-b78c3fa31dbb037cf9144a6b55b2dc34daf0992a599f5b5f3853dc2843a4aab7.scope. Dec 6 03:27:11 localhost python3[57998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Dec 6 03:27:11 localhost systemd[1]: Started libpod-conmon-d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.scope. Dec 6 03:27:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. 
Dec 6 03:27:11 localhost podman[58166]: 2025-12-06 08:27:11.541182035 +0000 UTC m=+0.309287200 container start 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Dec 6 03:27:11 localhost python3[57998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=52b849225b4338d04445dda705a9a8bc --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Dec 6 03:27:11 localhost systemd[1]: Started libcrun container. Dec 6 03:27:11 localhost systemd[1]: Started libcrun container. 
Dec 6 03:27:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d969762fb0332c5c36abf270d6236af27b60ab1864e2e74a696d11379e3dcdcb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:11 localhost podman[58196]: 2025-12-06 08:27:11.579699664 +0000 UTC m=+0.316567795 container init b78c3fa31dbb037cf9144a6b55b2dc34daf0992a599f5b5f3853dc2843a4aab7 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-18T23:34:05Z, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, container_name=configure_cms_options, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4) Dec 6 03:27:11 localhost podman[58196]: 2025-12-06 08:27:11.595545198 +0000 UTC m=+0.332413329 container start b78c3fa31dbb037cf9144a6b55b2dc34daf0992a599f5b5f3853dc2843a4aab7 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, container_name=configure_cms_options, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com) Dec 6 03:27:11 localhost podman[58196]: 2025-12-06 08:27:11.595725084 +0000 UTC m=+0.332593215 container attach b78c3fa31dbb037cf9144a6b55b2dc34daf0992a599f5b5f3853dc2843a4aab7 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=configure_cms_options, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12) Dec 6 03:27:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:27:11 localhost podman[58163]: 2025-12-06 08:27:11.655573299 +0000 UTC m=+0.431271199 container init d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vcs-type=git, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:27:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:27:11 localhost podman[58163]: 2025-12-06 08:27:11.683693005 +0000 UTC m=+0.459390905 container start d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git) Dec 6 03:27:11 localhost ovs-vsctl[58356]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options Dec 6 03:27:11 localhost systemd[1]: libpod-b78c3fa31dbb037cf9144a6b55b2dc34daf0992a599f5b5f3853dc2843a4aab7.scope: Deactivated successfully. 
Dec 6 03:27:11 localhost python3[57998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=f81b1d391c9b63868054d7733e636be7 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 6 03:27:11 localhost podman[58196]: 2025-12-06 08:27:11.743119027 +0000 UTC m=+0.479987178 container died b78c3fa31dbb037cf9144a6b55b2dc34daf0992a599f5b5f3853dc2843a4aab7 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step4, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, container_name=configure_cms_options, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Dec 6 03:27:11 localhost podman[58368]: 2025-12-06 08:27:11.785770206 +0000 UTC m=+0.074868195 container cleanup b78c3fa31dbb037cf9144a6b55b2dc34daf0992a599f5b5f3853dc2843a4aab7 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=configure_cms_options, batch=17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, distribution-scope=public) Dec 6 03:27:11 localhost systemd[1]: libpod-conmon-b78c3fa31dbb037cf9144a6b55b2dc34daf0992a599f5b5f3853dc2843a4aab7.scope: Deactivated successfully. Dec 6 03:27:11 localhost python3[57998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1765007760 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi Dec 6 03:27:11 localhost podman[58265]: 2025-12-06 08:27:11.745606054 +0000 UTC m=+0.218415626 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:32Z) Dec 6 03:27:11 localhost podman[58345]: 2025-12-06 08:27:11.811869539 +0000 UTC m=+0.130518428 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 6 03:27:11 localhost podman[58265]: 2025-12-06 08:27:11.881151538 +0000 UTC m=+0.353961090 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp17/openstack-cron, vcs-type=git) Dec 6 03:27:11 localhost systemd[1]: 
23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:27:11 localhost podman[58278]: 2025-12-06 08:27:11.691407845 +0000 UTC m=+0.135511983 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:12:45Z, version=17.1.12, vcs-type=git, 
batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container) Dec 6 03:27:11 localhost podman[58233]: 2025-12-06 08:27:11.924945333 +0000 UTC m=+0.476511669 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc.) 
Dec 6 03:27:11 localhost podman[58233]: 2025-12-06 08:27:11.9431509 +0000 UTC m=+0.494717216 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, release=1761123044, managed_by=tripleo_ansible) Dec 6 03:27:11 localhost podman[58233]: unhealthy Dec 6 03:27:11 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:27:11 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Failed with result 'exit-code'. 
Dec 6 03:27:12 localhost podman[58278]: 2025-12-06 08:27:12.027276711 +0000 UTC m=+0.471380889 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z) Dec 6 03:27:12 localhost podman[58278]: unhealthy Dec 6 03:27:12 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:27:12 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Failed with result 'exit-code'. Dec 6 03:27:12 localhost podman[58479]: 2025-12-06 08:27:12.133498411 +0000 UTC m=+0.096816807 container create 0c8f28deec60d5ede8a0624b3f40a8a172780b5b4daa1f82143dd32c1fe882a6 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, version=17.1.12, container_name=setup_ovs_manager, io.openshift.expose-services=, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': 
{'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:27:12 localhost podman[58345]: 2025-12-06 08:27:12.145140943 +0000 UTC m=+0.463789882 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, 
name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, container_name=nova_migration_target, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc.) Dec 6 03:27:12 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:27:12 localhost systemd[1]: Started libpod-conmon-0c8f28deec60d5ede8a0624b3f40a8a172780b5b4daa1f82143dd32c1fe882a6.scope. Dec 6 03:27:12 localhost podman[58479]: 2025-12-06 08:27:12.087471017 +0000 UTC m=+0.050789383 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Dec 6 03:27:12 localhost systemd[1]: Started libcrun container. Dec 6 03:27:12 localhost podman[58479]: 2025-12-06 08:27:12.210955985 +0000 UTC m=+0.174274351 container init 0c8f28deec60d5ede8a0624b3f40a8a172780b5b4daa1f82143dd32c1fe882a6 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, tcib_managed=true, 
io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, distribution-scope=public, container_name=setup_ovs_manager, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:27:12 localhost podman[58479]: 2025-12-06 08:27:12.222878846 +0000 UTC m=+0.186197232 container start 0c8f28deec60d5ede8a0624b3f40a8a172780b5b4daa1f82143dd32c1fe882a6 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:14:25Z) Dec 6 03:27:12 localhost podman[58479]: 2025-12-06 08:27:12.223190676 +0000 UTC m=+0.186509052 container attach 0c8f28deec60d5ede8a0624b3f40a8a172780b5b4daa1f82143dd32c1fe882a6 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true) Dec 6 03:27:12 localhost kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure Dec 6 03:27:14 localhost ovs-vsctl[58638]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager Dec 6 03:27:15 localhost systemd[1]: libpod-0c8f28deec60d5ede8a0624b3f40a8a172780b5b4daa1f82143dd32c1fe882a6.scope: Deactivated successfully. Dec 6 03:27:15 localhost systemd[1]: libpod-0c8f28deec60d5ede8a0624b3f40a8a172780b5b4daa1f82143dd32c1fe882a6.scope: Consumed 2.954s CPU time. 
Dec 6 03:27:15 localhost podman[58479]: 2025-12-06 08:27:15.186191794 +0000 UTC m=+3.149510210 container died 0c8f28deec60d5ede8a0624b3f40a8a172780b5b4daa1f82143dd32c1fe882a6 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, container_name=setup_ovs_manager, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public) Dec 6 03:27:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0c8f28deec60d5ede8a0624b3f40a8a172780b5b4daa1f82143dd32c1fe882a6-userdata-shm.mount: Deactivated successfully. Dec 6 03:27:15 localhost systemd[1]: var-lib-containers-storage-overlay-48726255e62776d5b742d7b54e1e555d0f1c3526c4c6093c52d86746c93a3dd7-merged.mount: Deactivated successfully. 
Dec 6 03:27:15 localhost podman[58639]: 2025-12-06 08:27:15.278965545 +0000 UTC m=+0.080182100 container cleanup 0c8f28deec60d5ede8a0624b3f40a8a172780b5b4daa1f82143dd32c1fe882a6 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z']}, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=setup_ovs_manager, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Dec 6 03:27:15 localhost systemd[1]: libpod-conmon-0c8f28deec60d5ede8a0624b3f40a8a172780b5b4daa1f82143dd32c1fe882a6.scope: Deactivated successfully. Dec 6 03:27:15 localhost python3[57998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1765007760 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765007760'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata Dec 6 03:27:15 localhost podman[58750]: 2025-12-06 08:27:15.749733354 +0000 UTC m=+0.083375629 container create 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, 
config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, build-date=2025-11-18T23:34:05Z, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:27:15 localhost podman[58749]: 2025-12-06 08:27:15.777423297 +0000 UTC m=+0.113079814 container 
create 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 6 03:27:15 localhost podman[58750]: 2025-12-06 08:27:15.703006358 +0000 UTC m=+0.036648643 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Dec 6 03:27:15 localhost systemd[1]: Started libpod-conmon-2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.scope. Dec 6 03:27:15 localhost podman[58749]: 2025-12-06 08:27:15.708701805 +0000 UTC m=+0.044358352 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Dec 6 03:27:15 localhost systemd[1]: Started libpod-conmon-2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.scope. Dec 6 03:27:15 localhost systemd[1]: Started libcrun container. 
Dec 6 03:27:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4e58f94b8958f513b2dd393cee5d54d098de4336f81d16f69b3743ebfd6afda/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4e58f94b8958f513b2dd393cee5d54d098de4336f81d16f69b3743ebfd6afda/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4e58f94b8958f513b2dd393cee5d54d098de4336f81d16f69b3743ebfd6afda/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4e58f94b8958f513b2dd393cee5d54d098de4336f81d16f69b3743ebfd6afda/merged/etc/pki/tls/private/ovn_controller.key supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4e58f94b8958f513b2dd393cee5d54d098de4336f81d16f69b3743ebfd6afda/merged/etc/pki/tls/certs/ovn_controller.crt supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:15 localhost systemd[1]: Started libcrun container. 
Dec 6 03:27:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b249cf7e6e009bf19a7258344bcf98894c9eab8ad3921e68b5bede5938188687/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b249cf7e6e009bf19a7258344bcf98894c9eab8ad3921e68b5bede5938188687/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b249cf7e6e009bf19a7258344bcf98894c9eab8ad3921e68b5bede5938188687/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b249cf7e6e009bf19a7258344bcf98894c9eab8ad3921e68b5bede5938188687/merged/etc/pki/tls/certs/ovn_metadata.crt supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b249cf7e6e009bf19a7258344bcf98894c9eab8ad3921e68b5bede5938188687/merged/etc/pki/tls/private/ovn_metadata.key supports timestamps until 2038 (0x7fffffff) Dec 6 03:27:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:27:15 localhost podman[58750]: 2025-12-06 08:27:15.854965794 +0000 UTC m=+0.188607999 container init 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T23:34:05Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller) Dec 6 03:27:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:27:15 localhost podman[58749]: 2025-12-06 08:27:15.873345436 +0000 UTC m=+0.209002013 container init 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:27:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:27:15 localhost podman[58750]: 2025-12-06 08:27:15.906019634 +0000 UTC m=+0.239661859 container start 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 
17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=) Dec 6 03:27:15 localhost python3[57998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume 
/var/log/containers/openvswitch:/var/log/ovn:z --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt --volume /etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Dec 6 03:27:15 localhost systemd-logind[760]: Existing logind session ID 15 used by new audit session, ignoring. Dec 6 03:27:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:27:15 localhost podman[58749]: 2025-12-06 08:27:15.923146498 +0000 UTC m=+0.258803035 container start 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, 
version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, maintainer=OpenStack TripleO Team) Dec 6 03:27:15 localhost systemd[1]: Created slice User Slice of UID 0. Dec 6 03:27:15 localhost systemd[1]: Starting User Runtime Directory /run/user/0... 
Dec 6 03:27:15 localhost python3[57998]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=78ca993e795bb2768fe880e03926b595 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', 
'/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt --volume /etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Dec 6 03:27:15 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Dec 6 03:27:15 localhost systemd[1]: Starting User Manager for UID 0... 
Dec 6 03:27:16 localhost podman[58792]: 2025-12-06 08:27:16.068002682 +0000 UTC m=+0.146889259 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, distribution-scope=public, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, tcib_managed=true, version=17.1.12) Dec 6 03:27:16 localhost podman[58796]: 2025-12-06 08:27:16.045306785 +0000 UTC m=+0.104930642 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:27:16 localhost podman[58792]: 2025-12-06 08:27:16.110260478 +0000 UTC m=+0.189146965 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
distribution-scope=public) Dec 6 03:27:16 localhost systemd[58819]: Queued start job for default target Main User Target. Dec 6 03:27:16 localhost systemd[58819]: Created slice User Application Slice. Dec 6 03:27:16 localhost systemd[58819]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Dec 6 03:27:16 localhost systemd[58819]: Started Daily Cleanup of User's Temporary Directories. Dec 6 03:27:16 localhost systemd[58819]: Reached target Paths. Dec 6 03:27:16 localhost systemd[58819]: Reached target Timers. Dec 6 03:27:16 localhost systemd[58819]: Starting D-Bus User Message Bus Socket... Dec 6 03:27:16 localhost podman[58792]: unhealthy Dec 6 03:27:16 localhost systemd[58819]: Starting Create User's Volatile Files and Directories... Dec 6 03:27:16 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:27:16 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 03:27:16 localhost systemd[58819]: Listening on D-Bus User Message Bus Socket. Dec 6 03:27:16 localhost systemd[58819]: Reached target Sockets. Dec 6 03:27:16 localhost systemd[58819]: Finished Create User's Volatile Files and Directories. Dec 6 03:27:16 localhost systemd[58819]: Reached target Basic System. Dec 6 03:27:16 localhost systemd[58819]: Reached target Main User Target. Dec 6 03:27:16 localhost systemd[58819]: Startup finished in 130ms. Dec 6 03:27:16 localhost systemd[1]: Started User Manager for UID 0. Dec 6 03:27:16 localhost systemd[1]: Started Session c9 of User root. 
Dec 6 03:27:16 localhost podman[58796]: 2025-12-06 08:27:16.180747705 +0000 UTC m=+0.240371612 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044) Dec 6 03:27:16 localhost podman[58796]: unhealthy Dec 6 03:27:16 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:27:16 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 03:27:16 localhost systemd[1]: session-c9.scope: Deactivated successfully. 
Dec 6 03:27:16 localhost kernel: device br-int entered promiscuous mode Dec 6 03:27:16 localhost NetworkManager[5965]: [1765009636.2840] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11) Dec 6 03:27:16 localhost systemd-udevd[58900]: Network interface NamePolicy= disabled on kernel command line. Dec 6 03:27:16 localhost kernel: device genev_sys_6081 entered promiscuous mode Dec 6 03:27:16 localhost NetworkManager[5965]: [1765009636.3324] device (genev_sys_6081): carrier: link connected Dec 6 03:27:16 localhost NetworkManager[5965]: [1765009636.3330] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12) Dec 6 03:27:16 localhost python3[58922]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:16 localhost python3[58938]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:17 localhost python3[58954]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None 
seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:17 localhost python3[58970]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:27:17 localhost python3[58986]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:17 localhost podman[58990]: 2025-12-06 08:27:17.754620447 +0000 UTC m=+0.060879098 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:27:17 
localhost podman[58990]: 2025-12-06 08:27:17.763039699 +0000 UTC m=+0.069298380 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:27:17 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:27:17 localhost python3[59025]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:18 localhost python3[59041]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:27:18 localhost python3[59059]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:27:18 localhost python3[59077]: ansible-stat Invoked with 
path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:27:19 localhost python3[59093]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:27:19 localhost python3[59109]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:27:19 localhost python3[59125]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:27:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:27:20 localhost systemd[1]: tmp-crun.AeLFqA.mount: Deactivated successfully. 
Dec 6 03:27:20 localhost podman[59186]: 2025-12-06 08:27:20.14079823 +0000 UTC m=+0.101702279 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git) Dec 6 03:27:20 localhost podman[59186]: 2025-12-06 08:27:20.213270519 +0000 UTC m=+0.174174528 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, release=1761123044, container_name=iscsid, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Dec 6 03:27:20 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. 
Dec 6 03:27:20 localhost python3[59187]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009639.6008673-141517-23984358648500/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:20 localhost python3[59232]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009639.6008673-141517-23984358648500/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:21 localhost python3[59261]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009639.6008673-141517-23984358648500/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:21 localhost python3[59291]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009639.6008673-141517-23984358648500/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None 
seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:22 localhost python3[59320]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009639.6008673-141517-23984358648500/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:22 localhost python3[59349]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009639.6008673-141517-23984358648500/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:23 localhost python3[59365]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 03:27:23 localhost systemd[1]: Reloading. Dec 6 03:27:23 localhost systemd-rc-local-generator[59388]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:27:23 localhost systemd-sysv-generator[59394]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:27:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 03:27:24 localhost python3[59417]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:27:24 localhost systemd[1]: Reloading. Dec 6 03:27:24 localhost systemd-rc-local-generator[59443]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:27:24 localhost systemd-sysv-generator[59446]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:27:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:27:24 localhost systemd[1]: Starting ceilometer_agent_compute container... Dec 6 03:27:24 localhost tripleo-start-podman-container[59457]: Creating additional drop-in dependency for "ceilometer_agent_compute" (b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d) Dec 6 03:27:24 localhost systemd[1]: Reloading. Dec 6 03:27:24 localhost systemd-rc-local-generator[59517]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:27:24 localhost systemd-sysv-generator[59521]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:27:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:27:25 localhost systemd[1]: Started ceilometer_agent_compute container. 
Dec 6 03:27:25 localhost python3[59542]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:27:25 localhost systemd[1]: Reloading. Dec 6 03:27:26 localhost systemd-sysv-generator[59574]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:27:26 localhost systemd-rc-local-generator[59567]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:27:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:27:26 localhost systemd[1]: Stopping User Manager for UID 0... Dec 6 03:27:26 localhost systemd[58819]: Activating special unit Exit the Session... Dec 6 03:27:26 localhost systemd[58819]: Stopped target Main User Target. Dec 6 03:27:26 localhost systemd[58819]: Stopped target Basic System. Dec 6 03:27:26 localhost systemd[58819]: Stopped target Paths. Dec 6 03:27:26 localhost systemd[58819]: Stopped target Sockets. Dec 6 03:27:26 localhost systemd[58819]: Stopped target Timers. Dec 6 03:27:26 localhost systemd[58819]: Stopped Daily Cleanup of User's Temporary Directories. Dec 6 03:27:26 localhost systemd[58819]: Closed D-Bus User Message Bus Socket. Dec 6 03:27:26 localhost systemd[58819]: Stopped Create User's Volatile Files and Directories. Dec 6 03:27:26 localhost systemd[58819]: Removed slice User Application Slice. Dec 6 03:27:26 localhost systemd[58819]: Reached target Shutdown. Dec 6 03:27:26 localhost systemd[58819]: Finished Exit the Session. Dec 6 03:27:26 localhost systemd[58819]: Reached target Exit the Session. 
Dec 6 03:27:26 localhost systemd[1]: Starting ceilometer_agent_ipmi container... Dec 6 03:27:26 localhost systemd[1]: user@0.service: Deactivated successfully. Dec 6 03:27:26 localhost systemd[1]: Stopped User Manager for UID 0. Dec 6 03:27:26 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Dec 6 03:27:26 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Dec 6 03:27:26 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Dec 6 03:27:26 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Dec 6 03:27:26 localhost systemd[1]: Removed slice User Slice of UID 0. Dec 6 03:27:26 localhost systemd[1]: Started ceilometer_agent_ipmi container. Dec 6 03:27:26 localhost python3[59612]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:27:27 localhost systemd[1]: Reloading. Dec 6 03:27:28 localhost systemd-sysv-generator[59642]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:27:28 localhost systemd-rc-local-generator[59639]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:27:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:27:28 localhost systemd[1]: Starting logrotate_crond container... Dec 6 03:27:28 localhost systemd[1]: Started logrotate_crond container. 
Dec 6 03:27:29 localhost python3[59680]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:27:29 localhost systemd[1]: Reloading. Dec 6 03:27:29 localhost systemd-rc-local-generator[59708]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:27:29 localhost systemd-sysv-generator[59711]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:27:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:27:29 localhost systemd[1]: Starting nova_migration_target container... Dec 6 03:27:29 localhost systemd[1]: Started nova_migration_target container. Dec 6 03:27:30 localhost python3[59748]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:27:30 localhost systemd[1]: Reloading. Dec 6 03:27:30 localhost systemd-sysv-generator[59777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:27:30 localhost systemd-rc-local-generator[59771]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:27:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:27:30 localhost systemd[1]: Starting ovn_controller container... 
Dec 6 03:27:30 localhost tripleo-start-podman-container[59787]: Creating additional drop-in dependency for "ovn_controller" (2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120) Dec 6 03:27:30 localhost systemd[1]: Reloading. Dec 6 03:27:30 localhost systemd-sysv-generator[59847]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:27:30 localhost systemd-rc-local-generator[59844]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:27:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:27:30 localhost systemd[1]: Started ovn_controller container. Dec 6 03:27:31 localhost python3[59872]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:27:31 localhost systemd[1]: Reloading. Dec 6 03:27:31 localhost systemd-rc-local-generator[59901]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:27:31 localhost systemd-sysv-generator[59906]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:27:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:27:32 localhost systemd[1]: Starting dnf makecache... Dec 6 03:27:32 localhost systemd[1]: Starting ovn_metadata_agent container... 
Dec 6 03:27:32 localhost systemd[1]: Started ovn_metadata_agent container. Dec 6 03:27:32 localhost dnf[59912]: Updating Subscription Management repositories. Dec 6 03:27:32 localhost python3[59954]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:34 localhost dnf[59912]: Metadata cache refreshed recently. Dec 6 03:27:34 localhost python3[60075]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005548798 step=4 update_config_hash_only=False Dec 6 03:27:34 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Dec 6 03:27:34 localhost systemd[1]: Finished dnf makecache. Dec 6 03:27:34 localhost systemd[1]: dnf-makecache.service: Consumed 2.102s CPU time. 
Dec 6 03:27:34 localhost python3[60093]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:27:35 localhost python3[60109]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True Dec 6 03:27:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:27:40 localhost podman[60110]: 2025-12-06 08:27:40.589757933 +0000 UTC m=+0.116531503 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12) Dec 6 03:27:40 localhost podman[60110]: 2025-12-06 08:27:40.81807517 +0000 UTC m=+0.344848700 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, 
managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team) Dec 6 03:27:40 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:27:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:27:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:27:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:27:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:27:42 localhost podman[60142]: 2025-12-06 08:27:42.577520809 +0000 UTC m=+0.096912555 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute) Dec 6 03:27:42 localhost podman[60141]: 2025-12-06 08:27:42.55078801 +0000 UTC m=+0.080983461 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Dec 6 03:27:42 localhost podman[60140]: 2025-12-06 08:27:42.609815799 +0000 UTC m=+0.139343190 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4) Dec 6 03:27:42 localhost podman[60139]: 2025-12-06 08:27:42.668657943 +0000 UTC m=+0.200597228 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, release=1761123044, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:27:42 localhost podman[60141]: 2025-12-06 08:27:42.686899209 +0000 UTC m=+0.217094660 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container) Dec 6 03:27:42 localhost podman[60139]: 2025-12-06 08:27:42.702805381 +0000 UTC m=+0.234744636 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
description=Red Hat OpenStack Platform 17.1 cron) Dec 6 03:27:42 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:27:42 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:27:42 localhost podman[60140]: 2025-12-06 08:27:42.742114209 +0000 UTC m=+0.271641640 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, version=17.1.12, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com) Dec 6 03:27:42 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:27:42 localhost podman[60142]: 2025-12-06 08:27:42.957619698 +0000 UTC m=+0.477011454 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:27:42 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:27:43 localhost systemd[1]: tmp-crun.8k6hiC.mount: Deactivated successfully. Dec 6 03:27:45 localhost snmpd[56894]: empty variable list in _query Dec 6 03:27:45 localhost snmpd[56894]: empty variable list in _query Dec 6 03:27:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:27:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:27:46 localhost podman[60234]: 2025-12-06 08:27:46.563765513 +0000 UTC m=+0.092806719 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=ovn_metadata_agent, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:27:46 localhost podman[60235]: 2025-12-06 08:27:46.622008067 +0000 UTC m=+0.149529365 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.) 
Dec 6 03:27:46 localhost podman[60234]: 2025-12-06 08:27:46.641351797 +0000 UTC m=+0.170392933 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:27:46 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:27:46 localhost podman[60235]: 2025-12-06 08:27:46.695675671 +0000 UTC m=+0.223196969 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, release=1761123044, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', 
'/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public) Dec 6 03:27:46 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:27:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:27:48 localhost podman[60281]: 2025-12-06 08:27:48.534666902 +0000 UTC m=+0.070492726 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:27:48 localhost podman[60281]: 2025-12-06 08:27:48.541083381 +0000 UTC m=+0.076909135 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, tcib_managed=true, 
managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com) Dec 6 03:27:48 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:27:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:27:50 localhost podman[60302]: 2025-12-06 08:27:50.540745172 +0000 UTC m=+0.076202083 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 6 03:27:50 localhost podman[60302]: 2025-12-06 08:27:50.549800083 +0000 UTC m=+0.085256983 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, com.redhat.component=openstack-iscsid-container, 
io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 6 03:27:50 localhost systemd[1]: 
3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:28:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:28:11 localhost podman[60321]: 2025-12-06 08:28:11.531911489 +0000 UTC m=+0.068982908 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, 
name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Dec 6 03:28:11 localhost podman[60321]: 2025-12-06 08:28:11.752386272 +0000 UTC m=+0.289457631 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, 
build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, 
url=https://www.redhat.com) Dec 6 03:28:11 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:28:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:28:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:28:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:28:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:28:13 localhost podman[60348]: 2025-12-06 08:28:13.55253798 +0000 UTC m=+0.086431040 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-cron, 
io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4) Dec 6 03:28:13 localhost podman[60348]: 2025-12-06 08:28:13.589163015 +0000 UTC m=+0.123056025 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, version=17.1.12, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc.) 
Dec 6 03:28:13 localhost podman[60350]: 2025-12-06 08:28:13.603954833 +0000 UTC m=+0.130786674 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z) Dec 6 03:28:13 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:28:13 localhost podman[60357]: 2025-12-06 08:28:13.664393107 +0000 UTC m=+0.190374471 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container) Dec 6 03:28:13 localhost podman[60350]: 2025-12-06 08:28:13.665223142 +0000 UTC m=+0.192054983 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 6 03:28:13 localhost podman[60349]: 2025-12-06 08:28:13.525688648 +0000 UTC m=+0.060658172 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, tcib_managed=true, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:28:13 localhost podman[60349]: 2025-12-06 08:28:13.710277009 +0000 UTC m=+0.245246563 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:28:13 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:28:13 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:28:14 localhost podman[60357]: 2025-12-06 08:28:14.053370133 +0000 UTC m=+0.579351457 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 
nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container) Dec 6 03:28:14 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:28:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:28:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:28:17 localhost podman[60441]: 2025-12-06 08:28:17.546336892 +0000 UTC m=+0.081505257 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true) Dec 6 03:28:17 localhost podman[60442]: 2025-12-06 08:28:17.601368908 +0000 UTC m=+0.132600451 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 
'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, architecture=x86_64, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:28:17 localhost systemd[1]: tmp-crun.RRSfaf.mount: Deactivated successfully. 
Dec 6 03:28:17 localhost podman[60441]: 2025-12-06 08:28:17.612456651 +0000 UTC m=+0.147624986 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent) Dec 6 03:28:17 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:28:17 localhost podman[60442]: 2025-12-06 08:28:17.649314134 +0000 UTC m=+0.180545677 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, container_name=ovn_controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z) Dec 6 03:28:17 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:28:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:28:19 localhost podman[60489]: 2025-12-06 08:28:19.552551846 +0000 UTC m=+0.085855261 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, release=1761123044, container_name=collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Dec 6 03:28:19 localhost podman[60489]: 2025-12-06 08:28:19.565220499 +0000 UTC m=+0.098523904 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Dec 6 03:28:19 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:28:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:28:21 localhost systemd[1]: tmp-crun.26AtBi.mount: Deactivated successfully. Dec 6 03:28:21 localhost podman[60510]: 2025-12-06 08:28:21.557919125 +0000 UTC m=+0.092583971 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:28:21 localhost podman[60510]: 2025-12-06 08:28:21.595387737 +0000 UTC m=+0.130052563 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., 
managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, container_name=iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, tcib_managed=true, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12) Dec 6 03:28:21 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:28:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:28:42 localhost podman[60530]: 2025-12-06 08:28:42.544109138 +0000 UTC m=+0.079315750 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.buildah.version=1.41.4) Dec 6 03:28:42 localhost podman[60530]: 2025-12-06 08:28:42.715228792 +0000 UTC m=+0.250435384 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z) Dec 6 03:28:42 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:28:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:28:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:28:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:28:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:28:44 localhost podman[60562]: 2025-12-06 08:28:44.561944421 +0000 UTC m=+0.083311233 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 
nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:28:44 localhost podman[60560]: 2025-12-06 08:28:44.611449246 +0000 UTC m=+0.137932107 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, 
version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:28:44 localhost podman[60560]: 2025-12-06 08:28:44.640189097 +0000 UTC m=+0.166671948 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:28:44 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:28:44 localhost podman[60559]: 2025-12-06 08:28:44.658621978 +0000 UTC m=+0.189381711 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com) Dec 6 03:28:44 localhost podman[60559]: 2025-12-06 08:28:44.669076572 +0000 UTC m=+0.199836365 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Dec 6 03:28:44 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:28:44 localhost podman[60561]: 2025-12-06 08:28:44.72676906 +0000 UTC m=+0.250172405 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:28:44 localhost podman[60561]: 2025-12-06 08:28:44.752220959 +0000 UTC m=+0.275624294 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:28:44 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:28:44 localhost podman[60562]: 2025-12-06 08:28:44.963353223 +0000 UTC m=+0.484720075 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:28:44 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:28:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:28:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:28:48 localhost podman[60654]: 2025-12-06 08:28:48.568570913 +0000 UTC m=+0.101558319 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 6 03:28:48 localhost podman[60655]: 2025-12-06 08:28:48.605971152 +0000 UTC m=+0.137411740 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:28:48 localhost podman[60654]: 2025-12-06 08:28:48.620180213 +0000 UTC m=+0.153167639 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 6 03:28:48 localhost podman[60655]: 2025-12-06 08:28:48.632283768 +0000 UTC m=+0.163724376 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, 
architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git) Dec 6 03:28:48 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:28:48 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:28:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:28:50 localhost podman[60702]: 2025-12-06 08:28:50.55100533 +0000 UTC m=+0.084761179 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=collectd, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, release=1761123044, io.openshift.expose-services=) Dec 6 03:28:50 localhost podman[60702]: 2025-12-06 08:28:50.566397736 +0000 UTC m=+0.100153595 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:28:50 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:28:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:28:52 localhost podman[60722]: 2025-12-06 08:28:52.553040014 +0000 UTC m=+0.085860492 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, release=1761123044, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git) Dec 6 03:28:52 localhost podman[60722]: 2025-12-06 08:28:52.58968535 +0000 UTC m=+0.122505878 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:28:52 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:29:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:29:13 localhost podman[60741]: 2025-12-06 08:29:13.551156244 +0000 UTC m=+0.085954795 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, tcib_managed=true, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd) Dec 6 03:29:13 localhost podman[60741]: 2025-12-06 08:29:13.765345713 +0000 UTC m=+0.300144284 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, 
distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:29:13 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:29:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:29:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:29:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:29:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:29:15 localhost systemd[1]: tmp-crun.kXDYdB.mount: Deactivated successfully. Dec 6 03:29:15 localhost systemd[1]: tmp-crun.2HhKZb.mount: Deactivated successfully. Dec 6 03:29:15 localhost podman[60774]: 2025-12-06 08:29:15.562526189 +0000 UTC m=+0.083691035 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:29:15 localhost podman[60771]: 2025-12-06 08:29:15.546046728 +0000 UTC m=+0.078358560 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.openshift.expose-services=, 
container_name=logrotate_crond, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.component=openstack-cron-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
architecture=x86_64, distribution-scope=public) Dec 6 03:29:15 localhost podman[60773]: 2025-12-06 08:29:15.609504355 +0000 UTC m=+0.135706768 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, container_name=ceilometer_agent_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=) Dec 6 03:29:15 localhost podman[60772]: 2025-12-06 08:29:15.657132981 +0000 UTC m=+0.187326467 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container) Dec 6 03:29:15 localhost podman[60771]: 2025-12-06 08:29:15.676653847 +0000 UTC m=+0.208965709 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, io.openshift.expose-services=, container_name=logrotate_crond, config_id=tripleo_step4, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git) Dec 6 03:29:15 localhost podman[60773]: 2025-12-06 08:29:15.683627092 +0000 UTC m=+0.209829545 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:29:15 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:29:15 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:29:15 localhost podman[60772]: 2025-12-06 08:29:15.734636263 +0000 UTC m=+0.264829759 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi) Dec 6 03:29:15 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:29:15 localhost podman[60774]: 2025-12-06 08:29:15.908272225 +0000 UTC m=+0.429437121 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, container_name=nova_migration_target, version=17.1.12) Dec 6 03:29:15 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:29:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:29:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:29:19 localhost podman[60868]: 2025-12-06 08:29:19.539340845 +0000 UTC m=+0.071843478 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, tcib_managed=true, release=1761123044, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack 
TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git) Dec 6 03:29:19 localhost podman[60868]: 2025-12-06 08:29:19.563806493 +0000 UTC m=+0.096309126 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044) Dec 6 03:29:19 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:29:19 localhost systemd[1]: tmp-crun.asDxTr.mount: Deactivated successfully. 
Dec 6 03:29:19 localhost podman[60867]: 2025-12-06 08:29:19.660408528 +0000 UTC m=+0.194634655 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:29:19 localhost podman[60867]: 2025-12-06 08:29:19.699884171 +0000 UTC m=+0.234110288 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ovn_metadata_agent) Dec 6 03:29:19 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:29:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:29:21 localhost podman[60916]: 2025-12-06 08:29:21.524029663 +0000 UTC m=+0.059899837 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:29:21 localhost podman[60916]: 2025-12-06 08:29:21.534281201 +0000 UTC m=+0.070151405 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-collectd) Dec 6 03:29:21 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:29:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:29:23 localhost podman[60936]: 2025-12-06 08:29:23.5828888 +0000 UTC m=+0.080716093 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, container_name=iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Dec 6 03:29:23 localhost podman[60936]: 2025-12-06 08:29:23.596238484 +0000 UTC m=+0.094065777 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, io.buildah.version=1.41.4) Dec 6 03:29:23 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:29:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:29:44 localhost podman[60954]: 2025-12-06 08:29:44.518769909 +0000 UTC m=+0.059149984 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:29:44 localhost podman[60954]: 2025-12-06 08:29:44.720982637 +0000 UTC m=+0.261362712 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, 
vcs-type=git, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', 
'/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, config_id=tripleo_step1) Dec 6 03:29:44 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:29:45 localhost systemd[1]: session-14.scope: Deactivated successfully. Dec 6 03:29:45 localhost systemd[1]: session-14.scope: Consumed 3.024s CPU time. Dec 6 03:29:45 localhost systemd-logind[760]: Session 14 logged out. Waiting for processes to exit. Dec 6 03:29:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:29:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:29:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:29:45 localhost systemd-logind[760]: Removed session 14. 
Dec 6 03:29:45 localhost podman[60984]: 2025-12-06 08:29:45.952770571 +0000 UTC m=+0.072415676 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp17/openstack-cron, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-cron-container) Dec 6 03:29:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:29:45 localhost podman[60984]: 2025-12-06 08:29:45.979175064 +0000 UTC m=+0.098820149 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=logrotate_crond, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, version=17.1.12) Dec 6 03:29:46 localhost podman[60985]: 2025-12-06 08:29:46.018900332 +0000 UTC m=+0.136772552 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git) Dec 6 03:29:46 localhost 
systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:29:46 localhost podman[60986]: 2025-12-06 08:29:46.125283185 +0000 UTC m=+0.240625636 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:29:46 localhost podman[60985]: 2025-12-06 08:29:46.130157687 +0000 UTC m=+0.248029897 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi) Dec 6 03:29:46 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:29:46 localhost podman[61023]: 2025-12-06 08:29:46.181327201 +0000 UTC m=+0.200226848 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:29:46 localhost podman[60986]: 2025-12-06 08:29:46.232479494 +0000 UTC m=+0.347821945 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:11:48Z, release=1761123044, version=17.1.12) Dec 6 03:29:46 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:29:46 localhost podman[61023]: 2025-12-06 08:29:46.548482816 +0000 UTC m=+0.567382423 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container) Dec 6 03:29:46 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:29:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:29:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:29:50 localhost podman[61078]: 2025-12-06 08:29:50.531051995 +0000 UTC m=+0.065660966 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, container_name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1) Dec 6 03:29:50 localhost podman[61078]: 2025-12-06 08:29:50.552233464 +0000 UTC m=+0.086842365 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, container_name=ovn_controller, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller) Dec 6 03:29:50 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:29:50 localhost podman[61077]: 2025-12-06 08:29:50.6403604 +0000 UTC m=+0.178196792 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:29:50 localhost podman[61077]: 2025-12-06 08:29:50.704443515 +0000 UTC m=+0.242279917 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:29:50 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:29:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:29:52 localhost podman[61125]: 2025-12-06 08:29:52.527595987 +0000 UTC m=+0.054262350 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, container_name=collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4) Dec 6 03:29:52 localhost podman[61125]: 2025-12-06 08:29:52.562236987 +0000 UTC m=+0.088903300 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, 
build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step3) Dec 6 03:29:52 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:29:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:29:54 localhost podman[61146]: 2025-12-06 08:29:54.555680423 +0000 UTC m=+0.084344267 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:29:54 localhost podman[61146]: 2025-12-06 08:29:54.59345554 +0000 UTC m=+0.122119354 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Dec 6 03:29:54 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:29:58 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:29:58 localhost recover_tripleo_nova_virtqemud[61214]: 51836 Dec 6 03:29:58 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:29:58 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 6 03:29:58 localhost python3[61213]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:29:59 localhost python3[61259]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009798.5473392-146898-159714740323399/source _original_basename=tmpsrua22x6 follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:00 localhost python3[61289]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:30:02 localhost ansible-async_wrapper.py[61461]: Invoked with 474232680127 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009801.614614-147175-122379869506776/AnsiballZ_command.py _ Dec 6 03:30:02 localhost ansible-async_wrapper.py[61464]: Starting module and watcher Dec 6 03:30:02 localhost ansible-async_wrapper.py[61464]: Start watching 61465 (3600) Dec 6 03:30:02 localhost ansible-async_wrapper.py[61465]: Start module (61465) Dec 6 03:30:02 localhost ansible-async_wrapper.py[61461]: Return async_wrapper task started. Dec 6 03:30:02 localhost python3[61485]: ansible-ansible.legacy.async_status Invoked with jid=474232680127.61461 mode=status _async_dir=/tmp/.ansible_async Dec 6 03:30:07 localhost ansible-async_wrapper.py[61464]: 61465 still running (3600) Dec 6 03:30:07 localhost puppet-user[61483]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Dec 6 03:30:07 localhost puppet-user[61483]: (file: /etc/puppet/hiera.yaml) Dec 6 03:30:07 localhost puppet-user[61483]: Warning: Undefined variable '::deploy_config_name'; Dec 6 03:30:07 localhost puppet-user[61483]: (file & line not available) Dec 6 03:30:08 localhost puppet-user[61483]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 6 03:30:08 localhost puppet-user[61483]: (file & line not available) Dec 6 03:30:08 localhost puppet-user[61483]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Dec 6 03:30:08 localhost puppet-user[61483]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 5] Dec 6 03:30:08 localhost puppet-user[61483]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 6 03:30:08 localhost puppet-user[61483]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 6 03:30:08 localhost puppet-user[61483]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 5] Dec 6 03:30:08 localhost puppet-user[61483]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 6 03:30:08 localhost puppet-user[61483]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 6 03:30:08 localhost puppet-user[61483]: with Stdlib::Compat::Array. 
There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 5] Dec 6 03:30:08 localhost puppet-user[61483]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 6 03:30:08 localhost puppet-user[61483]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 6 03:30:08 localhost puppet-user[61483]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 5] Dec 6 03:30:08 localhost puppet-user[61483]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 6 03:30:08 localhost puppet-user[61483]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 6 03:30:08 localhost puppet-user[61483]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 5] Dec 6 03:30:08 localhost puppet-user[61483]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 6 03:30:08 localhost puppet-user[61483]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 6 03:30:08 localhost puppet-user[61483]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 5] Dec 6 03:30:08 localhost puppet-user[61483]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 6 03:30:08 localhost puppet-user[61483]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Dec 6 03:30:08 localhost puppet-user[61483]: Notice: Compiled catalog for np0005548798.ooo.test in environment production in 0.30 seconds Dec 6 03:30:08 localhost puppet-user[61483]: Notice: Applied catalog in 0.29 seconds Dec 6 03:30:08 localhost puppet-user[61483]: Application: Dec 6 03:30:08 localhost puppet-user[61483]: Initial environment: production Dec 6 03:30:08 localhost puppet-user[61483]: Converged environment: production Dec 6 03:30:08 localhost puppet-user[61483]: Run mode: user Dec 6 03:30:08 localhost puppet-user[61483]: Changes: Dec 6 03:30:08 localhost puppet-user[61483]: Events: Dec 6 03:30:08 localhost puppet-user[61483]: Resources: Dec 6 03:30:08 localhost puppet-user[61483]: Total: 19 Dec 6 03:30:08 localhost puppet-user[61483]: Time: Dec 6 03:30:08 localhost puppet-user[61483]: Filebucket: 0.00 Dec 6 03:30:08 localhost puppet-user[61483]: Schedule: 0.00 Dec 6 03:30:08 localhost puppet-user[61483]: Package: 0.00 Dec 6 03:30:08 localhost puppet-user[61483]: Exec: 0.01 Dec 6 03:30:08 localhost puppet-user[61483]: Augeas: 0.01 Dec 6 03:30:08 localhost puppet-user[61483]: File: 0.03 Dec 6 03:30:08 localhost puppet-user[61483]: Service: 0.06 Dec 6 03:30:08 localhost puppet-user[61483]: Transaction evaluation: 0.28 Dec 6 03:30:08 localhost puppet-user[61483]: Catalog application: 0.29 Dec 6 03:30:08 localhost puppet-user[61483]: Config retrieval: 0.37 Dec 6 03:30:08 localhost puppet-user[61483]: Last run: 1765009808 Dec 6 03:30:08 localhost puppet-user[61483]: Total: 0.30 Dec 6 03:30:08 localhost puppet-user[61483]: Version: Dec 6 03:30:08 localhost puppet-user[61483]: Config: 1765009807 Dec 6 03:30:08 localhost puppet-user[61483]: Puppet: 7.10.0 Dec 6 03:30:08 localhost ansible-async_wrapper.py[61465]: Module complete (61465) Dec 6 03:30:12 localhost ansible-async_wrapper.py[61464]: Done in kid B. 
Dec 6 03:30:12 localhost python3[61627]: ansible-ansible.legacy.async_status Invoked with jid=474232680127.61461 mode=status _async_dir=/tmp/.ansible_async Dec 6 03:30:13 localhost python3[61643]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 6 03:30:13 localhost python3[61659]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:30:14 localhost python3[61709]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:30:14 localhost python3[61727]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp_pmyanqu recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 6 03:30:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:30:15 localhost podman[61758]: 2025-12-06 08:30:15.066363622 +0000 UTC m=+0.098267452 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64) Dec 6 03:30:15 localhost python3[61757]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:15 localhost podman[61758]: 2025-12-06 08:30:15.288106898 +0000 UTC m=+0.320010768 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4) Dec 6 03:30:15 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:30:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:30:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:30:16 localhost systemd[1]: tmp-crun.xBR0Rd.mount: Deactivated successfully. Dec 6 03:30:16 localhost podman[61894]: 2025-12-06 08:30:16.205286494 +0000 UTC m=+0.111302937 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:30:16 localhost podman[61894]: 2025-12-06 08:30:16.240900293 +0000 UTC m=+0.146916676 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, container_name=logrotate_crond, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team) Dec 6 03:30:16 localhost python3[61893]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ 
dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Dec 6 03:30:16 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:30:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:30:16 localhost podman[61911]: 2025-12-06 08:30:16.309614754 +0000 UTC m=+0.096352623 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12) Dec 6 03:30:16 localhost podman[61911]: 2025-12-06 08:30:16.326116497 +0000 UTC m=+0.112854396 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, container_name=ceilometer_agent_ipmi, 
vcs-type=git, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container) Dec 6 03:30:16 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:30:16 localhost podman[61928]: 2025-12-06 08:30:16.389671766 +0000 UTC m=+0.112246496 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:30:16 localhost podman[61928]: 2025-12-06 08:30:16.416268595 +0000 UTC m=+0.138843305 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, distribution-scope=public, release=1761123044, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12) Dec 6 03:30:16 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:30:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:30:16 localhost podman[61985]: 2025-12-06 08:30:16.975223334 +0000 UTC m=+0.091627825 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-nova-compute-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, tcib_managed=true) Dec 6 03:30:17 localhost python3[61986]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:17 localhost systemd[1]: tmp-crun.nyc2LE.mount: Deactivated successfully. 
Dec 6 03:30:17 localhost podman[61985]: 2025-12-06 08:30:17.367915485 +0000 UTC m=+0.484319876 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, 
url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, container_name=nova_migration_target) Dec 6 03:30:17 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:30:17 localhost python3[62040]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:30:18 localhost python3[62090]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:30:18 localhost python3[62108]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:19 localhost python3[62170]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True 
get_attributes=True Dec 6 03:30:19 localhost python3[62188]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:20 localhost python3[62250]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:30:20 localhost python3[62268]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:30:21 localhost podman[62332]: 2025-12-06 08:30:21.015138641 +0000 UTC m=+0.078834786 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.expose-services=) Dec 6 03:30:21 localhost python3[62330]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:30:21 localhost podman[62332]: 2025-12-06 08:30:21.069960149 +0000 UTC m=+0.133656294 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ovn_controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z) Dec 6 03:30:21 localhost podman[62331]: 2025-12-06 08:30:21.078137074 +0000 UTC m=+0.142891052 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, container_name=ovn_metadata_agent, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:30:21 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:30:21 localhost podman[62331]: 2025-12-06 08:30:21.107488457 +0000 UTC m=+0.172242445 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, url=https://www.redhat.com) Dec 6 03:30:21 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:30:21 localhost python3[62396]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:21 localhost python3[62426]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:30:21 localhost systemd[1]: Reloading. Dec 6 03:30:21 localhost systemd-rc-local-generator[62450]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:30:22 localhost systemd-sysv-generator[62453]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:30:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:30:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:30:22 localhost systemd[1]: tmp-crun.d6sJZw.mount: Deactivated successfully. 
Dec 6 03:30:22 localhost podman[62512]: 2025-12-06 08:30:22.783986693 +0000 UTC m=+0.097508898 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.) Dec 6 03:30:22 localhost podman[62512]: 2025-12-06 08:30:22.793446128 +0000 UTC m=+0.106968353 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:30:22 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:30:22 localhost python3[62513]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:30:23 localhost python3[62550]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:23 localhost python3[62612]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 6 03:30:24 localhost python3[62630]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:24 localhost python3[62660]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:30:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:30:24 localhost systemd[1]: Reloading. 
Dec 6 03:30:24 localhost podman[62662]: 2025-12-06 08:30:24.752861065 +0000 UTC m=+0.090485370 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, version=17.1.12) Dec 6 03:30:24 localhost systemd-rc-local-generator[62705]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:30:24 localhost systemd-sysv-generator[62709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 03:30:24 localhost podman[62662]: 2025-12-06 08:30:24.770440622 +0000 UTC m=+0.108065007 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., container_name=iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:30:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:30:24 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:30:25 localhost systemd[1]: Starting Create netns directory... Dec 6 03:30:25 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 6 03:30:25 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 6 03:30:25 localhost systemd[1]: Finished Create netns directory. 
Dec 6 03:30:25 localhost python3[62736]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Dec 6 03:30:27 localhost python3[62796]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Dec 6 03:30:28 localhost podman[62833]: 2025-12-06 08:30:28.107168056 +0000 UTC m=+0.094478434 container create 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step5) Dec 6 03:30:28 localhost systemd[1]: Started libpod-conmon-3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.scope. 
Dec 6 03:30:28 localhost podman[62833]: 2025-12-06 08:30:28.062864396 +0000 UTC m=+0.050174824 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 6 03:30:28 localhost systemd[1]: Started libcrun container. Dec 6 03:30:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac17782b0a22d6b404309e1e48271f7473fb3586a06782b8c916f0bf1d9c0c8d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 6 03:30:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac17782b0a22d6b404309e1e48271f7473fb3586a06782b8c916f0bf1d9c0c8d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 6 03:30:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac17782b0a22d6b404309e1e48271f7473fb3586a06782b8c916f0bf1d9c0c8d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 03:30:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac17782b0a22d6b404309e1e48271f7473fb3586a06782b8c916f0bf1d9c0c8d/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Dec 6 03:30:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac17782b0a22d6b404309e1e48271f7473fb3586a06782b8c916f0bf1d9c0c8d/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Dec 6 03:30:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:30:28 localhost podman[62833]: 2025-12-06 08:30:28.222911501 +0000 UTC m=+0.210221919 container init 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, config_id=tripleo_step5, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:30:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:30:28 localhost podman[62833]: 2025-12-06 08:30:28.260948156 +0000 UTC m=+0.248258534 container start 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute) Dec 6 03:30:28 localhost systemd-logind[760]: Existing logind session ID 15 used by new audit session, ignoring. 
Dec 6 03:30:28 localhost python3[62796]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7 --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 6 03:30:28 localhost systemd[1]: Created slice User Slice of UID 0. 
Dec 6 03:30:28 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Dec 6 03:30:28 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Dec 6 03:30:28 localhost systemd[1]: Starting User Manager for UID 0... Dec 6 03:30:28 localhost podman[62855]: 2025-12-06 08:30:28.379292611 +0000 UTC m=+0.108050096 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, container_name=nova_compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 6 03:30:28 localhost podman[62855]: 2025-12-06 08:30:28.430337051 +0000 UTC m=+0.159094586 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 6 03:30:28 localhost podman[62855]: unhealthy Dec 6 03:30:28 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:30:28 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Failed with result 'exit-code'. Dec 6 03:30:28 localhost systemd[62874]: Queued start job for default target Main User Target. Dec 6 03:30:28 localhost systemd[62874]: Created slice User Application Slice. Dec 6 03:30:28 localhost systemd[62874]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Dec 6 03:30:28 localhost systemd[62874]: Started Daily Cleanup of User's Temporary Directories. Dec 6 03:30:28 localhost systemd[62874]: Reached target Paths. Dec 6 03:30:28 localhost systemd[62874]: Reached target Timers. Dec 6 03:30:28 localhost systemd[62874]: Starting D-Bus User Message Bus Socket... 
Dec 6 03:30:28 localhost systemd[62874]: Starting Create User's Volatile Files and Directories... Dec 6 03:30:28 localhost systemd[62874]: Finished Create User's Volatile Files and Directories. Dec 6 03:30:28 localhost systemd[62874]: Listening on D-Bus User Message Bus Socket. Dec 6 03:30:28 localhost systemd[62874]: Reached target Sockets. Dec 6 03:30:28 localhost systemd[62874]: Reached target Basic System. Dec 6 03:30:28 localhost systemd[62874]: Reached target Main User Target. Dec 6 03:30:28 localhost systemd[62874]: Startup finished in 157ms. Dec 6 03:30:28 localhost systemd[1]: Started User Manager for UID 0. Dec 6 03:30:28 localhost systemd[1]: Started Session c10 of User root. Dec 6 03:30:28 localhost systemd[1]: session-c10.scope: Deactivated successfully. Dec 6 03:30:28 localhost podman[62955]: 2025-12-06 08:30:28.768857635 +0000 UTC m=+0.114814317 container create eadfc3e0b8045d3de282056f8f84a184fc6816e8de1caba5aa2743bcb846f81a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_wait_for_compute_service, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1) Dec 6 03:30:28 localhost podman[62955]: 2025-12-06 08:30:28.690541415 +0000 UTC m=+0.036498127 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 6 03:30:28 localhost systemd[1]: Started libpod-conmon-eadfc3e0b8045d3de282056f8f84a184fc6816e8de1caba5aa2743bcb846f81a.scope. Dec 6 03:30:28 localhost systemd[1]: Started libcrun container. 
Dec 6 03:30:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/135e32b78dae218c3403324b415101180c4222350114b9184c5da39151cfc014/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Dec 6 03:30:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/135e32b78dae218c3403324b415101180c4222350114b9184c5da39151cfc014/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Dec 6 03:30:28 localhost podman[62955]: 2025-12-06 08:30:28.835975315 +0000 UTC m=+0.181931987 container init eadfc3e0b8045d3de282056f8f84a184fc6816e8de1caba5aa2743bcb846f81a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step5, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_wait_for_compute_service, architecture=x86_64, url=https://www.redhat.com) Dec 6 03:30:28 localhost podman[62955]: 2025-12-06 08:30:28.844584053 +0000 UTC m=+0.190540715 container start eadfc3e0b8045d3de282056f8f84a184fc6816e8de1caba5aa2743bcb846f81a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, container_name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, 
com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Dec 6 03:30:28 localhost podman[62955]: 2025-12-06 08:30:28.844902973 +0000 UTC m=+0.190859655 container attach 
eadfc3e0b8045d3de282056f8f84a184fc6816e8de1caba5aa2743bcb846f81a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_wait_for_compute_service, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:30:38 localhost systemd[1]: Stopping User Manager for UID 0... Dec 6 03:30:38 localhost systemd[62874]: Activating special unit Exit the Session... Dec 6 03:30:38 localhost systemd[62874]: Stopped target Main User Target. Dec 6 03:30:38 localhost systemd[62874]: Stopped target Basic System. Dec 6 03:30:38 localhost systemd[62874]: Stopped target Paths. Dec 6 03:30:38 localhost systemd[62874]: Stopped target Sockets. Dec 6 03:30:38 localhost systemd[62874]: Stopped target Timers. Dec 6 03:30:38 localhost systemd[62874]: Stopped Daily Cleanup of User's Temporary Directories. Dec 6 03:30:38 localhost systemd[62874]: Closed D-Bus User Message Bus Socket. Dec 6 03:30:38 localhost systemd[62874]: Stopped Create User's Volatile Files and Directories. Dec 6 03:30:38 localhost systemd[62874]: Removed slice User Application Slice. Dec 6 03:30:38 localhost systemd[62874]: Reached target Shutdown. Dec 6 03:30:38 localhost systemd[62874]: Finished Exit the Session. Dec 6 03:30:38 localhost systemd[62874]: Reached target Exit the Session. Dec 6 03:30:38 localhost systemd[1]: user@0.service: Deactivated successfully. Dec 6 03:30:38 localhost systemd[1]: Stopped User Manager for UID 0. Dec 6 03:30:38 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Dec 6 03:30:38 localhost systemd[1]: run-user-0.mount: Deactivated successfully. 
Dec 6 03:30:38 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Dec 6 03:30:38 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Dec 6 03:30:38 localhost systemd[1]: Removed slice User Slice of UID 0. Dec 6 03:30:40 localhost systemd[1]: libpod-eadfc3e0b8045d3de282056f8f84a184fc6816e8de1caba5aa2743bcb846f81a.scope: Deactivated successfully. Dec 6 03:30:40 localhost podman[62955]: 2025-12-06 08:30:40.940127007 +0000 UTC m=+12.286083689 container died eadfc3e0b8045d3de282056f8f84a184fc6816e8de1caba5aa2743bcb846f81a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, release=1761123044, version=17.1.12, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_wait_for_compute_service, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 
'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, maintainer=OpenStack TripleO Team) Dec 6 03:30:41 localhost systemd[1]: tmp-crun.ul1cMU.mount: Deactivated successfully. Dec 6 03:30:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eadfc3e0b8045d3de282056f8f84a184fc6816e8de1caba5aa2743bcb846f81a-userdata-shm.mount: Deactivated successfully. Dec 6 03:30:41 localhost systemd[1]: var-lib-containers-storage-overlay-135e32b78dae218c3403324b415101180c4222350114b9184c5da39151cfc014-merged.mount: Deactivated successfully. 
Dec 6 03:30:41 localhost podman[63027]: 2025-12-06 08:30:41.028183109 +0000 UTC m=+0.076458472 container cleanup eadfc3e0b8045d3de282056f8f84a184fc6816e8de1caba5aa2743bcb846f81a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_wait_for_compute_service) Dec 6 03:30:41 localhost systemd[1]: libpod-conmon-eadfc3e0b8045d3de282056f8f84a184fc6816e8de1caba5aa2743bcb846f81a.scope: Deactivated successfully. Dec 6 03:30:41 localhost python3[62796]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=f81b1d391c9b63868054d7733e636be7 --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/ipa/ca.crt:/etc/ipa/ca.crt:ro --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 6 03:30:41 localhost python3[63081]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:41 localhost python3[63097]: ansible-stat Invoked with 
path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 6 03:30:42 localhost python3[63158]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009841.9638677-148753-48350613655578/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:42 localhost python3[63174]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 03:30:42 localhost systemd[1]: Reloading. Dec 6 03:30:43 localhost systemd-rc-local-generator[63195]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:30:43 localhost systemd-sysv-generator[63202]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:30:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:30:43 localhost python3[63225]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 03:30:43 localhost systemd[1]: Reloading. Dec 6 03:30:44 localhost systemd-rc-local-generator[63251]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 03:30:44 localhost systemd-sysv-generator[63254]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:30:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:30:44 localhost systemd[1]: Starting nova_compute container... Dec 6 03:30:44 localhost tripleo-start-podman-container[63265]: Creating additional drop-in dependency for "nova_compute" (3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6) Dec 6 03:30:44 localhost systemd[1]: Reloading. Dec 6 03:30:44 localhost systemd-sysv-generator[63330]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 03:30:44 localhost systemd-rc-local-generator[63326]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 03:30:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 03:30:44 localhost systemd[1]: Started nova_compute container. 
Dec 6 03:30:45 localhost python3[63365]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:30:45 localhost podman[63366]: 2025-12-06 08:30:45.55145757 +0000 UTC m=+0.079497488 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., container_name=metrics_qdr, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12) Dec 6 03:30:45 localhost podman[63366]: 2025-12-06 08:30:45.69915545 +0000 UTC m=+0.227195318 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:30:45 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:30:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:30:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:30:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:30:46 localhost podman[63503]: 2025-12-06 08:30:46.559983231 +0000 UTC m=+0.085505534 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-compute, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Dec 6 03:30:46 localhost podman[63503]: 2025-12-06 08:30:46.610218186 +0000 UTC m=+0.135740489 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, tcib_managed=true, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:30:46 localhost podman[63501]: 2025-12-06 08:30:46.613270221 +0000 UTC m=+0.141566281 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, tcib_managed=true, com.redhat.component=openstack-cron-container, io.openshift.expose-services=) Dec 6 03:30:46 localhost podman[63501]: 2025-12-06 08:30:46.654114742 +0000 UTC m=+0.182410792 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, release=1761123044, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron) Dec 6 03:30:46 localhost systemd[1]: tmp-crun.b1c8fI.mount: Deactivated successfully. Dec 6 03:30:46 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:30:46 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:30:46 localhost python3[63557]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005548798 step=5 update_config_hash_only=False Dec 6 03:30:46 localhost podman[63502]: 2025-12-06 08:30:46.665475977 +0000 UTC m=+0.190321049 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044) Dec 6 03:30:46 localhost podman[63502]: 2025-12-06 08:30:46.748200733 +0000 UTC m=+0.273045865 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, release=1761123044) Dec 6 03:30:46 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:30:47 localhost python3[63603]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 03:30:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:30:47 localhost podman[63604]: 2025-12-06 08:30:47.54695367 +0000 UTC m=+0.082503380 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:30:47 localhost python3[63636]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True Dec 6 03:30:47 localhost podman[63604]: 2025-12-06 08:30:47.885101143 +0000 UTC m=+0.420650803 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:36:58Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
release=1761123044, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4) Dec 6 03:30:47 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:30:51 localhost systemd[1]: tmp-crun.D24qAZ.mount: Deactivated successfully. Dec 6 03:30:51 localhost podman[63643]: 2025-12-06 08:30:51.567883287 +0000 UTC m=+0.097649513 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, config_id=tripleo_step4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Dec 6 03:30:51 localhost podman[63644]: 2025-12-06 08:30:51.608018106 +0000 UTC 
m=+0.135901464 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller) Dec 6 03:30:51 localhost podman[63644]: 2025-12-06 08:30:51.635282946 +0000 UTC m=+0.163166384 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, maintainer=OpenStack TripleO 
Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, release=1761123044) Dec 6 03:30:51 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:30:51 localhost podman[63643]: 2025-12-06 08:30:51.660515292 +0000 UTC m=+0.190281508 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
config_id=tripleo_step4) Dec 6 03:30:51 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:30:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:30:53 localhost podman[63691]: 2025-12-06 08:30:53.535153758 +0000 UTC m=+0.067382420 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, version=17.1.12, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, container_name=collectd, release=1761123044, com.redhat.component=openstack-collectd-container) Dec 6 03:30:53 localhost podman[63691]: 2025-12-06 08:30:53.546193872 +0000 UTC m=+0.078422504 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, distribution-scope=public, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, 
managed_by=tripleo_ansible) Dec 6 03:30:53 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:30:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:30:55 localhost podman[63711]: 2025-12-06 08:30:55.547825963 +0000 UTC m=+0.083226992 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, container_name=iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, tcib_managed=true, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044) Dec 6 03:30:55 localhost podman[63711]: 2025-12-06 08:30:55.586294092 +0000 UTC m=+0.121695131 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, maintainer=OpenStack TripleO 
Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, vcs-type=git, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Dec 6 03:30:55 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:30:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:30:58 localhost podman[63730]: 2025-12-06 08:30:58.557889274 +0000 UTC m=+0.090795160 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, release=1761123044, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com) Dec 6 03:30:58 localhost podman[63730]: 2025-12-06 08:30:58.592232263 +0000 UTC m=+0.125138139 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, name=rhosp17/openstack-nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:30:58 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:31:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:31:16 localhost podman[63756]: 2025-12-06 08:31:16.58246589 +0000 UTC m=+0.087818436 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:31:16 localhost podman[63756]: 2025-12-06 08:31:16.770134295 +0000 UTC m=+0.275486821 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 
(image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:31:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:31:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:31:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:31:16 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:31:16 localhost podman[63786]: 2025-12-06 08:31:16.84895159 +0000 UTC m=+0.047431518 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public) Dec 6 03:31:16 localhost systemd[1]: tmp-crun.gwtqK0.mount: Deactivated successfully. Dec 6 03:31:16 localhost podman[63785]: 2025-12-06 08:31:16.894995983 +0000 UTC m=+0.094675128 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, release=1761123044, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Dec 6 03:31:16 localhost podman[63785]: 2025-12-06 08:31:16.915138022 +0000 UTC m=+0.114817167 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044) Dec 6 03:31:16 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:31:16 localhost podman[63786]: 2025-12-06 08:31:16.946599391 +0000 UTC m=+0.145079339 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:31:16 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:31:16 localhost podman[63784]: 2025-12-06 08:31:16.956496359 +0000 UTC m=+0.157557328 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, config_id=tripleo_step4, container_name=logrotate_crond, tcib_managed=true, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, 
batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:31:16 localhost podman[63784]: 2025-12-06 08:31:16.984531322 +0000 UTC m=+0.185592311 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:31:16 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:31:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:31:18 localhost podman[63855]: 2025-12-06 08:31:18.538345098 +0000 UTC m=+0.073715427 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:31:18 localhost podman[63855]: 2025-12-06 08:31:18.899111644 +0000 UTC m=+0.434481973 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4) Dec 6 03:31:18 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:31:22 localhost systemd[1]: tmp-crun.W2bliH.mount: Deactivated successfully. 
Dec 6 03:31:22 localhost podman[63876]: 2025-12-06 08:31:22.518971837 +0000 UTC m=+0.057496483 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:31:22 localhost podman[63876]: 2025-12-06 08:31:22.546390861 +0000 UTC m=+0.084915537 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z) Dec 6 03:31:22 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:31:22 localhost podman[63877]: 2025-12-06 08:31:22.627325192 +0000 UTC m=+0.162719079 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller) Dec 6 03:31:22 localhost systemd[1]: tmp-crun.u1eTod.mount: Deactivated successfully. 
Dec 6 03:31:22 localhost podman[63877]: 2025-12-06 08:31:22.680332134 +0000 UTC m=+0.215726031 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible) Dec 6 03:31:22 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:31:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:31:24 localhost podman[63925]: 2025-12-06 08:31:24.553822924 +0000 UTC m=+0.087915269 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=collectd, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:31:24 localhost podman[63925]: 2025-12-06 08:31:24.567283193 +0000 UTC m=+0.101375538 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
container_name=collectd, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1) Dec 6 03:31:24 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:31:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:31:26 localhost podman[63946]: 2025-12-06 08:31:26.556940662 +0000 UTC m=+0.083613264 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 6 03:31:26 localhost podman[63946]: 2025-12-06 08:31:26.567620365 +0000 UTC m=+0.094292977 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., container_name=iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, vcs-type=git, version=17.1.12) Dec 6 03:31:26 localhost systemd[1]: 
3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:31:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:31:29 localhost systemd[1]: tmp-crun.Rrjvni.mount: Deactivated successfully. Dec 6 03:31:29 localhost podman[63966]: 2025-12-06 08:31:29.528789312 +0000 UTC m=+0.066276815 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=) Dec 6 03:31:29 localhost podman[63966]: 2025-12-06 08:31:29.577296564 +0000 UTC m=+0.114784077 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, 
config_id=tripleo_step5, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true) Dec 6 03:31:29 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:31:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:31:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:31:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:31:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:31:47 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:31:47 localhost recover_tripleo_nova_virtqemud[64017]: 51836 Dec 6 03:31:47 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Dec 6 03:31:47 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:31:47 localhost podman[63994]: 2025-12-06 08:31:47.546103296 +0000 UTC m=+0.074721367 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=ceilometer_agent_ipmi, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Dec 6 03:31:47 localhost podman[63994]: 2025-12-06 08:31:47.590915172 +0000 UTC m=+0.119533213 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc.) Dec 6 03:31:47 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:31:47 localhost podman[63995]: 2025-12-06 08:31:47.607888242 +0000 UTC m=+0.132129871 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1761123044, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=) Dec 6 03:31:47 localhost podman[63993]: 2025-12-06 08:31:47.653485163 +0000 UTC m=+0.187966501 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond) Dec 6 03:31:47 localhost podman[63993]: 2025-12-06 08:31:47.659283837 +0000 UTC m=+0.193765185 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:31:47 localhost systemd[1]: 
23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:31:47 localhost podman[63995]: 2025-12-06 08:31:47.679153084 +0000 UTC m=+0.203394733 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public) Dec 6 03:31:47 localhost podman[64001]: 2025-12-06 08:31:47.708367013 +0000 UTC m=+0.230731237 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 
17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}) Dec 6 03:31:47 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:31:47 localhost podman[64001]: 2025-12-06 08:31:47.908434015 +0000 UTC m=+0.430798279 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, vcs-type=git, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:31:47 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:31:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:31:49 localhost podman[64093]: 2025-12-06 08:31:49.548677683 +0000 UTC m=+0.078843172 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, version=17.1.12, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, url=https://www.redhat.com, tcib_managed=true) Dec 6 03:31:49 localhost podman[64093]: 2025-12-06 08:31:49.906770135 +0000 UTC m=+0.436935624 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, version=17.1.12, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:31:49 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:31:53 localhost podman[64116]: 2025-12-06 08:31:53.534796855 +0000 UTC m=+0.068569202 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 03:31:53 localhost podman[64116]: 2025-12-06 08:31:53.5562845 +0000 UTC m=+0.090056877 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:31:53 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:31:53 localhost podman[64115]: 2025-12-06 08:31:53.647033288 +0000 UTC m=+0.180759714 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, 
Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:31:53 localhost podman[64115]: 2025-12-06 08:31:53.677332848 +0000 UTC m=+0.211059254 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible) Dec 6 03:31:53 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:31:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:31:55 localhost systemd[1]: tmp-crun.y5PNEf.mount: Deactivated successfully. 
Dec 6 03:31:55 localhost podman[64164]: 2025-12-06 08:31:55.558825816 +0000 UTC m=+0.094600984 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, container_name=collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-collectd) Dec 6 03:31:55 localhost podman[64164]: 2025-12-06 08:31:55.569403964 +0000 UTC m=+0.105179152 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, 
batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64) Dec 6 03:31:55 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:31:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:31:57 localhost podman[64184]: 2025-12-06 08:31:57.557756705 +0000 UTC m=+0.093485540 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, version=17.1.12, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:31:57 localhost podman[64184]: 2025-12-06 08:31:57.593077926 +0000 UTC m=+0.128806721 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, maintainer=OpenStack TripleO Team) Dec 6 03:31:57 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:32:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:32:00 localhost systemd[1]: tmp-crun.i8PisJ.mount: Deactivated successfully. 
Dec 6 03:32:00 localhost podman[64203]: 2025-12-06 08:32:00.550934085 +0000 UTC m=+0.086178080 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute) Dec 6 03:32:00 localhost podman[64203]: 2025-12-06 08:32:00.582368999 +0000 UTC m=+0.117613004 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:36:58Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, version=17.1.12, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:32:00 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:32:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:32:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:32:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:32:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:32:18 localhost podman[64230]: 2025-12-06 08:32:18.55914748 +0000 UTC m=+0.086402017 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com) Dec 6 03:32:18 localhost podman[64229]: 2025-12-06 08:32:18.543788728 +0000 UTC m=+0.078570462 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_id=tripleo_step4, distribution-scope=public, container_name=logrotate_crond, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4) Dec 6 03:32:18 localhost podman[64229]: 2025-12-06 08:32:18.62305326 +0000 UTC m=+0.157835084 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, architecture=x86_64, batch=17.1_20251118.1, container_name=logrotate_crond, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Dec 6 03:32:18 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:32:18 localhost podman[64230]: 2025-12-06 08:32:18.634323699 +0000 UTC m=+0.161578226 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container) Dec 6 03:32:18 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:32:18 localhost podman[64237]: 2025-12-06 08:32:18.671337961 +0000 UTC m=+0.193731123 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vendor=Red Hat, Inc.) 
Dec 6 03:32:18 localhost podman[64231]: 2025-12-06 08:32:18.723089367 +0000 UTC m=+0.248642294 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, 
com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:11:48Z, distribution-scope=public, architecture=x86_64) Dec 6 03:32:18 localhost podman[64231]: 2025-12-06 08:32:18.750296055 +0000 UTC m=+0.275849012 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:32:18 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:32:18 localhost podman[64237]: 2025-12-06 08:32:18.908391446 +0000 UTC m=+0.430784628 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Dec 6 03:32:18 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:32:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:32:20 localhost systemd[1]: tmp-crun.nvRXNj.mount: Deactivated successfully. 
Dec 6 03:32:20 localhost podman[64332]: 2025-12-06 08:32:20.546114439 +0000 UTC m=+0.081507340 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com) Dec 6 03:32:20 localhost podman[64332]: 2025-12-06 08:32:20.931276224 +0000 UTC m=+0.466669085 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64) Dec 6 03:32:20 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:32:24 localhost systemd[1]: tmp-crun.tlUdEN.mount: Deactivated successfully. 
Dec 6 03:32:24 localhost podman[64357]: 2025-12-06 08:32:24.588951166 +0000 UTC m=+0.116178683 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=ovn_controller, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 03:32:24 localhost podman[64356]: 2025-12-06 08:32:24.605504474 +0000 UTC m=+0.136225966 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:14:25Z, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:32:24 localhost podman[64357]: 2025-12-06 08:32:24.617266917 +0000 UTC m=+0.144494474 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, 
name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=ovn_controller, managed_by=tripleo_ansible) Dec 6 03:32:24 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:32:24 localhost podman[64356]: 2025-12-06 08:32:24.66958449 +0000 UTC m=+0.200305932 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vcs-type=git, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:32:24 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:32:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:32:26 localhost podman[64402]: 2025-12-06 08:32:26.560279453 +0000 UTC m=+0.095477650 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:32:26 localhost podman[64402]: 2025-12-06 08:32:26.597375109 +0000 UTC m=+0.132573276 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step3, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public) Dec 6 03:32:26 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:32:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:32:28 localhost podman[64422]: 2025-12-06 08:32:28.548762347 +0000 UTC m=+0.084386127 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, release=1761123044, build-date=2025-11-18T23:44:13Z) Dec 6 03:32:28 localhost podman[64422]: 2025-12-06 08:32:28.563274383 +0000 UTC m=+0.098898133 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=) Dec 6 03:32:28 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:32:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:32:31 localhost podman[64442]: 2025-12-06 08:32:31.549272397 +0000 UTC m=+0.080041485 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, architecture=x86_64) Dec 6 03:32:31 localhost podman[64442]: 2025-12-06 08:32:31.570545437 +0000 UTC m=+0.101314505 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, 
build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, tcib_managed=true) Dec 6 03:32:31 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:32:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:32:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:32:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:32:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:32:49 localhost systemd[1]: tmp-crun.04kJBR.mount: Deactivated successfully. Dec 6 03:32:49 localhost systemd[1]: tmp-crun.53ILQ2.mount: Deactivated successfully. 
Dec 6 03:32:49 localhost podman[64470]: 2025-12-06 08:32:49.559011459 +0000 UTC m=+0.080133689 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:32:49 localhost podman[64470]: 2025-12-06 08:32:49.582115853 +0000 UTC m=+0.103238123 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Dec 6 03:32:49 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:32:49 localhost podman[64476]: 2025-12-06 08:32:49.615659391 +0000 UTC m=+0.134789572 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, config_id=tripleo_step1, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:32:49 localhost podman[64468]: 2025-12-06 08:32:49.531402899 +0000 UTC m=+0.062111367 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, container_name=logrotate_crond, tcib_managed=true, build-date=2025-11-18T22:49:32Z) Dec 6 03:32:49 localhost podman[64468]: 2025-12-06 08:32:49.661393437 +0000 UTC m=+0.192101845 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO 
Team, io.openshift.expose-services=) Dec 6 03:32:49 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:32:49 localhost podman[64469]: 2025-12-06 08:32:49.707063889 +0000 UTC m=+0.231199879 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, 
url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, vcs-type=git, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, version=17.1.12) Dec 6 03:32:49 localhost podman[64469]: 2025-12-06 08:32:49.730234445 +0000 UTC m=+0.254370465 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:32:49 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:32:49 localhost podman[64476]: 2025-12-06 08:32:49.810293922 +0000 UTC m=+0.329424153 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:32:49 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:32:50 localhost systemd[1]: tmp-crun.QIJQ9E.mount: Deactivated successfully. Dec 6 03:32:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:32:51 localhost podman[64567]: 2025-12-06 08:32:51.547123942 +0000 UTC m=+0.082330335 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:36:58Z, release=1761123044, com.redhat.component=openstack-nova-compute-container) Dec 6 03:32:51 localhost podman[64567]: 2025-12-06 08:32:51.903223364 +0000 UTC m=+0.438429677 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_id=tripleo_step4) Dec 6 03:32:51 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:32:55 localhost systemd[1]: tmp-crun.wLvJf8.mount: Deactivated successfully. 
Dec 6 03:32:55 localhost podman[64591]: 2025-12-06 08:32:55.546157502 +0000 UTC m=+0.080606203 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=) Dec 6 03:32:55 localhost podman[64592]: 2025-12-06 08:32:55.605165815 +0000 UTC m=+0.133753760 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 
17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:32:55 localhost podman[64591]: 2025-12-06 08:32:55.631072615 +0000 UTC m=+0.165521246 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}) Dec 6 03:32:55 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:32:55 localhost podman[64592]: 2025-12-06 08:32:55.656394106 +0000 UTC m=+0.184982011 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public) Dec 6 03:32:55 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:32:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:32:57 localhost podman[64639]: 2025-12-06 08:32:57.539495403 +0000 UTC m=+0.075939422 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:32:57 localhost podman[64639]: 2025-12-06 08:32:57.573404091 +0000 UTC m=+0.109848080 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, container_name=collectd, 
distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Dec 6 03:32:57 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:32:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:32:59 localhost podman[64658]: 2025-12-06 08:32:59.547155843 +0000 UTC m=+0.077732218 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step3, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:32:59 localhost podman[64658]: 2025-12-06 08:32:59.562264606 +0000 UTC m=+0.092841021 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1) Dec 6 03:32:59 
localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:33:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:33:02 localhost podman[64677]: 2025-12-06 08:33:02.549151907 +0000 UTC m=+0.081695426 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:33:02 localhost podman[64677]: 2025-12-06 08:33:02.594566571 +0000 UTC m=+0.127110110 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, architecture=x86_64, vendor=Red Hat, 
Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, container_name=nova_compute, release=1761123044, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:33:02 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:33:02 localhost sshd[64705]: main: sshd: ssh-rsa algorithm is disabled Dec 6 03:33:02 localhost systemd-logind[760]: New session 20 of user zuul. Dec 6 03:33:02 localhost systemd[1]: Started Session 20 of User zuul. 
Dec 6 03:33:03 localhost python3[64814]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 6 03:33:11 localhost python3[65073]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None Dec 6 03:33:18 localhost python3[65166]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None Dec 6 03:33:18 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled Dec 6 03:33:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. 
Dec 6 03:33:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:33:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:33:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:33:20 localhost podman[65189]: 2025-12-06 08:33:20.547959091 +0000 UTC m=+0.081882151 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:33:20 localhost podman[65189]: 2025-12-06 08:33:20.603990925 +0000 UTC m=+0.137913985 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:33:20 localhost systemd[1]: tmp-crun.gJhEzl.mount: Deactivated successfully. Dec 6 03:33:20 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:33:20 localhost systemd[1]: tmp-crun.gZfld4.mount: Deactivated successfully. Dec 6 03:33:20 localhost podman[65188]: 2025-12-06 08:33:20.607676826 +0000 UTC m=+0.141030229 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1761123044) Dec 6 03:33:20 localhost podman[65191]: 2025-12-06 08:33:20.658107042 +0000 UTC m=+0.183825416 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, vcs-type=git, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team) Dec 6 03:33:20 localhost podman[65188]: 2025-12-06 08:33:20.73890706 +0000 UTC m=+0.272260413 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
version=17.1.12, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:33:20 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:33:20 localhost podman[65190]: 2025-12-06 08:33:20.708916819 +0000 UTC m=+0.238277472 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git) Dec 6 03:33:20 localhost podman[65190]: 2025-12-06 08:33:20.795387528 +0000 UTC m=+0.324748131 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 
'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, vcs-type=git, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12) Dec 6 03:33:20 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:33:20 localhost podman[65191]: 2025-12-06 08:33:20.891192627 +0000 UTC m=+0.416911051 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, vcs-type=git, release=1761123044, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Dec 6 03:33:20 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:33:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:33:22 localhost podman[65290]: 2025-12-06 08:33:22.546732795 +0000 UTC m=+0.081160299 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, release=1761123044, 
com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:33:22 localhost podman[65290]: 2025-12-06 08:33:22.903576569 +0000 UTC m=+0.438004143 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:33:22 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:33:22 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:33:22 localhost recover_tripleo_nova_virtqemud[65314]: 51836 Dec 6 03:33:22 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:33:22 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 6 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:33:26 localhost podman[65316]: 2025-12-06 08:33:26.565428636 +0000 UTC m=+0.090283515 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 03:33:26 localhost systemd[1]: tmp-crun.6eCRmM.mount: Deactivated successfully. Dec 6 03:33:26 localhost podman[65316]: 2025-12-06 08:33:26.61281476 +0000 UTC m=+0.137669699 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, 
container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Dec 6 03:33:26 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. 
Dec 6 03:33:26 localhost podman[65315]: 2025-12-06 08:33:26.614009527 +0000 UTC m=+0.139593417 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible) Dec 6 03:33:26 localhost podman[65315]: 2025-12-06 08:33:26.697314239 +0000 UTC m=+0.222898079 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, container_name=ovn_metadata_agent, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=) Dec 6 03:33:26 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:33:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:33:28 localhost podman[65361]: 2025-12-06 08:33:28.543624361 +0000 UTC m=+0.079161599 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Dec 6 03:33:28 localhost podman[65361]: 2025-12-06 08:33:28.551726464 +0000 UTC m=+0.087263692 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, summary=Red Hat 
OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:33:28 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:33:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:33:30 localhost podman[65381]: 2025-12-06 08:33:30.543625641 +0000 UTC m=+0.074893282 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, container_name=iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T23:44:13Z, tcib_managed=true, name=rhosp17/openstack-iscsid) Dec 6 03:33:30 localhost podman[65381]: 2025-12-06 08:33:30.556312613 +0000 UTC m=+0.087580214 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, architecture=x86_64, io.buildah.version=1.41.4, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, tcib_managed=true, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team) Dec 6 03:33:30 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:33:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:33:33 localhost podman[65399]: 2025-12-06 08:33:33.542786551 +0000 UTC m=+0.081138668 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044) Dec 6 03:33:33 localhost podman[65399]: 2025-12-06 08:33:33.598341371 +0000 UTC m=+0.136693458 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=nova_compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, version=17.1.12) Dec 6 03:33:33 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:33:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:33:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:33:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:33:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:33:51 localhost podman[65428]: 2025-12-06 08:33:51.575339819 +0000 UTC m=+0.088795140 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, io.openshift.expose-services=) Dec 6 03:33:51 localhost podman[65426]: 2025-12-06 08:33:51.620547567 +0000 UTC m=+0.142171324 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, version=17.1.12, container_name=logrotate_crond, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, distribution-scope=public) Dec 6 03:33:51 localhost podman[65428]: 2025-12-06 08:33:51.634316455 +0000 UTC m=+0.147771806 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, 
container_name=ceilometer_agent_compute, release=1761123044, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team) Dec 6 03:33:51 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:33:51 localhost podman[65426]: 2025-12-06 08:33:51.708687558 +0000 UTC m=+0.230311325 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 6 03:33:51 localhost podman[65427]: 2025-12-06 08:33:51.719970968 +0000 UTC m=+0.234478120 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_ipmi, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.12) Dec 6 03:33:51 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:33:51 localhost podman[65434]: 2025-12-06 08:33:51.691452521 +0000 UTC m=+0.201073496 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', 
'/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=metrics_qdr, io.openshift.expose-services=, vendor=Red Hat, Inc.) Dec 6 03:33:51 localhost podman[65427]: 2025-12-06 08:33:51.749177416 +0000 UTC m=+0.263684628 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, container_name=ceilometer_agent_ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public) Dec 6 03:33:51 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:33:51 localhost podman[65434]: 2025-12-06 08:33:51.941286288 +0000 UTC m=+0.450907283 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=metrics_qdr, tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, name=rhosp17/openstack-qdrouterd) Dec 6 03:33:51 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:33:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:33:53 localhost podman[65529]: 2025-12-06 08:33:53.556235372 +0000 UTC m=+0.090235616 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., 
batch=17.1_20251118.1, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:33:53 localhost podman[65529]: 2025-12-06 08:33:53.884206008 +0000 UTC m=+0.418206182 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, architecture=x86_64, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container) Dec 6 03:33:53 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:33:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:33:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:33:57 localhost systemd[1]: tmp-crun.IsvqJX.mount: Deactivated successfully. 
Dec 6 03:33:57 localhost podman[65552]: 2025-12-06 08:33:57.542450783 +0000 UTC m=+0.073417105 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, managed_by=tripleo_ansible) Dec 6 03:33:57 localhost podman[65552]: 2025-12-06 08:33:57.609234639 +0000 UTC m=+0.140200981 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, batch=17.1_20251118.1, 
build-date=2025-11-19T00:14:25Z, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044) Dec 6 03:33:57 localhost systemd[1]: tmp-crun.p0JfSM.mount: Deactivated successfully. Dec 6 03:33:57 localhost podman[65553]: 2025-12-06 08:33:57.620661455 +0000 UTC m=+0.147668164 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T23:34:05Z) Dec 6 03:33:57 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:33:57 localhost podman[65553]: 2025-12-06 08:33:57.642111632 +0000 UTC m=+0.169118341 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:33:57 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:33:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:33:59 localhost podman[65597]: 2025-12-06 08:33:59.554590605 +0000 UTC m=+0.089405841 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:33:59 localhost podman[65597]: 2025-12-06 08:33:59.568237739 +0000 UTC m=+0.103053025 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, vcs-type=git) Dec 6 03:33:59 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:34:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:34:01 localhost podman[65617]: 2025-12-06 08:34:01.543785323 +0000 UTC m=+0.077183960 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:34:01 localhost podman[65617]: 2025-12-06 08:34:01.556318623 +0000 UTC m=+0.089717220 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://www.redhat.com, container_name=iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1) Dec 6 03:34:01 localhost systemd[1]: 
3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:34:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:34:04 localhost podman[65636]: 2025-12-06 08:34:04.542336861 +0000 UTC m=+0.079129723 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-type=git, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:34:04 localhost podman[65636]: 2025-12-06 08:34:04.570328011 +0000 UTC m=+0.107120853 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 
'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_compute, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:34:04 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:34:17 localhost systemd[1]: session-20.scope: Deactivated successfully. Dec 6 03:34:17 localhost systemd[1]: session-20.scope: Consumed 5.752s CPU time. Dec 6 03:34:17 localhost systemd-logind[760]: Session 20 logged out. Waiting for processes to exit. Dec 6 03:34:17 localhost systemd-logind[760]: Removed session 20. Dec 6 03:34:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:34:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:34:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:34:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:34:22 localhost podman[65661]: 2025-12-06 08:34:22.554771868 +0000 UTC m=+0.090632910 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=) Dec 6 03:34:22 localhost systemd[1]: tmp-crun.hZm2zt.mount: Deactivated successfully. Dec 6 03:34:22 localhost podman[65669]: 2025-12-06 08:34:22.624916919 +0000 UTC m=+0.145282118 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-type=git, name=rhosp17/openstack-qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044) Dec 6 03:34:22 localhost podman[65661]: 2025-12-06 08:34:22.64198668 +0000 UTC m=+0.177847712 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, 
distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container) Dec 6 03:34:22 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:34:22 localhost podman[65662]: 2025-12-06 08:34:22.705818943 +0000 UTC m=+0.235121631 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:34:22 localhost podman[65662]: 2025-12-06 08:34:22.733297048 +0000 UTC m=+0.262599796 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:34:22 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:34:22 localhost podman[65666]: 2025-12-06 08:34:22.824004028 +0000 UTC m=+0.347256187 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, container_name=ceilometer_agent_compute, config_id=tripleo_step4, vcs-type=git, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public) Dec 6 03:34:22 localhost podman[65666]: 2025-12-06 08:34:22.851197573 +0000 UTC m=+0.374449752 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4) Dec 6 03:34:22 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:34:22 localhost podman[65669]: 2025-12-06 08:34:22.896576294 +0000 UTC m=+0.416941493 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:34:22 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:34:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:34:24 localhost podman[65764]: 2025-12-06 08:34:24.536977469 +0000 UTC m=+0.073596788 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container) Dec 6 03:34:24 localhost podman[65764]: 2025-12-06 08:34:24.920528433 +0000 UTC m=+0.457147742 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, url=https://www.redhat.com) Dec 6 03:34:24 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:34:26 localhost sshd[65785]: main: sshd: ssh-rsa algorithm is disabled Dec 6 03:34:26 localhost systemd-logind[760]: New session 21 of user zuul. Dec 6 03:34:26 localhost systemd[1]: Started Session 21 of User zuul. 
Dec 6 03:34:26 localhost python3[65805]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 6 03:34:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:34:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:34:28 localhost podman[65807]: 2025-12-06 08:34:28.539807257 +0000 UTC m=+0.068861112 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, vcs-type=git, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible) Dec 6 03:34:28 localhost podman[65808]: 2025-12-06 08:34:28.597866162 +0000 UTC m=+0.124116060 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, maintainer=OpenStack TripleO Team) Dec 6 03:34:28 localhost podman[65807]: 2025-12-06 08:34:28.631730085 +0000 UTC m=+0.160783970 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 6 03:34:28 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:34:28 localhost podman[65808]: 2025-12-06 08:34:28.647577757 +0000 UTC m=+0.173827655 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=ovn_controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:34:28 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:34:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:34:30 localhost podman[65855]: 2025-12-06 08:34:30.550582767 +0000 UTC m=+0.077698606 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, vcs-type=git, tcib_managed=true, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:34:30 localhost podman[65855]: 2025-12-06 08:34:30.564274443 +0000 UTC m=+0.091390282 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=collectd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step3, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4) Dec 6 03:34:31 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:34:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:34:32 localhost podman[65876]: 2025-12-06 08:34:32.547422923 +0000 UTC m=+0.081085971 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, container_name=iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, 
batch=17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4) Dec 6 03:34:32 localhost podman[65876]: 2025-12-06 08:34:32.557257909 +0000 UTC m=+0.090920937 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-type=git) Dec 6 03:34:32 localhost systemd[1]: 
3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:34:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:34:35 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:34:35 localhost recover_tripleo_nova_virtqemud[65904]: 51836 Dec 6 03:34:35 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:34:35 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:34:35 localhost podman[65897]: 2025-12-06 08:34:35.549035555 +0000 UTC m=+0.080981879 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vendor=Red Hat, Inc.) 
Dec 6 03:34:35 localhost podman[65897]: 2025-12-06 08:34:35.579358948 +0000 UTC m=+0.111305232 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, config_id=tripleo_step5, container_name=nova_compute, release=1761123044, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:34:35 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:34:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:34:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:34:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:34:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:34:53 localhost podman[65925]: 2025-12-06 08:34:53.577481321 +0000 UTC m=+0.099450512 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=logrotate_crond, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, 
managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12) Dec 6 03:34:53 localhost podman[65925]: 2025-12-06 08:34:53.610028503 +0000 UTC m=+0.131997734 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:34:53 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:34:53 localhost podman[65928]: 2025-12-06 08:34:53.629111767 +0000 UTC m=+0.142555853 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=) Dec 6 03:34:53 localhost podman[65926]: 2025-12-06 08:34:53.686302444 +0000 UTC m=+0.206073946 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, container_name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:34:53 localhost podman[65926]: 2025-12-06 08:34:53.714329856 +0000 UTC m=+0.234101348 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, version=17.1.12, release=1761123044, distribution-scope=public, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi) Dec 6 03:34:53 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:34:53 localhost podman[65927]: 2025-12-06 08:34:53.731305213 +0000 UTC m=+0.248384832 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 6 03:34:53 localhost podman[65927]: 2025-12-06 08:34:53.787258733 +0000 UTC m=+0.304338352 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public) Dec 6 03:34:53 localhost systemd[1]: 
b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:34:53 localhost podman[65928]: 2025-12-06 08:34:53.836202215 +0000 UTC m=+0.349646311 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, architecture=x86_64, tcib_managed=true, 
release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container) Dec 6 03:34:53 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:34:54 localhost systemd[1]: tmp-crun.dRq62U.mount: Deactivated successfully. Dec 6 03:34:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:34:55 localhost systemd[1]: tmp-crun.TmScGk.mount: Deactivated successfully. 
Dec 6 03:34:55 localhost podman[66025]: 2025-12-06 08:34:55.530127055 +0000 UTC m=+0.066761727 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, container_name=nova_migration_target, io.openshift.expose-services=) Dec 6 03:34:55 localhost podman[66025]: 2025-12-06 08:34:55.926207298 +0000 UTC m=+0.462841970 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:34:55 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. 
Dec 6 03:34:57 localhost python3[66063]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 6 03:34:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:34:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:34:59 localhost podman[66065]: 2025-12-06 08:34:59.566099482 +0000 UTC m=+0.091225157 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git) Dec 6 03:34:59 localhost systemd[1]: tmp-crun.KRwnAi.mount: Deactivated successfully. Dec 6 03:34:59 localhost podman[66066]: 2025-12-06 08:34:59.621126153 +0000 UTC m=+0.144665669 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z) Dec 6 03:34:59 localhost podman[66066]: 2025-12-06 08:34:59.64611029 +0000 UTC m=+0.169649836 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:34:59 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. 
Dec 6 03:34:59 localhost podman[66065]: 2025-12-06 08:34:59.696669251 +0000 UTC m=+0.221794946 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true) Dec 6 03:34:59 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:35:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:35:01 localhost podman[66113]: 2025-12-06 08:35:01.551818122 +0000 UTC m=+0.081278257 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, name=rhosp17/openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:35:01 localhost podman[66113]: 2025-12-06 08:35:01.564545169 +0000 UTC m=+0.094005284 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:35:01 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:35:01 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. 
Dec 6 03:35:01 localhost systemd[1]: Starting man-db-cache-update.service... Dec 6 03:35:01 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Dec 6 03:35:02 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Dec 6 03:35:02 localhost systemd[1]: Finished man-db-cache-update.service. Dec 6 03:35:02 localhost systemd[1]: run-r07a23deee8d7438881458af5a50a7dba.service: Deactivated successfully. Dec 6 03:35:02 localhost systemd[1]: run-r6feca87fa58b47c79595a9b61ac6ebbc.service: Deactivated successfully. Dec 6 03:35:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:35:03 localhost podman[66280]: 2025-12-06 08:35:03.550015171 +0000 UTC m=+0.081395952 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, managed_by=tripleo_ansible) Dec 6 03:35:03 localhost podman[66280]: 2025-12-06 08:35:03.564129359 +0000 UTC m=+0.095510150 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, tcib_managed=true, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-type=git) Dec 6 03:35:03 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:35:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:35:06 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:35:06 localhost recover_tripleo_nova_virtqemud[66301]: 51836 Dec 6 03:35:06 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:35:06 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:35:06 localhost podman[66299]: 2025-12-06 08:35:06.540055704 +0000 UTC m=+0.074506548 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, version=17.1.12, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 
17.1_20251118.1) Dec 6 03:35:06 localhost podman[66299]: 2025-12-06 08:35:06.588685245 +0000 UTC m=+0.123136119 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, release=1761123044, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Dec 6 03:35:06 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:35:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:35:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:35:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. 
Dec 6 03:35:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:35:24 localhost podman[66328]: 2025-12-06 08:35:24.583113783 +0000 UTC m=+0.106099799 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:35:24 localhost podman[66328]: 2025-12-06 08:35:24.610123233 +0000 UTC m=+0.133109259 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64) Dec 6 03:35:24 localhost systemd[1]: tmp-crun.cA74xT.mount: Deactivated successfully. Dec 6 03:35:24 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:35:24 localhost podman[66327]: 2025-12-06 08:35:24.676160266 +0000 UTC m=+0.201637210 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.buildah.version=1.41.4, container_name=logrotate_crond, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 6 03:35:24 localhost podman[66327]: 2025-12-06 08:35:24.684379032 +0000 UTC m=+0.209855956 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Dec 6 03:35:24 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:35:24 localhost podman[66330]: 2025-12-06 08:35:24.645627107 +0000 UTC m=+0.163462664 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, release=1761123044, config_id=tripleo_step1, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:35:24 localhost podman[66329]: 2025-12-06 08:35:24.775971438 +0000 UTC m=+0.295419104 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com) Dec 6 03:35:24 localhost podman[66329]: 2025-12-06 08:35:24.827101368 +0000 UTC m=+0.346549054 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4) Dec 6 03:35:24 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:35:24 localhost podman[66330]: 2025-12-06 08:35:24.861017763 +0000 UTC m=+0.378853360 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:35:24 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:35:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:35:26 localhost podman[66427]: 2025-12-06 08:35:26.545242931 +0000 UTC m=+0.076569862 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, release=1761123044, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, architecture=x86_64) Dec 6 03:35:26 localhost podman[66427]: 2025-12-06 08:35:26.915331135 +0000 UTC m=+0.446658087 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, version=17.1.12, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible) Dec 6 03:35:26 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:35:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:35:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:35:30 localhost systemd[1]: tmp-crun.ztucg2.mount: Deactivated successfully. 
Dec 6 03:35:30 localhost podman[66449]: 2025-12-06 08:35:30.547835661 +0000 UTC m=+0.083895260 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}) Dec 6 03:35:30 localhost podman[66450]: 2025-12-06 08:35:30.605484642 +0000 UTC m=+0.136911137 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com) Dec 6 03:35:30 localhost podman[66449]: 2025-12-06 08:35:30.616966549 +0000 UTC m=+0.153026148 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, container_name=ovn_metadata_agent, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Dec 6 03:35:30 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:35:30 localhost podman[66450]: 2025-12-06 08:35:30.631265793 +0000 UTC m=+0.162692238 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-ovn-controller) Dec 6 03:35:30 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:35:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:35:32 localhost podman[66498]: 2025-12-06 08:35:32.548259448 +0000 UTC m=+0.083273810 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, architecture=x86_64, distribution-scope=public) Dec 6 03:35:32 localhost podman[66498]: 2025-12-06 08:35:32.561586902 +0000 UTC m=+0.096601234 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:51:28Z, container_name=collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, name=rhosp17/openstack-collectd) Dec 6 03:35:32 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:35:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:35:34 localhost systemd[1]: tmp-crun.gTMEXW.mount: Deactivated successfully. Dec 6 03:35:34 localhost podman[66519]: 2025-12-06 08:35:34.550910666 +0000 UTC m=+0.086326425 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, config_id=tripleo_step3, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, distribution-scope=public, container_name=iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:35:34 localhost podman[66519]: 2025-12-06 08:35:34.58935285 +0000 UTC m=+0.124768629 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, distribution-scope=public, container_name=iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:35:34 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:35:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:35:37 localhost podman[66539]: 2025-12-06 08:35:37.549484643 +0000 UTC m=+0.084898310 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step5, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Dec 6 03:35:37 localhost podman[66539]: 2025-12-06 08:35:37.611526281 +0000 UTC m=+0.146939898 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:35:37 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:35:50 localhost python3[66581]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 03:35:53 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Dec 6 03:35:53 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Dec 6 03:35:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:35:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. 
Dec 6 03:35:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:35:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:35:55 localhost systemd[1]: tmp-crun.aTOop0.mount: Deactivated successfully. Dec 6 03:35:55 localhost podman[66712]: 2025-12-06 08:35:55.540096601 +0000 UTC m=+0.071582907 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64) Dec 6 03:35:55 localhost podman[66712]: 2025-12-06 08:35:55.54650768 +0000 UTC m=+0.077993976 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, 
com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-cron) Dec 6 03:35:55 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:35:55 localhost podman[66713]: 2025-12-06 08:35:55.554810808 +0000 UTC m=+0.084134476 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com) Dec 6 03:35:55 localhost podman[66725]: 2025-12-06 08:35:55.60858409 +0000 UTC m=+0.128256218 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}) Dec 6 03:35:55 localhost podman[66713]: 2025-12-06 08:35:55.687268216 +0000 UTC m=+0.216591984 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, 
com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12) Dec 6 03:35:55 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:35:55 localhost podman[66714]: 2025-12-06 08:35:55.691185638 +0000 UTC m=+0.213805517 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:35:55 localhost podman[66714]: 2025-12-06 08:35:55.771409963 +0000 UTC m=+0.294029832 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1) Dec 6 03:35:55 localhost 
systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:35:55 localhost podman[66725]: 2025-12-06 08:35:55.791078134 +0000 UTC m=+0.310750292 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4) Dec 6 03:35:55 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:35:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:35:57 localhost podman[66867]: 2025-12-06 08:35:57.52574545 +0000 UTC m=+0.064327681 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com) Dec 6 03:35:57 localhost podman[66867]: 2025-12-06 08:35:57.899269351 +0000 UTC m=+0.437851592 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Dec 6 03:35:57 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:36:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:36:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:36:01 localhost podman[66891]: 2025-12-06 08:36:01.544116988 +0000 UTC m=+0.073561470 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.openshift.expose-services=) Dec 6 03:36:01 localhost podman[66891]: 2025-12-06 08:36:01.617362077 +0000 UTC m=+0.146806619 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, vcs-type=git, build-date=2025-11-19T00:14:25Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:36:01 localhost podman[66892]: 2025-12-06 08:36:01.617665756 +0000 UTC m=+0.139926285 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:36:01 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:36:01 localhost podman[66892]: 2025-12-06 08:36:01.703398422 +0000 UTC m=+0.225648161 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, tcib_managed=true, build-date=2025-11-18T23:34:05Z, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:36:01 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:36:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:36:03 localhost systemd[1]: tmp-crun.frCw8z.mount: Deactivated successfully. Dec 6 03:36:03 localhost podman[66940]: 2025-12-06 08:36:03.540479824 +0000 UTC m=+0.076686608 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T22:51:28Z, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true) Dec 6 03:36:03 localhost podman[66940]: 2025-12-06 08:36:03.551491625 +0000 UTC m=+0.087698389 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, io.buildah.version=1.41.4, 
summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-collectd-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack 
TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:36:03 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:36:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:36:05 localhost podman[66960]: 2025-12-06 08:36:05.554731833 +0000 UTC m=+0.088048599 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:36:05 localhost podman[66960]: 2025-12-06 08:36:05.592177462 +0000 UTC m=+0.125494208 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
name=rhosp17/openstack-iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, 
description=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:36:05 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:36:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:36:08 localhost podman[66979]: 2025-12-06 08:36:08.546139809 +0000 UTC m=+0.082317182 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, url=https://www.redhat.com, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:36:08 localhost podman[66979]: 2025-12-06 08:36:08.601287256 +0000 UTC m=+0.137464649 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:36:08 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:36:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:36:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:36:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:36:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:36:26 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:36:26 localhost recover_tripleo_nova_virtqemud[67031]: 51836 Dec 6 03:36:26 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:36:26 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 6 03:36:26 localhost podman[67007]: 2025-12-06 08:36:26.531727245 +0000 UTC m=+0.065787690 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=ceilometer_agent_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4) Dec 6 03:36:26 localhost systemd[1]: tmp-crun.m9Y7R2.mount: Deactivated successfully. Dec 6 03:36:26 localhost podman[67005]: 2025-12-06 08:36:26.549544636 +0000 UTC m=+0.083906861 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:36:26 localhost podman[67007]: 2025-12-06 08:36:26.560109603 +0000 UTC m=+0.094170038 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:36:26 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:36:26 localhost podman[67006]: 2025-12-06 08:36:26.607373778 +0000 UTC m=+0.137320345 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container) Dec 6 03:36:26 localhost podman[67005]: 2025-12-06 08:36:26.611134464 +0000 UTC m=+0.145496699 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=) Dec 6 03:36:26 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:36:26 localhost podman[67006]: 2025-12-06 08:36:26.634173647 +0000 UTC m=+0.164120154 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, distribution-scope=public, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:36:26 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:36:26 localhost podman[67008]: 2025-12-06 08:36:26.689235993 +0000 UTC m=+0.217630193 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:36:26 localhost podman[67008]: 2025-12-06 08:36:26.888196466 +0000 UTC m=+0.416590696 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, vcs-type=git, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=) Dec 6 03:36:26 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:36:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:36:28 localhost systemd[1]: tmp-crun.cRDasm.mount: Deactivated successfully. Dec 6 03:36:28 localhost podman[67109]: 2025-12-06 08:36:28.551395932 +0000 UTC m=+0.077878934 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:36:28 localhost podman[67109]: 2025-12-06 08:36:28.949219694 +0000 UTC m=+0.475702656 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:36:28 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:36:30 localhost python3[67145]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname Dec 6 03:36:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:36:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:36:32 localhost podman[67146]: 2025-12-06 08:36:32.553319296 +0000 UTC m=+0.083826107 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Dec 6 03:36:32 localhost podman[67147]: 2025-12-06 08:36:32.612373665 +0000 UTC m=+0.139100038 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, container_name=ovn_controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=) Dec 6 03:36:32 localhost podman[67146]: 2025-12-06 08:36:32.619521977 +0000 UTC m=+0.150028778 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:36:32 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:36:32 localhost podman[67147]: 2025-12-06 08:36:32.66350839 +0000 UTC m=+0.190234743 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 6 03:36:32 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:36:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:36:34 localhost podman[67193]: 2025-12-06 08:36:34.534359507 +0000 UTC m=+0.070072691 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Dec 6 03:36:34 localhost podman[67193]: 2025-12-06 08:36:34.570292549 +0000 UTC m=+0.106005723 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, 
build-date=2025-11-18T22:51:28Z, release=1761123044, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12) Dec 6 03:36:34 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:36:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:36:36 localhost podman[67211]: 2025-12-06 08:36:36.554082796 +0000 UTC m=+0.088942566 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, url=https://www.redhat.com, container_name=iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:36:36 localhost podman[67211]: 2025-12-06 08:36:36.591245417 +0000 UTC m=+0.126105117 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, version=17.1.12, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, distribution-scope=public, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container) Dec 6 03:36:36 localhost systemd[1]: 
3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:36:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:36:39 localhost podman[67231]: 2025-12-06 08:36:39.524519402 +0000 UTC m=+0.062358852 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public) Dec 6 03:36:39 localhost systemd[1]: tmp-crun.xzn3eY.mount: Deactivated successfully. 
Dec 6 03:36:39 localhost podman[67231]: 2025-12-06 08:36:39.579664981 +0000 UTC m=+0.117504461 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5) Dec 6 03:36:39 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:36:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:36:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:36:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:36:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:36:57 localhost podman[67258]: 2025-12-06 08:36:57.566762702 +0000 UTC m=+0.090601827 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044) Dec 6 03:36:57 localhost systemd[1]: tmp-crun.RVwYU8.mount: Deactivated successfully. Dec 6 03:36:57 localhost podman[67257]: 2025-12-06 08:36:57.663477298 +0000 UTC m=+0.190041128 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Dec 6 03:36:57 localhost podman[67257]: 2025-12-06 08:36:57.672834388 +0000 UTC m=+0.199398268 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, vcs-type=git, tcib_managed=true, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 
17.1 cron) Dec 6 03:36:57 localhost podman[67259]: 2025-12-06 08:36:57.630194187 +0000 UTC m=+0.150476082 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:36:57 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:36:57 localhost podman[67259]: 2025-12-06 08:36:57.710242367 +0000 UTC m=+0.230524322 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute) Dec 6 03:36:57 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:36:57 localhost podman[67258]: 2025-12-06 08:36:57.726991016 +0000 UTC m=+0.250830201 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 6 03:36:57 localhost podman[67260]: 2025-12-06 08:36:57.72553292 +0000 UTC m=+0.242270326 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, release=1761123044, config_id=tripleo_step1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container) Dec 6 03:36:57 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:36:57 localhost podman[67260]: 2025-12-06 08:36:57.917565729 +0000 UTC m=+0.434303115 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc.) Dec 6 03:36:57 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:36:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:36:59 localhost podman[67355]: 2025-12-06 08:36:59.551005513 +0000 UTC m=+0.082868779 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true, batch=17.1_20251118.1) Dec 6 03:36:59 localhost podman[67355]: 2025-12-06 08:36:59.922514159 +0000 UTC m=+0.454377385 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, version=17.1.12, container_name=nova_migration_target, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:36:59 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:37:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:37:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:37:03 localhost podman[67379]: 2025-12-06 08:37:03.553930609 +0000 UTC m=+0.078600388 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 
17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc.) Dec 6 03:37:03 localhost systemd[1]: tmp-crun.63SzUm.mount: Deactivated successfully. Dec 6 03:37:03 localhost podman[67378]: 2025-12-06 08:37:03.608659584 +0000 UTC m=+0.135465228 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 6 03:37:03 localhost podman[67379]: 2025-12-06 08:37:03.624257358 +0000 UTC m=+0.148927137 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, vcs-type=git) Dec 6 03:37:03 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:37:03 localhost podman[67378]: 2025-12-06 08:37:03.676750503 +0000 UTC m=+0.203556097 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4) Dec 6 03:37:03 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:37:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:37:05 localhost podman[67428]: 2025-12-06 08:37:05.554203515 +0000 UTC m=+0.086408779 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
maintainer=OpenStack TripleO Team, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64) Dec 6 03:37:05 localhost podman[67428]: 2025-12-06 08:37:05.564585796 +0000 UTC m=+0.096791040 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, name=rhosp17/openstack-collectd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Dec 6 03:37:05 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:37:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:37:07 localhost podman[67448]: 2025-12-06 08:37:07.54826345 +0000 UTC m=+0.083364742 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044) Dec 6 03:37:07 localhost podman[67448]: 2025-12-06 08:37:07.582271003 +0000 UTC m=+0.117372255 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Dec 6 03:37:07 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:37:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:37:10 localhost podman[67468]: 2025-12-06 08:37:10.552462382 +0000 UTC m=+0.087207892 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:37:10 localhost podman[67468]: 2025-12-06 08:37:10.606538517 +0000 UTC m=+0.141284037 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, release=1761123044, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5) Dec 6 03:37:10 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:37:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:37:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:37:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:37:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:37:28 localhost podman[67494]: 2025-12-06 08:37:28.553553719 +0000 UTC m=+0.086364237 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, 
summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, container_name=logrotate_crond, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=) Dec 6 03:37:28 localhost podman[67494]: 2025-12-06 08:37:28.563130985 +0000 UTC m=+0.095941503 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-cron, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:37:28 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:37:28 localhost podman[67502]: 2025-12-06 08:37:28.609887594 +0000 UTC m=+0.127655556 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, container_name=metrics_qdr) Dec 6 03:37:28 localhost systemd[1]: tmp-crun.zZQt9T.mount: Deactivated successfully. 
Dec 6 03:37:28 localhost podman[67495]: 2025-12-06 08:37:28.660896554 +0000 UTC m=+0.187087807 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4) Dec 6 03:37:28 localhost podman[67501]: 2025-12-06 08:37:28.713693989 +0000 UTC m=+0.235002781 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, vcs-type=git, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:37:28 localhost podman[67495]: 2025-12-06 08:37:28.756222626 +0000 UTC m=+0.282413949 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible) Dec 6 03:37:28 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:37:28 localhost podman[67501]: 2025-12-06 08:37:28.783261734 +0000 UTC m=+0.304570526 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:37:28 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:37:28 localhost podman[67502]: 2025-12-06 08:37:28.840455645 +0000 UTC m=+0.358223647 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, architecture=x86_64, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public) Dec 6 03:37:28 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:37:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:37:30 localhost podman[67594]: 2025-12-06 08:37:30.739762814 +0000 UTC m=+0.072880789 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:37:30 localhost systemd[1]: session-21.scope: Deactivated successfully. Dec 6 03:37:30 localhost systemd[1]: session-21.scope: Consumed 13.789s CPU time. Dec 6 03:37:30 localhost systemd-logind[760]: Session 21 logged out. Waiting for processes to exit. Dec 6 03:37:30 localhost systemd-logind[760]: Removed session 21. 
Dec 6 03:37:31 localhost podman[67594]: 2025-12-06 08:37:31.135276484 +0000 UTC m=+0.468394439 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container) Dec 6 03:37:31 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:37:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:37:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:37:34 localhost systemd[1]: tmp-crun.9iAzRC.mount: Deactivated successfully. 
Dec 6 03:37:34 localhost podman[67617]: 2025-12-06 08:37:34.568186105 +0000 UTC m=+0.100701440 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12) Dec 6 03:37:34 localhost podman[67617]: 2025-12-06 08:37:34.634260261 +0000 UTC m=+0.166775576 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}) Dec 6 03:37:34 localhost systemd[1]: tmp-crun.HxyhvT.mount: Deactivated successfully. Dec 6 03:37:34 localhost podman[67618]: 2025-12-06 08:37:34.663957391 +0000 UTC m=+0.193717591 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public) Dec 6 03:37:34 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:37:34 localhost podman[67618]: 2025-12-06 08:37:34.716333823 +0000 UTC m=+0.246094013 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ovn_controller, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git) Dec 6 03:37:34 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:37:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:37:36 localhost podman[67664]: 2025-12-06 08:37:36.513250771 +0000 UTC m=+0.049238777 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., batch=17.1_20251118.1) Dec 6 03:37:36 localhost podman[67664]: 2025-12-06 08:37:36.523139997 +0000 UTC m=+0.059128023 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, version=17.1.12, com.redhat.component=openstack-collectd-container, tcib_managed=true, architecture=x86_64) Dec 6 03:37:36 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:37:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:37:38 localhost podman[67683]: 2025-12-06 08:37:38.546375755 +0000 UTC m=+0.081685593 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1) Dec 6 03:37:38 localhost podman[67683]: 2025-12-06 08:37:38.585420764 +0000 UTC m=+0.120730622 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container) Dec 6 03:37:38 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: 
Deactivated successfully. Dec 6 03:37:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:37:41 localhost podman[67702]: 2025-12-06 08:37:41.556143379 +0000 UTC m=+0.086661995 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-nova-compute, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, container_name=nova_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:37:41 localhost podman[67702]: 2025-12-06 08:37:41.577838091 +0000 UTC m=+0.108356677 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, 
io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1) Dec 6 03:37:41 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:37:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:37:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:37:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:37:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:37:59 localhost podman[67727]: 2025-12-06 08:37:59.550684541 +0000 UTC m=+0.083352903 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Dec 6 03:37:59 localhost podman[67728]: 2025-12-06 08:37:59.603565829 +0000 UTC m=+0.134779926 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true) Dec 6 03:37:59 localhost systemd[1]: tmp-crun.n2WX6L.mount: Deactivated successfully. 
Dec 6 03:37:59 localhost podman[67729]: 2025-12-06 08:37:59.661083591 +0000 UTC m=+0.187520520 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:37:59 localhost podman[67730]: 2025-12-06 08:37:59.714911858 +0000 UTC m=+0.239624064 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vendor=Red Hat, Inc., tcib_managed=true) Dec 6 03:37:59 localhost podman[67728]: 2025-12-06 08:37:59.728468318 +0000 UTC m=+0.259682365 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, config_id=tripleo_step4) Dec 6 03:37:59 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:37:59 localhost podman[67729]: 2025-12-06 08:37:59.74274846 +0000 UTC m=+0.269185379 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:37:59 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:37:59 localhost podman[67727]: 2025-12-06 08:37:59.783547274 +0000 UTC m=+0.316215666 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64) Dec 6 03:37:59 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:37:59 localhost podman[67730]: 2025-12-06 08:37:59.911380923 +0000 UTC m=+0.436093179 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, architecture=x86_64, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-type=git) Dec 6 03:37:59 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:38:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:38:01 localhost podman[67827]: 2025-12-06 08:38:01.544818817 +0000 UTC m=+0.079928266 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.) Dec 6 03:38:01 localhost podman[67827]: 2025-12-06 08:38:01.938015356 +0000 UTC m=+0.473124825 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=nova_migration_target, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, 
batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 6 03:38:01 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:38:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:38:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:38:05 localhost systemd[1]: tmp-crun.WOSXQc.mount: Deactivated successfully. 
Dec 6 03:38:05 localhost podman[67851]: 2025-12-06 08:38:05.537138255 +0000 UTC m=+0.071609849 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z) Dec 6 03:38:05 localhost podman[67851]: 2025-12-06 08:38:05.572621924 +0000 UTC m=+0.107093478 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:38:05 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:38:05 localhost podman[67852]: 2025-12-06 08:38:05.588194387 +0000 UTC m=+0.118061729 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', 
'/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, release=1761123044, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:38:05 localhost podman[67852]: 2025-12-06 08:38:05.613727217 +0000 UTC m=+0.143594619 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:38:05 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:38:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:38:07 localhost systemd[1]: tmp-crun.tZcGq6.mount: Deactivated successfully. 
Dec 6 03:38:07 localhost podman[67897]: 2025-12-06 08:38:07.554926124 +0000 UTC m=+0.084455918 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container) Dec 6 03:38:07 localhost podman[67897]: 2025-12-06 08:38:07.562640652 +0000 UTC m=+0.092170476 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, container_name=collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, 
vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:38:07 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:38:08 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Dec 6 03:38:08 localhost recover_tripleo_nova_virtqemud[67918]: 51836 Dec 6 03:38:08 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:38:08 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:38:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:38:09 localhost podman[67919]: 2025-12-06 08:38:09.542406973 +0000 UTC m=+0.077254295 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, container_name=iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:38:09 localhost podman[67919]: 2025-12-06 08:38:09.582388092 +0000 UTC m=+0.117235434 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:38:09 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:38:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:38:12 localhost podman[67937]: 2025-12-06 08:38:12.538315099 +0000 UTC m=+0.071757084 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=) Dec 6 03:38:12 localhost podman[67937]: 2025-12-06 08:38:12.594395006 +0000 UTC m=+0.127836971 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-type=git, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, 
batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, container_name=nova_compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:38:12 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:38:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:38:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:38:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:38:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:38:30 localhost podman[67963]: 2025-12-06 08:38:30.539106824 +0000 UTC m=+0.068954066 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:38:30 localhost podman[67963]: 2025-12-06 08:38:30.571121946 +0000 UTC m=+0.100969128 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, url=https://www.redhat.com, release=1761123044, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, name=rhosp17/openstack-cron, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:38:30 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:38:30 localhost podman[67964]: 2025-12-06 08:38:30.614461768 +0000 UTC m=+0.138911174 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64) Dec 6 03:38:30 localhost podman[67964]: 2025-12-06 08:38:30.664435246 +0000 UTC m=+0.188884612 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:38:30 localhost podman[67971]: 2025-12-06 08:38:30.672639591 +0000 UTC m=+0.190151831 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step1, tcib_managed=true, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, distribution-scope=public, 
name=rhosp17/openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:38:30 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:38:30 localhost podman[67965]: 2025-12-06 08:38:30.708286395 +0000 UTC m=+0.229894433 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:38:30 localhost podman[67965]: 2025-12-06 08:38:30.760341337 +0000 UTC m=+0.281949385 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4) Dec 6 03:38:30 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:38:30 localhost podman[67971]: 2025-12-06 08:38:30.890109446 +0000 UTC m=+0.407621736 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:38:30 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:38:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:38:32 localhost podman[68062]: 2025-12-06 08:38:32.540482005 +0000 UTC m=+0.076198432 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, release=1761123044, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:38:32 localhost podman[68062]: 2025-12-06 08:38:32.877198185 +0000 UTC m=+0.412914582 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64) Dec 6 03:38:32 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:38:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:38:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:38:36 localhost systemd[1]: tmp-crun.PyewMv.mount: Deactivated successfully. 
Dec 6 03:38:36 localhost podman[68089]: 2025-12-06 08:38:36.559763078 +0000 UTC m=+0.085230260 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, 
config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Dec 6 03:38:36 localhost podman[68089]: 2025-12-06 08:38:36.575979701 +0000 UTC m=+0.101446883 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z) Dec 6 03:38:36 localhost podman[68088]: 2025-12-06 08:38:36.532380271 +0000 UTC m=+0.065797870 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Dec 6 03:38:36 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:38:36 localhost podman[68088]: 2025-12-06 08:38:36.61341951 +0000 UTC m=+0.146837049 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, release=1761123044, vcs-type=git, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:14:25Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public) Dec 6 03:38:36 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:38:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:38:38 localhost podman[68145]: 2025-12-06 08:38:38.571421857 +0000 UTC m=+0.100022108 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, container_name=collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:38:38 localhost systemd-logind[760]: Existing logind session ID 15 used by new audit session, ignoring. Dec 6 03:38:38 localhost podman[68145]: 2025-12-06 08:38:38.606546796 +0000 UTC m=+0.135146997 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:38:38 localhost systemd[1]: Created slice User Slice of UID 0. Dec 6 03:38:38 localhost systemd[1]: Starting User Runtime Directory /run/user/0... 
Dec 6 03:38:38 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:38:38 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Dec 6 03:38:38 localhost systemd[1]: Starting User Manager for UID 0... Dec 6 03:38:38 localhost systemd[68170]: Queued start job for default target Main User Target. Dec 6 03:38:38 localhost systemd[68170]: Created slice User Application Slice. Dec 6 03:38:38 localhost systemd[68170]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Dec 6 03:38:38 localhost systemd[68170]: Started Daily Cleanup of User's Temporary Directories. Dec 6 03:38:38 localhost systemd[68170]: Reached target Paths. Dec 6 03:38:38 localhost systemd[68170]: Reached target Timers. Dec 6 03:38:38 localhost systemd[68170]: Starting D-Bus User Message Bus Socket... Dec 6 03:38:38 localhost systemd[68170]: Starting Create User's Volatile Files and Directories... Dec 6 03:38:38 localhost systemd[68170]: Listening on D-Bus User Message Bus Socket. Dec 6 03:38:38 localhost systemd[68170]: Reached target Sockets. Dec 6 03:38:38 localhost systemd[68170]: Finished Create User's Volatile Files and Directories. Dec 6 03:38:38 localhost systemd[68170]: Reached target Basic System. Dec 6 03:38:38 localhost systemd[68170]: Reached target Main User Target. Dec 6 03:38:38 localhost systemd[68170]: Startup finished in 157ms. Dec 6 03:38:38 localhost systemd[1]: Started User Manager for UID 0. Dec 6 03:38:38 localhost systemd[1]: Started Session c11 of User root. Dec 6 03:38:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:38:40 localhost podman[68222]: 2025-12-06 08:38:40.563905293 +0000 UTC m=+0.095913702 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:38:40 localhost systemd-logind[760]: Existing logind session ID 15 used by new audit session, ignoring. Dec 6 03:38:40 localhost podman[68222]: 2025-12-06 08:38:40.601351513 +0000 UTC m=+0.133359912 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, architecture=x86_64) Dec 6 03:38:40 localhost systemd[1]: Started Session c12 of User root. Dec 6 03:38:40 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:38:41 localhost kernel: tun: Universal TUN/TAP device driver, 1.6 Dec 6 03:38:41 localhost kernel: device tap227fe5b2-a5 entered promiscuous mode Dec 6 03:38:41 localhost NetworkManager[5965]: [1765010321.7254] manager: (tap227fe5b2-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/13) Dec 6 03:38:41 localhost systemd-udevd[68267]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 03:38:41 localhost NetworkManager[5965]: [1765010321.7491] device (tap227fe5b2-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Dec 6 03:38:41 localhost NetworkManager[5965]: [1765010321.7500] device (tap227fe5b2-a5): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Dec 6 03:38:41 localhost systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 6 03:38:41 localhost systemd[1]: Starting Virtual Machine and Container Registration Service... Dec 6 03:38:41 localhost systemd[1]: Started Virtual Machine and Container Registration Service. Dec 6 03:38:41 localhost systemd-machined[68273]: New machine qemu-1-instance-00000002. Dec 6 03:38:41 localhost systemd[1]: Started Virtual Machine qemu-1-instance-00000002. Dec 6 03:38:42 localhost NetworkManager[5965]: [1765010322.0148] manager: (tap20509a6a-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/14) Dec 6 03:38:42 localhost systemd-udevd[68265]: Network interface NamePolicy= disabled on kernel command line. Dec 6 03:38:42 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap20509a6a-c1: link becomes ready Dec 6 03:38:42 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap20509a6a-c0: link becomes ready Dec 6 03:38:42 localhost NetworkManager[5965]: [1765010322.0550] device (tap20509a6a-c0): carrier: link connected Dec 6 03:38:42 localhost kernel: device tap20509a6a-c0 entered promiscuous mode Dec 6 03:38:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:38:43 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... 
Dec 6 03:38:43 localhost podman[68340]: 2025-12-06 08:38:43.54599956 +0000 UTC m=+0.089623507 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:38:43 localhost podman[68340]: 2025-12-06 08:38:43.578166847 +0000 UTC m=+0.121790734 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1) Dec 6 03:38:43 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:38:43 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Dec 6 03:38:43 localhost podman[68392]: 2025-12-06 08:38:43.791814244 +0000 UTC m=+0.091884937 container create 750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, 
build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true) Dec 6 03:38:43 localhost podman[68392]: 2025-12-06 08:38:43.741723723 +0000 UTC m=+0.041794386 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Dec 6 03:38:43 localhost systemd[1]: Started libpod-conmon-750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565.scope. Dec 6 03:38:43 localhost systemd[1]: tmp-crun.rmZZ0d.mount: Deactivated successfully. Dec 6 03:38:43 localhost systemd[1]: Started libcrun container. Dec 6 03:38:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e74f8bca4fd35e1e90cba34b801511904a738e55eeed6e861124d8e25abc2790/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 03:38:43 localhost podman[68392]: 2025-12-06 08:38:43.891212742 +0000 UTC m=+0.191283435 container init 750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:14:25Z, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public) Dec 6 03:38:43 localhost systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged. Dec 6 03:38:43 localhost podman[68392]: 2025-12-06 08:38:43.901053318 +0000 UTC m=+0.201124011 container start 750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:38:43 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service. Dec 6 03:38:44 localhost setroubleshoot[68341]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. For complete SELinux messages run: sealert -l e72bc9e7-2b90-4cf6-b52d-719ab939eaf0 Dec 6 03:38:44 localhost setroubleshoot[68341]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.#012#012***** Plugin qemu_file_image (98.8 confidence) suggests *******************#012#012If max_map_count is a virtualization target#012Then you need to change the label on max_map_count'#012Do#012# semanage fcontext -a -t virt_image_t 'max_map_count'#012# restorecon -v 'max_map_count'#012#012***** Plugin catchall (2.13 confidence) suggests **************************#012#012If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm#012# semodule -X 300 -i my-qemukvm.pp#012 Dec 6 03:38:45 localhost snmpd[56894]: empty variable list in _query Dec 6 03:38:45 localhost snmpd[56894]: empty variable list in _query Dec 6 03:38:54 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated 
successfully. Dec 6 03:38:54 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Dec 6 03:39:00 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44634 [06/Dec/2025:08:38:58.390] listener listener/metadata 0/0/0/1617/1617 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Dec 6 03:39:00 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44640 [06/Dec/2025:08:39:00.104] listener listener/metadata 0/0/0/75/75 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Dec 6 03:39:00 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44652 [06/Dec/2025:08:39:00.248] listener listener/metadata 0/0/0/70/70 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Dec 6 03:39:00 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44654 [06/Dec/2025:08:39:00.387] listener listener/metadata 0/0/0/69/69 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Dec 6 03:39:00 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44666 [06/Dec/2025:08:39:00.527] listener listener/metadata 0/0/0/72/72 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Dec 6 03:39:00 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44668 [06/Dec/2025:08:39:00.664] listener listener/metadata 0/0/0/69/69 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Dec 6 03:39:00 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44678 [06/Dec/2025:08:39:00.802] listener listener/metadata 0/0/0/71/71 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Dec 6 03:39:00 localhost 
haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44688 [06/Dec/2025:08:39:00.926] listener listener/metadata 0/0/0/68/68 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Dec 6 03:39:01 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44692 [06/Dec/2025:08:39:01.049] listener listener/metadata 0/0/0/70/70 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Dec 6 03:39:01 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44702 [06/Dec/2025:08:39:01.187] listener listener/metadata 0/0/0/72/72 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Dec 6 03:39:01 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44714 [06/Dec/2025:08:39:01.313] listener listener/metadata 0/0/0/70/70 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Dec 6 03:39:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:39:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:39:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:39:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:39:01 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44720 [06/Dec/2025:08:39:01.423] listener listener/metadata 0/0/0/52/52 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Dec 6 03:39:01 localhost systemd[1]: tmp-crun.mYS3XC.mount: Deactivated successfully. 
Dec 6 03:39:01 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44724 [06/Dec/2025:08:39:01.536] listener listener/metadata 0/0/0/55/55 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Dec 6 03:39:01 localhost podman[68444]: 2025-12-06 08:39:01.618283611 +0000 UTC m=+0.134298901 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.buildah.version=1.41.4, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container) Dec 6 03:39:01 localhost podman[68441]: 2025-12-06 08:39:01.668062963 +0000 UTC m=+0.189496041 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond) Dec 6 03:39:01 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44732 [06/Dec/2025:08:39:01.651] listener listener/metadata 0/0/0/62/62 200 127 - - ---- 1/1/0/0/0 0/0 "GET 
/2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Dec 6 03:39:01 localhost podman[68443]: 2025-12-06 08:39:01.735031657 +0000 UTC m=+0.253197993 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, vcs-type=git, container_name=ceilometer_agent_compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:39:01 localhost podman[68442]: 2025-12-06 08:39:01.588673234 +0000 UTC m=+0.109435811 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12) Dec 6 03:39:01 localhost podman[68441]: 2025-12-06 08:39:01.754199571 +0000 UTC m=+0.275632609 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:39:01 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:39:01 localhost podman[68443]: 2025-12-06 08:39:01.769234796 +0000 UTC m=+0.287401162 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO 
Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1761123044, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container) Dec 6 03:39:01 localhost podman[68442]: 2025-12-06 08:39:01.781249399 +0000 UTC m=+0.302012036 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:39:01 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:39:01 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:39:01 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44744 [06/Dec/2025:08:39:01.774] listener listener/metadata 0/0/0/62/62 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Dec 6 03:39:01 localhost podman[68444]: 2025-12-06 08:39:01.886178559 +0000 UTC m=+0.402193899 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:39:01 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:39:01 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[68420]: 192.168.0.189:44748 [06/Dec/2025:08:39:01.882] listener listener/metadata 0/0/0/67/67 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Dec 6 03:39:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:39:03 localhost systemd[1]: tmp-crun.adftwP.mount: Deactivated successfully. 
Dec 6 03:39:03 localhost podman[68544]: 2025-12-06 08:39:03.570308942 +0000 UTC m=+0.102011120 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, description=Red 
Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public) Dec 6 03:39:03 localhost podman[68544]: 2025-12-06 08:39:03.943205872 +0000 UTC m=+0.474908100 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:39:03 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:39:07 localhost podman[68571]: 2025-12-06 08:39:07.556598644 +0000 UTC m=+0.087933964 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) 
Dec 6 03:39:07 localhost podman[68571]: 2025-12-06 08:39:07.598633516 +0000 UTC m=+0.129968776 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, release=1761123044, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent) Dec 6 03:39:07 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:39:07 localhost podman[68572]: 2025-12-06 08:39:07.610786292 +0000 UTC m=+0.138117409 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 
ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z) Dec 6 03:39:07 localhost podman[68572]: 2025-12-06 08:39:07.695304421 +0000 UTC m=+0.222635488 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ovn_controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:39:07 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:39:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:39:09 localhost systemd[1]: tmp-crun.pdQf95.mount: Deactivated successfully. 
Dec 6 03:39:09 localhost podman[68618]: 2025-12-06 08:39:09.568150698 +0000 UTC m=+0.103089944 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public) Dec 6 03:39:09 localhost podman[68618]: 2025-12-06 08:39:09.582173083 +0000 UTC m=+0.117112369 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64) Dec 6 03:39:09 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:39:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:39:11 localhost podman[68638]: 2025-12-06 08:39:11.546890778 +0000 UTC m=+0.078656958 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid) Dec 6 03:39:11 localhost podman[68638]: 2025-12-06 08:39:11.560295303 +0000 UTC m=+0.092061473 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-type=git, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team) Dec 6 03:39:11 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:39:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:39:14 localhost podman[68658]: 2025-12-06 08:39:14.537213 +0000 UTC m=+0.071066361 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5) Dec 6 03:39:14 localhost podman[68658]: 2025-12-06 08:39:14.590279684 +0000 UTC m=+0.124132965 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step5) Dec 6 03:39:14 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:39:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:39:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:39:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:39:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:39:32 localhost podman[68683]: 2025-12-06 08:39:32.560632327 +0000 UTC m=+0.092140785 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 6 03:39:32 localhost podman[68683]: 2025-12-06 08:39:32.568896623 +0000 UTC m=+0.100405101 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:39:32 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:39:32 localhost podman[68689]: 2025-12-06 08:39:32.622187354 +0000 UTC m=+0.143895009 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, version=17.1.12, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vendor=Red Hat, Inc.) Dec 6 03:39:32 localhost podman[68684]: 2025-12-06 08:39:32.658302473 +0000 UTC m=+0.185032644 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:39:32 localhost podman[68685]: 2025-12-06 08:39:32.715653559 +0000 UTC m=+0.238678724 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:39:32 localhost podman[68684]: 2025-12-06 08:39:32.742672526 +0000 UTC m=+0.269402657 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:39:32 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:39:32 localhost podman[68685]: 2025-12-06 08:39:32.77089624 +0000 UTC m=+0.293921415 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container) Dec 6 03:39:32 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:39:32 localhost podman[68689]: 2025-12-06 08:39:32.822186948 +0000 UTC m=+0.343894643 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git) Dec 6 03:39:32 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:39:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:39:34 localhost podman[68784]: 2025-12-06 08:39:34.593644997 +0000 UTC m=+0.086336245 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team) Dec 6 03:39:34 localhost podman[68784]: 2025-12-06 08:39:34.973254715 +0000 UTC m=+0.465945953 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4) Dec 6 03:39:34 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:39:38 localhost podman[68806]: 2025-12-06 08:39:38.552623703 +0000 UTC m=+0.085735437 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, vcs-type=git, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.buildah.version=1.41.4) Dec 6 03:39:38 localhost podman[68807]: 2025-12-06 08:39:38.613456917 +0000 UTC m=+0.142374961 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 03:39:38 localhost podman[68806]: 2025-12-06 08:39:38.618137062 +0000 UTC m=+0.151248746 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 6 03:39:38 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:39:38 localhost podman[68807]: 2025-12-06 08:39:38.667308566 +0000 UTC m=+0.196226580 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 
'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 6 03:39:38 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:39:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:39:40 localhost systemd[1]: tmp-crun.17bzLk.mount: Deactivated successfully. 
Dec 6 03:39:40 localhost podman[68865]: 2025-12-06 08:39:40.549906027 +0000 UTC m=+0.084549580 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=collectd, tcib_managed=true, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12) Dec 6 03:39:40 localhost podman[68865]: 2025-12-06 08:39:40.587183912 +0000 UTC m=+0.121827465 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, container_name=collectd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, com.redhat.component=openstack-collectd-container) Dec 6 03:39:40 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:39:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:39:42 localhost podman[68886]: 2025-12-06 08:39:42.547742579 +0000 UTC m=+0.083315222 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-iscsid-container) Dec 6 03:39:42 localhost podman[68886]: 2025-12-06 08:39:42.559381929 +0000 UTC m=+0.094954542 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.) Dec 6 03:39:42 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:39:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:39:45 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:39:45 localhost recover_tripleo_nova_virtqemud[68911]: 51836 Dec 6 03:39:45 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:39:45 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:39:45 localhost systemd[1]: tmp-crun.gN7JPf.mount: Deactivated successfully. 
Dec 6 03:39:45 localhost podman[68905]: 2025-12-06 08:39:45.569257737 +0000 UTC m=+0.099261285 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step5, vcs-type=git, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true) Dec 6 03:39:45 localhost podman[68905]: 2025-12-06 08:39:45.598554885 +0000 UTC m=+0.128558403 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red 
Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team) Dec 6 03:39:45 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:40:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:40:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:40:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:40:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:40:03 localhost systemd[1]: tmp-crun.kGlmNN.mount: Deactivated successfully. 
Dec 6 03:40:03 localhost podman[68935]: 2025-12-06 08:40:03.566471453 +0000 UTC m=+0.097587904 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, tcib_managed=true, 
container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Dec 6 03:40:03 localhost podman[68935]: 2025-12-06 08:40:03.578302819 +0000 UTC m=+0.109419290 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-cron-container, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Dec 6 03:40:03 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:40:03 localhost podman[68937]: 2025-12-06 08:40:03.665642144 +0000 UTC m=+0.191636646 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, container_name=ceilometer_agent_compute, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:40:03 localhost podman[68936]: 2025-12-06 08:40:03.713628551 +0000 UTC m=+0.242741240 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:40:03 localhost podman[68937]: 2025-12-06 08:40:03.731244716 +0000 UTC m=+0.257239178 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1761123044, batch=17.1_20251118.1) Dec 6 03:40:03 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:40:03 localhost podman[68936]: 2025-12-06 08:40:03.745352114 +0000 UTC m=+0.274464803 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Dec 6 03:40:03 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:40:03 localhost podman[68938]: 2025-12-06 08:40:03.813230486 +0000 UTC m=+0.336994520 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc.) Dec 6 03:40:04 localhost podman[68938]: 2025-12-06 08:40:04.009510436 +0000 UTC m=+0.533274460 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 6 03:40:04 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:40:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:40:05 localhost systemd[1]: tmp-crun.T0wEHN.mount: Deactivated successfully. Dec 6 03:40:05 localhost podman[69037]: 2025-12-06 08:40:05.549625919 +0000 UTC m=+0.083499367 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 
17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team) Dec 6 03:40:05 localhost podman[69037]: 2025-12-06 08:40:05.94515431 +0000 UTC m=+0.479027698 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=nova_migration_target) Dec 6 03:40:05 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:40:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. 
Dec 6 03:40:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:40:09 localhost podman[69074]: 2025-12-06 08:40:09.561801451 +0000 UTC m=+0.087575353 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:40:09 localhost podman[69075]: 2025-12-06 08:40:09.610408397 +0000 UTC m=+0.130681009 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, 
io.openshift.expose-services=, container_name=ovn_controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T23:34:05Z, version=17.1.12, distribution-scope=public, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team) Dec 6 03:40:09 localhost podman[69075]: 2025-12-06 08:40:09.635222465 +0000 UTC m=+0.155495097 container exec_died 
2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4) Dec 6 03:40:09 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:40:09 localhost podman[69074]: 2025-12-06 08:40:09.686734511 +0000 UTC m=+0.212508413 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:40:09 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:40:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:40:11 localhost podman[69120]: 2025-12-06 08:40:11.569988612 +0000 UTC m=+0.094339004 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, release=1761123044, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Dec 6 03:40:11 localhost podman[69120]: 2025-12-06 08:40:11.608224756 +0000 UTC m=+0.132575118 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.) Dec 6 03:40:11 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:40:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:40:13 localhost podman[69140]: 2025-12-06 08:40:13.59292791 +0000 UTC m=+0.076961335 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, release=1761123044, version=17.1.12, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, 
maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:40:13 localhost podman[69140]: 2025-12-06 08:40:13.608246084 +0000 UTC m=+0.092279579 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1) Dec 6 03:40:13 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:40:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:40:16 localhost systemd[1]: tmp-crun.aS4NaR.mount: Deactivated successfully. 
Dec 6 03:40:16 localhost podman[69160]: 2025-12-06 08:40:16.555280385 +0000 UTC m=+0.090495144 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:40:16 localhost podman[69160]: 2025-12-06 08:40:16.613438597 +0000 UTC m=+0.148653366 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, 
vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:40:16 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:40:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:40:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:40:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:40:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:40:34 localhost systemd[1]: tmp-crun.Bx99zA.mount: Deactivated successfully. 
Dec 6 03:40:34 localhost podman[69188]: 2025-12-06 08:40:34.577393788 +0000 UTC m=+0.091768427 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public) Dec 6 03:40:34 localhost podman[69187]: 2025-12-06 08:40:34.625692625 +0000 UTC m=+0.142160638 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Dec 6 03:40:34 localhost podman[69188]: 2025-12-06 08:40:34.655463179 +0000 UTC m=+0.169837878 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:40:34 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:40:34 localhost podman[69187]: 2025-12-06 08:40:34.683314975 +0000 UTC m=+0.199782998 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, distribution-scope=public) Dec 6 03:40:34 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:40:34 localhost podman[69189]: 2025-12-06 08:40:34.738545462 +0000 UTC m=+0.249978232 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-18T22:49:46Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:40:34 localhost podman[69186]: 2025-12-06 08:40:34.784753116 +0000 UTC m=+0.305805278 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=logrotate_crond, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, architecture=x86_64, config_id=tripleo_step4, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:40:34 localhost podman[69186]: 2025-12-06 08:40:34.794294235 +0000 UTC m=+0.315346427 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO 
Team, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:40:34 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:40:34 localhost podman[69189]: 2025-12-06 08:40:34.940808944 +0000 UTC m=+0.452241694 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, build-date=2025-11-18T22:49:46Z) Dec 6 03:40:34 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:40:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:40:36 localhost podman[69289]: 2025-12-06 08:40:36.531289892 +0000 UTC m=+0.068626015 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:40:36 localhost podman[69289]: 2025-12-06 08:40:36.931354101 +0000 UTC m=+0.468690254 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4) Dec 6 03:40:36 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:40:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:40:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:40:40 localhost podman[69313]: 2025-12-06 08:40:40.551346489 +0000 UTC m=+0.082923110 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, vcs-type=git, version=17.1.12, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, container_name=ovn_controller, distribution-scope=public, architecture=x86_64) Dec 6 03:40:40 localhost podman[69312]: 2025-12-06 08:40:40.60506655 +0000 UTC m=+0.139911110 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:40:40 localhost podman[69313]: 2025-12-06 08:40:40.62747842 +0000 UTC m=+0.159055071 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, 
distribution-scope=public, com.redhat.component=openstack-ovn-controller-container) Dec 6 03:40:40 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:40:40 localhost podman[69312]: 2025-12-06 08:40:40.649350825 +0000 UTC m=+0.184195425 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:40:40 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:40:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:40:42 localhost podman[69372]: 2025-12-06 08:40:42.541010279 +0000 UTC m=+0.075286298 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vcs-type=git, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:40:42 localhost podman[69372]: 2025-12-06 08:40:42.554139368 +0000 UTC m=+0.088415387 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:40:42 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:40:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:40:44 localhost podman[69392]: 2025-12-06 08:40:44.540252759 +0000 UTC m=+0.074993878 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, version=17.1.12, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container) Dec 6 03:40:44 localhost podman[69392]: 2025-12-06 08:40:44.552410938 +0000 UTC m=+0.087152077 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T23:44:13Z, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 6 03:40:44 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:40:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:40:47 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:40:47 localhost recover_tripleo_nova_virtqemud[69418]: 51836 Dec 6 03:40:47 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:40:47 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 6 03:40:47 localhost podman[69411]: 2025-12-06 08:40:47.551770579 +0000 UTC m=+0.083233739 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step5, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z) Dec 6 03:40:47 localhost podman[69411]: 2025-12-06 08:40:47.581348047 +0000 UTC m=+0.112811177 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step5, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, 
managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:40:47 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:41:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:41:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:41:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:41:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:41:05 localhost podman[69442]: 2025-12-06 08:41:05.570042239 +0000 UTC m=+0.096709508 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute) Dec 6 03:41:05 localhost podman[69443]: 2025-12-06 08:41:05.612687044 +0000 UTC m=+0.136061903 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr) Dec 6 03:41:05 localhost systemd[1]: tmp-crun.pYuNKF.mount: Deactivated successfully. 
Dec 6 03:41:05 localhost podman[69440]: 2025-12-06 08:41:05.660709072 +0000 UTC m=+0.192172156 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:41:05 localhost podman[69440]: 2025-12-06 08:41:05.671271033 +0000 UTC m=+0.202734077 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Dec 6 03:41:05 localhost podman[69442]: 2025-12-06 08:41:05.679752661 +0000 UTC m=+0.206419990 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:41:05 localhost systemd[1]: 
23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:41:05 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:41:05 localhost podman[69441]: 2025-12-06 08:41:05.786354058 +0000 UTC m=+0.316451511 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:41:05 localhost podman[69441]: 2025-12-06 08:41:05.816307218 +0000 UTC m=+0.346404681 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:41:05 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:41:05 localhost podman[69443]: 2025-12-06 08:41:05.848358361 +0000 UTC m=+0.371733180 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Dec 6 03:41:05 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:41:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:41:07 localhost systemd[1]: tmp-crun.gQbpD7.mount: Deactivated successfully. 
Dec 6 03:41:07 localhost podman[69544]: 2025-12-06 08:41:07.544595941 +0000 UTC m=+0.084031073 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:41:07 localhost podman[69544]: 2025-12-06 08:41:07.91139789 +0000 UTC m=+0.450833022 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=nova_migration_target, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, distribution-scope=public, release=1761123044, architecture=x86_64, com.redhat.component=openstack-nova-compute-container) Dec 6 03:41:07 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:41:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:41:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:41:11 localhost systemd[1]: tmp-crun.NFvzKA.mount: Deactivated successfully. 
Dec 6 03:41:11 localhost podman[69568]: 2025-12-06 08:41:11.562952506 +0000 UTC m=+0.092970984 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:14:25Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:41:11 localhost podman[69569]: 2025-12-06 08:41:11.617235874 +0000 UTC m=+0.144164219 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:41:11 localhost podman[69568]: 2025-12-06 08:41:11.640322996 +0000 UTC m=+0.170341444 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 6 03:41:11 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:41:11 localhost podman[69569]: 2025-12-06 08:41:11.668282135 +0000 UTC m=+0.195210500 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1) Dec 6 03:41:11 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:41:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:41:13 localhost podman[69615]: 2025-12-06 08:41:13.553312798 +0000 UTC m=+0.087233241 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, architecture=x86_64, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red 
Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:41:13 localhost podman[69615]: 2025-12-06 08:41:13.564449736 +0000 UTC m=+0.098370189 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, release=1761123044, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step3) Dec 6 03:41:13 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:41:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:41:15 localhost systemd[1]: tmp-crun.5E0WQx.mount: Deactivated successfully. Dec 6 03:41:15 localhost podman[69635]: 2025-12-06 08:41:15.528470817 +0000 UTC m=+0.064108108 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-iscsid, container_name=iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, distribution-scope=public) Dec 6 03:41:15 localhost podman[69635]: 2025-12-06 08:41:15.563816079 +0000 UTC m=+0.099453390 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, 
summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, release=1761123044, tcib_managed=true, name=rhosp17/openstack-iscsid, architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:41:15 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:41:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:41:18 localhost systemd[1]: tmp-crun.dVPah5.mount: Deactivated successfully. Dec 6 03:41:18 localhost podman[69655]: 2025-12-06 08:41:18.544716649 +0000 UTC m=+0.079903137 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:41:18 localhost podman[69655]: 2025-12-06 08:41:18.565384198 +0000 UTC m=+0.100570706 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_compute, build-date=2025-11-19T00:36:58Z, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12) Dec 6 03:41:18 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:41:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:41:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:41:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:41:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:41:36 localhost systemd[1]: tmp-crun.OlFA7T.mount: Deactivated successfully. 
Dec 6 03:41:36 localhost podman[69685]: 2025-12-06 08:41:36.543074181 +0000 UTC m=+0.077076621 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:41:36 localhost systemd[1]: tmp-crun.0C21pT.mount: Deactivated successfully. Dec 6 03:41:36 localhost podman[69685]: 2025-12-06 08:41:36.587986615 +0000 UTC m=+0.121988995 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, release=1761123044) Dec 6 03:41:36 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:41:36 localhost podman[69683]: 2025-12-06 08:41:36.635804517 +0000 UTC m=+0.170658023 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 6 03:41:36 localhost podman[69686]: 2025-12-06 08:41:36.588869802 +0000 UTC m=+0.120098498 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, container_name=metrics_qdr, release=1761123044, tcib_managed=true, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team) Dec 6 03:41:36 localhost podman[69684]: 2025-12-06 08:41:36.705891435 +0000 UTC m=+0.237818243 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Dec 6 03:41:36 localhost podman[69683]: 2025-12-06 08:41:36.717287321 +0000 UTC m=+0.252140787 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Dec 6 03:41:36 localhost podman[69684]: 2025-12-06 08:41:36.726838082 +0000 UTC m=+0.258764890 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:41:36 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:41:36 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:41:36 localhost podman[69686]: 2025-12-06 08:41:36.802611343 +0000 UTC m=+0.333840049 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, version=17.1.12, architecture=x86_64, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.buildah.version=1.41.4, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible) Dec 6 03:41:36 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:41:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:41:38 localhost podman[69784]: 2025-12-06 08:41:38.545498758 +0000 UTC m=+0.081196327 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, release=1761123044, io.openshift.expose-services=, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:41:38 localhost podman[69784]: 2025-12-06 08:41:38.919222257 +0000 UTC m=+0.454919766 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc.) Dec 6 03:41:38 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:41:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:41:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:41:42 localhost systemd[1]: tmp-crun.B0U8TZ.mount: Deactivated successfully. 
Dec 6 03:41:42 localhost podman[69807]: 2025-12-06 08:41:42.543062712 +0000 UTC m=+0.076490893 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z) Dec 6 03:41:42 localhost podman[69808]: 2025-12-06 08:41:42.583217961 +0000 UTC m=+0.111549609 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}) Dec 6 03:41:42 localhost podman[69807]: 2025-12-06 08:41:42.619873304 +0000 UTC m=+0.153301505 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:41:42 localhost podman[69808]: 2025-12-06 08:41:42.629385134 +0000 UTC m=+0.157716792 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, container_name=ovn_controller, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1) Dec 6 03:41:42 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:41:42 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:41:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:41:44 localhost podman[69868]: 2025-12-06 08:41:44.551813451 +0000 UTC m=+0.082335231 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:41:44 localhost podman[69868]: 2025-12-06 08:41:44.567326561 +0000 UTC m=+0.097848301 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64) Dec 6 03:41:44 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:41:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:41:46 localhost systemd[1]: tmp-crun.SkeY34.mount: Deactivated successfully. Dec 6 03:41:46 localhost podman[69888]: 2025-12-06 08:41:46.547921435 +0000 UTC m=+0.082964559 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 6 03:41:46 localhost podman[69888]: 2025-12-06 08:41:46.582253389 +0000 UTC m=+0.117296453 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, tcib_managed=true, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com) Dec 6 03:41:46 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:41:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:41:49 localhost podman[69909]: 2025-12-06 08:41:49.55593757 +0000 UTC m=+0.092007325 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, release=1761123044, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Dec 6 03:41:49 localhost podman[69909]: 2025-12-06 08:41:49.586360684 +0000 UTC m=+0.122430429 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, 
io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, config_id=tripleo_step5, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:41:49 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:42:07 localhost systemd[1]: tmp-crun.LOP92n.mount: Deactivated successfully. 
Dec 6 03:42:07 localhost podman[69936]: 2025-12-06 08:42:07.599073084 +0000 UTC m=+0.121990526 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:42:07 localhost systemd[1]: tmp-crun.WYxCkz.mount: Deactivated successfully. Dec 6 03:42:07 localhost podman[69935]: 2025-12-06 08:42:07.627804746 +0000 UTC m=+0.156392610 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, batch=17.1_20251118.1, release=1761123044, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 6 03:42:07 localhost podman[69937]: 2025-12-06 08:42:07.680666292 +0000 UTC m=+0.199200380 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, 
io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, architecture=x86_64) Dec 6 03:42:07 localhost podman[69935]: 2025-12-06 08:42:07.709031313 +0000 UTC m=+0.237619117 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, tcib_managed=true, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=logrotate_crond) Dec 6 03:42:07 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:42:07 localhost podman[69943]: 2025-12-06 08:42:07.737553249 +0000 UTC m=+0.254665984 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, config_id=tripleo_step1, vcs-type=git, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z) Dec 6 03:42:07 localhost podman[69937]: 2025-12-06 08:42:07.763686693 +0000 UTC m=+0.282220801 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64) Dec 6 03:42:07 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:42:07 localhost podman[69936]: 2025-12-06 08:42:07.786355421 +0000 UTC m=+0.309272823 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, release=1761123044, 
architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, build-date=2025-11-19T00:12:45Z, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:42:07 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:42:07 localhost podman[69943]: 2025-12-06 08:42:07.954766475 +0000 UTC m=+0.471879130 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:42:07 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:42:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:42:09 localhost podman[70036]: 2025-12-06 08:42:09.549211023 +0000 UTC m=+0.084707893 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-nova-compute, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z) Dec 6 03:42:09 localhost podman[70036]: 2025-12-06 08:42:09.887663861 +0000 UTC m=+0.423160731 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, release=1761123044, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
container_name=nova_migration_target, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:42:09 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:42:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:42:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:42:13 localhost podman[70060]: 2025-12-06 08:42:13.555913812 +0000 UTC m=+0.089360625 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 6 03:42:13 localhost podman[70061]: 2025-12-06 08:42:13.61609066 +0000 UTC m=+0.146419437 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, build-date=2025-11-18T23:34:05Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4) Dec 6 03:42:13 localhost podman[70060]: 2025-12-06 08:42:13.622390511 +0000 UTC m=+0.155837294 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, architecture=x86_64) Dec 6 03:42:13 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:42:13 localhost podman[70061]: 2025-12-06 08:42:13.663761937 +0000 UTC m=+0.194090664 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, config_id=tripleo_step4, container_name=ovn_controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:42:13 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:42:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:42:15 localhost systemd[1]: tmp-crun.kFgUwl.mount: Deactivated successfully. Dec 6 03:42:15 localhost podman[70109]: 2025-12-06 08:42:15.56128496 +0000 UTC m=+0.096227253 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=collectd, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, version=17.1.12) Dec 6 03:42:15 localhost podman[70109]: 2025-12-06 08:42:15.595737236 +0000 UTC m=+0.130679519 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.buildah.version=1.41.4, architecture=x86_64, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-collectd, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.) Dec 6 03:42:15 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:42:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:42:17 localhost podman[70130]: 2025-12-06 08:42:17.544316018 +0000 UTC m=+0.079207086 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, container_name=iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1) Dec 6 03:42:17 localhost podman[70130]: 2025-12-06 08:42:17.557268402 +0000 UTC m=+0.092159440 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:42:17 localhost systemd[1]: 
3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:42:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:42:20 localhost podman[70149]: 2025-12-06 08:42:20.552353242 +0000 UTC m=+0.084416004 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 6 03:42:20 localhost podman[70149]: 2025-12-06 08:42:20.580146746 +0000 UTC m=+0.112209468 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, batch=17.1_20251118.1, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:42:20 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:42:38 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:42:38 localhost recover_tripleo_nova_virtqemud[70195]: 51836 Dec 6 03:42:38 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:42:38 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 6 03:42:38 localhost systemd[1]: tmp-crun.nFEcmM.mount: Deactivated successfully. Dec 6 03:42:38 localhost podman[70175]: 2025-12-06 08:42:38.580065402 +0000 UTC m=+0.109969320 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond) Dec 6 03:42:38 localhost podman[70179]: 2025-12-06 08:42:38.581369842 +0000 UTC m=+0.097459010 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, io.openshift.expose-services=, 
url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:42:38 localhost podman[70175]: 2025-12-06 08:42:38.619228492 +0000 UTC m=+0.149132410 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack 
Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:42:38 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:42:38 localhost podman[70176]: 2025-12-06 08:42:38.623390688 +0000 UTC m=+0.149047077 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Dec 6 03:42:38 localhost podman[70176]: 2025-12-06 08:42:38.707302017 +0000 UTC m=+0.232958346 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Dec 6 03:42:38 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:42:38 localhost podman[70177]: 2025-12-06 08:42:38.676985916 +0000 UTC m=+0.200592363 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git) Dec 6 03:42:38 localhost podman[70177]: 2025-12-06 08:42:38.760332686 +0000 UTC m=+0.283939133 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4) Dec 6 03:42:38 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:42:38 localhost podman[70179]: 2025-12-06 08:42:38.796157334 +0000 UTC m=+0.312246492 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:42:38 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:42:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:42:40 localhost podman[70277]: 2025-12-06 08:42:40.553906502 +0000 UTC m=+0.090479449 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, 
build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:42:40 localhost podman[70277]: 2025-12-06 08:42:40.883324955 +0000 UTC m=+0.419897972 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=) Dec 6 03:42:40 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:42:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:42:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:42:44 localhost podman[70300]: 2025-12-06 08:42:44.549029381 +0000 UTC m=+0.082580609 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Dec 6 03:42:44 localhost systemd[1]: tmp-crun.Q9X8U6.mount: Deactivated successfully. 
Dec 6 03:42:44 localhost podman[70300]: 2025-12-06 08:42:44.61847852 +0000 UTC m=+0.152029798 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 6 03:42:44 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:42:44 localhost podman[70301]: 2025-12-06 08:42:44.622794221 +0000 UTC m=+0.151210953 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044) Dec 6 03:42:44 localhost podman[70301]: 2025-12-06 08:42:44.708427502 +0000 UTC m=+0.236844244 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible) Dec 6 03:42:44 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:42:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:42:46 localhost podman[70361]: 2025-12-06 08:42:46.562896826 +0000 UTC m=+0.086919100 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:42:46 localhost podman[70361]: 2025-12-06 08:42:46.573808547 +0000 UTC m=+0.097830792 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3) Dec 6 03:42:46 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:42:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:42:48 localhost systemd[1]: tmp-crun.lz2XPB.mount: Deactivated successfully. Dec 6 03:42:48 localhost podman[70379]: 2025-12-06 08:42:48.571982356 +0000 UTC m=+0.100505304 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-iscsid-container, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12) Dec 6 03:42:48 localhost podman[70379]: 2025-12-06 08:42:48.585313191 +0000 UTC m=+0.113836139 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=iscsid) Dec 6 03:42:48 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:42:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:42:51 localhost podman[70398]: 2025-12-06 08:42:51.54744543 +0000 UTC m=+0.082574338 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team) Dec 6 03:42:51 localhost podman[70398]: 2025-12-06 08:42:51.601212202 +0000 UTC m=+0.136341120 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, 
distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1) Dec 6 03:42:51 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:43:09 localhost podman[70426]: 2025-12-06 08:43:09.589960151 +0000 UTC m=+0.115228851 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:11:48Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible) Dec 6 03:43:09 localhost podman[70428]: 2025-12-06 08:43:09.613966409 +0000 UTC m=+0.129990358 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Dec 6 03:43:09 localhost podman[70424]: 2025-12-06 08:43:09.642272998 +0000 UTC m=+0.171302512 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., 
distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, 
summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true) Dec 6 03:43:09 localhost podman[70425]: 2025-12-06 08:43:09.696934729 +0000 UTC m=+0.220974842 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.) Dec 6 03:43:09 localhost podman[70425]: 2025-12-06 08:43:09.72070686 +0000 UTC m=+0.244746963 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1761123044, batch=17.1_20251118.1) Dec 6 03:43:09 localhost podman[70424]: 2025-12-06 08:43:09.727215639 +0000 UTC m=+0.256245153 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, architecture=x86_64, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible) Dec 6 03:43:09 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:43:09 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:43:09 localhost podman[70426]: 2025-12-06 08:43:09.773999299 +0000 UTC m=+0.299268029 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:43:09 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:43:09 localhost podman[70428]: 2025-12-06 08:43:09.797355938 +0000 UTC m=+0.313379897 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, 
distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step1, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:43:09 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:43:10 localhost systemd[1]: tmp-crun.E0HRse.mount: Deactivated successfully. Dec 6 03:43:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:43:11 localhost podman[70523]: 2025-12-06 08:43:11.544341518 +0000 UTC m=+0.078923667 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, 
url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Dec 6 03:43:11 localhost podman[70523]: 2025-12-06 08:43:11.922195832 +0000 UTC m=+0.456777951 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z) Dec 6 03:43:11 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:43:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:43:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:43:15 localhost systemd[1]: tmp-crun.6cQLfA.mount: Deactivated successfully. 
Dec 6 03:43:15 localhost podman[70546]: 2025-12-06 08:43:15.568143709 +0000 UTC m=+0.099246255 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, container_name=ovn_metadata_agent) Dec 6 03:43:15 localhost podman[70546]: 2025-12-06 08:43:15.617399644 +0000 UTC m=+0.148502240 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, container_name=ovn_metadata_agent, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:43:15 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:43:15 localhost systemd[1]: tmp-crun.a4OCkV.mount: Deactivated successfully. Dec 6 03:43:15 localhost podman[70547]: 2025-12-06 08:43:15.710553453 +0000 UTC m=+0.236068180 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12) Dec 6 03:43:15 localhost podman[70547]: 2025-12-06 08:43:15.743294377 +0000 UTC m=+0.268809104 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vcs-type=git, distribution-scope=public, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z) Dec 6 03:43:15 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:43:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:43:17 localhost podman[70594]: 2025-12-06 08:43:17.552005312 +0000 UTC m=+0.083520827 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 6 03:43:17 localhost podman[70594]: 2025-12-06 08:43:17.563295325 +0000 UTC m=+0.094810860 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-collectd, tcib_managed=true, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, version=17.1.12) Dec 6 03:43:17 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:43:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:43:19 localhost podman[70614]: 2025-12-06 08:43:19.54754667 +0000 UTC m=+0.082548277 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 6 03:43:19 localhost podman[70614]: 2025-12-06 08:43:19.586622437 +0000 UTC m=+0.121624054 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1761123044, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-iscsid-container) Dec 6 03:43:19 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:43:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:43:22 localhost podman[70633]: 2025-12-06 08:43:22.547656544 +0000 UTC m=+0.081708932 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git) Dec 6 03:43:22 localhost podman[70633]: 2025-12-06 08:43:22.596181487 +0000 UTC m=+0.130233815 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step5, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:43:22 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:43:40 localhost systemd[68170]: Created slice User Background Tasks Slice. 
Dec 6 03:43:40 localhost podman[70660]: 2025-12-06 08:43:40.562211896 +0000 UTC m=+0.086437646 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, maintainer=OpenStack TripleO Team) Dec 6 03:43:40 localhost systemd[68170]: Starting Cleanup of User's Temporary Files and Directories... Dec 6 03:43:40 localhost systemd[68170]: Finished Cleanup of User's Temporary Files and Directories. Dec 6 03:43:40 localhost podman[70660]: 2025-12-06 08:43:40.616294048 +0000 UTC m=+0.140519808 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:43:40 localhost systemd[1]: tmp-crun.5TREgr.mount: Deactivated successfully. Dec 6 03:43:40 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:43:40 localhost podman[70658]: 2025-12-06 08:43:40.626090815 +0000 UTC m=+0.156106991 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, vcs-type=git, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true) Dec 6 03:43:40 localhost podman[70659]: 2025-12-06 08:43:40.678870898 +0000 UTC m=+0.204507682 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64) Dec 6 03:43:40 localhost podman[70658]: 2025-12-06 08:43:40.760670981 +0000 UTC m=+0.290687227 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, tcib_managed=true, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1) Dec 6 03:43:40 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:43:40 localhost podman[70662]: 2025-12-06 08:43:40.734167767 +0000 UTC m=+0.253098626 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:43:40 localhost podman[70659]: 2025-12-06 08:43:40.815188267 +0000 UTC m=+0.340825021 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Dec 6 03:43:40 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:43:40 localhost podman[70662]: 2025-12-06 08:43:40.949317321 +0000 UTC m=+0.468248170 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:43:40 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:43:41 localhost systemd[1]: tmp-crun.967BA5.mount: Deactivated successfully. Dec 6 03:43:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:43:42 localhost podman[70760]: 2025-12-06 08:43:42.545734019 +0000 UTC m=+0.077417012 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 6 03:43:42 localhost podman[70760]: 2025-12-06 08:43:42.877246656 +0000 UTC m=+0.408929569 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, container_name=nova_migration_target) Dec 6 03:43:42 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:43:46 localhost systemd[1]: tmp-crun.74JTGA.mount: Deactivated successfully. 
Dec 6 03:43:46 localhost podman[70784]: 2025-12-06 08:43:46.547621553 +0000 UTC m=+0.081385313 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:43:46 localhost podman[70784]: 2025-12-06 08:43:46.581198212 +0000 UTC m=+0.114961982 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ovn_metadata_agent, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git) Dec 6 03:43:46 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:43:46 localhost systemd[1]: tmp-crun.MS1TOS.mount: Deactivated successfully. Dec 6 03:43:46 localhost podman[70785]: 2025-12-06 08:43:46.610911925 +0000 UTC m=+0.141138257 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, version=17.1.12, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, 
com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z) Dec 6 03:43:46 localhost podman[70785]: 2025-12-06 08:43:46.65621064 +0000 UTC m=+0.186436972 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., 
container_name=ovn_controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1761123044) Dec 6 03:43:46 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:43:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:43:48 localhost podman[70843]: 2025-12-06 08:43:48.550939126 +0000 UTC m=+0.085118576 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=) Dec 6 03:43:48 localhost podman[70843]: 2025-12-06 08:43:48.565261531 +0000 UTC m=+0.099440971 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1761123044, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 
'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Dec 6 03:43:48 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:43:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:43:50 localhost systemd[1]: tmp-crun.JDkYbW.mount: Deactivated successfully. Dec 6 03:43:50 localhost podman[70864]: 2025-12-06 08:43:50.543966128 +0000 UTC m=+0.079172766 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, container_name=iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true) Dec 6 03:43:50 localhost podman[70864]: 2025-12-06 08:43:50.551834927 +0000 UTC m=+0.087041545 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, config_id=tripleo_step3, container_name=iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container) Dec 6 03:43:50 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:43:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:43:53 localhost podman[70885]: 2025-12-06 08:43:53.550127515 +0000 UTC m=+0.085049694 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, version=17.1.12) Dec 6 03:43:53 localhost podman[70885]: 2025-12-06 08:43:53.579084284 +0000 UTC m=+0.114006473 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red 
Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z) Dec 6 03:43:53 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:44:11 localhost systemd[1]: tmp-crun.G5Klb2.mount: Deactivated successfully. 
Dec 6 03:44:11 localhost podman[70912]: 2025-12-06 08:44:11.559211999 +0000 UTC m=+0.092636804 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.openshift.expose-services=, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, 
io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 6 03:44:11 localhost podman[70912]: 2025-12-06 08:44:11.563558431 +0000 UTC m=+0.096983226 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, distribution-scope=public, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git) Dec 6 03:44:11 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:44:11 localhost podman[70918]: 2025-12-06 08:44:11.606945639 +0000 UTC m=+0.129385520 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, 
maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:46Z, version=17.1.12, vcs-type=git, architecture=x86_64) Dec 6 03:44:11 localhost podman[70913]: 2025-12-06 08:44:11.670166489 +0000 UTC m=+0.195654543 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Dec 6 03:44:11 localhost podman[70914]: 2025-12-06 08:44:11.715433653 +0000 UTC m=+0.241263317 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, 
com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044) Dec 6 03:44:11 localhost podman[70913]: 2025-12-06 08:44:11.72518962 +0000 UTC m=+0.250677624 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Dec 6 03:44:11 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:44:11 localhost podman[70914]: 2025-12-06 08:44:11.773410444 +0000 UTC m=+0.299240118 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=ceilometer_agent_compute, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z) Dec 6 03:44:11 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:44:11 localhost podman[70918]: 2025-12-06 08:44:11.800869028 +0000 UTC m=+0.323308889 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Dec 6 03:44:11 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:44:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:44:13 localhost podman[71010]: 2025-12-06 08:44:13.565314109 +0000 UTC m=+0.090624344 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, version=17.1.12, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=nova_migration_target, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com) Dec 6 03:44:13 localhost podman[71010]: 2025-12-06 08:44:13.956316892 +0000 UTC m=+0.481627127 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Dec 6 03:44:13 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:44:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:44:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:44:17 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:44:17 localhost recover_tripleo_nova_virtqemud[71042]: 51836 Dec 6 03:44:17 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:44:17 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:44:17 localhost systemd[1]: tmp-crun.Fmn61J.mount: Deactivated successfully. Dec 6 03:44:17 localhost podman[71034]: 2025-12-06 08:44:17.56713287 +0000 UTC m=+0.100478072 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Dec 6 03:44:17 localhost podman[71034]: 2025-12-06 08:44:17.609097775 +0000 UTC m=+0.142442947 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1) Dec 6 03:44:17 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:44:17 localhost podman[71035]: 2025-12-06 08:44:17.655399951 +0000 UTC m=+0.185173555 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4) Dec 6 03:44:17 localhost podman[71035]: 2025-12-06 08:44:17.679267475 +0000 UTC m=+0.209041029 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, version=17.1.12, vcs-type=git, 
com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:44:17 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:44:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:44:19 localhost systemd[1]: tmp-crun.Gf781p.mount: Deactivated successfully. 
Dec 6 03:44:19 localhost podman[71082]: 2025-12-06 08:44:19.551458138 +0000 UTC m=+0.085575520 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, 
vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:44:19 localhost podman[71082]: 2025-12-06 08:44:19.563257817 +0000 UTC m=+0.097375149 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64) Dec 6 03:44:19 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:44:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:44:21 localhost podman[71102]: 2025-12-06 08:44:21.544108528 +0000 UTC m=+0.079397883 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, vendor=Red Hat, Inc.) Dec 6 03:44:21 localhost podman[71102]: 2025-12-06 08:44:21.582221885 +0000 UTC m=+0.117511230 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, container_name=iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:44:21 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:44:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:44:24 localhost podman[71121]: 2025-12-06 08:44:24.525408681 +0000 UTC m=+0.064669994 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, name=rhosp17/openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:44:24 localhost podman[71121]: 2025-12-06 08:44:24.552176134 +0000 UTC m=+0.091437397 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:44:24 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:44:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:44:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:44:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:44:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:44:42 localhost podman[71147]: 2025-12-06 08:44:42.562391696 +0000 UTC m=+0.081794850 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:44:42 localhost systemd[1]: tmp-crun.u8DFtE.mount: Deactivated successfully. Dec 6 03:44:42 localhost podman[71147]: 2025-12-06 08:44:42.615197308 +0000 UTC m=+0.134600402 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team) Dec 6 03:44:42 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:44:42 localhost podman[71148]: 2025-12-06 08:44:42.666253318 +0000 UTC m=+0.179456460 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:44:42 localhost podman[71146]: 2025-12-06 08:44:42.616340794 +0000 UTC m=+0.140705551 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-cron, container_name=logrotate_crond, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Dec 6 03:44:42 localhost podman[71154]: 2025-12-06 08:44:42.717933465 +0000 UTC m=+0.228888958 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64) Dec 6 03:44:42 localhost podman[71148]: 2025-12-06 08:44:42.749139979 +0000 UTC m=+0.262343111 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Dec 6 03:44:42 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:44:42 localhost podman[71146]: 2025-12-06 08:44:42.802185989 +0000 UTC m=+0.326550746 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, url=https://www.redhat.com, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, distribution-scope=public) Dec 6 03:44:42 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:44:42 localhost podman[71154]: 2025-12-06 08:44:42.873561116 +0000 UTC m=+0.384516659 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Dec 6 03:44:42 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:44:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:44:44 localhost podman[71243]: 2025-12-06 08:44:44.552001095 +0000 UTC m=+0.079455378 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., tcib_managed=true) Dec 6 03:44:44 localhost podman[71243]: 2025-12-06 08:44:44.939179785 +0000 UTC m=+0.466634058 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container) Dec 6 03:44:44 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:44:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:44:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:44:48 localhost systemd[1]: tmp-crun.qJeYSS.mount: Deactivated successfully. 
Dec 6 03:44:48 localhost podman[71278]: 2025-12-06 08:44:48.559927029 +0000 UTC m=+0.086893028 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true) Dec 6 03:44:48 localhost systemd[1]: tmp-crun.kSBeBK.mount: Deactivated successfully. 
Dec 6 03:44:48 localhost podman[71278]: 2025-12-06 08:44:48.614499026 +0000 UTC m=+0.141464975 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 6 03:44:48 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:44:48 localhost podman[71279]: 2025-12-06 08:44:48.617954393 +0000 UTC m=+0.142638801 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, container_name=ovn_controller, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 6 03:44:48 localhost podman[71279]: 2025-12-06 08:44:48.702375553 +0000 UTC m=+0.227059911 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, release=1761123044, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z) Dec 6 03:44:48 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:44:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:44:50 localhost podman[71328]: 2025-12-06 08:44:50.55313038 +0000 UTC m=+0.083736320 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 
17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, container_name=collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, release=1761123044) Dec 6 03:44:50 localhost podman[71328]: 2025-12-06 08:44:50.564398928 +0000 UTC m=+0.095004878 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step3, container_name=collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z) Dec 6 03:44:50 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:44:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:44:52 localhost podman[71348]: 2025-12-06 08:44:52.542139191 +0000 UTC m=+0.071582375 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64) Dec 6 03:44:52 localhost podman[71348]: 2025-12-06 08:44:52.555113782 +0000 UTC m=+0.084556956 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:44:52 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:44:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:44:55 localhost podman[71365]: 2025-12-06 08:44:55.543626632 +0000 UTC m=+0.079215620 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5) Dec 6 03:44:55 localhost podman[71365]: 2025-12-06 08:44:55.576190709 +0000 UTC m=+0.111779707 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Dec 6 03:44:55 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:45:13 localhost podman[71396]: 2025-12-06 08:45:13.567899579 +0000 UTC m=+0.091913672 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Dec 6 03:45:13 localhost podman[71393]: 2025-12-06 08:45:13.615591973 +0000 UTC m=+0.147213061 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 6 03:45:13 localhost podman[71393]: 2025-12-06 08:45:13.623319592 +0000 UTC m=+0.154940750 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, url=https://www.redhat.com, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, distribution-scope=public, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:45:13 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:45:13 localhost podman[71395]: 2025-12-06 08:45:13.731125916 +0000 UTC m=+0.257512392 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 6 03:45:13 localhost podman[71396]: 2025-12-06 08:45:13.776435776 +0000 UTC m=+0.300449949 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:45:13 localhost systemd[1]: 
f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:45:13 localhost podman[71395]: 2025-12-06 08:45:13.789876582 +0000 UTC m=+0.316263148 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:45:13 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:45:13 localhost podman[71394]: 2025-12-06 08:45:13.776941692 +0000 UTC m=+0.303431721 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, 
io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.) Dec 6 03:45:13 localhost podman[71394]: 2025-12-06 08:45:13.860279729 +0000 UTC m=+0.386769778 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, release=1761123044, 
name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Dec 6 03:45:13 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:45:14 localhost systemd[1]: tmp-crun.u6aivv.mount: Deactivated successfully. Dec 6 03:45:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:45:15 localhost systemd[1]: tmp-crun.xeJ48X.mount: Deactivated successfully. 
Dec 6 03:45:15 localhost podman[71492]: 2025-12-06 08:45:15.559644984 +0000 UTC m=+0.091258711 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, container_name=nova_migration_target, vcs-type=git, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Dec 6 03:45:15 localhost podman[71492]: 2025-12-06 08:45:15.926214347 +0000 UTC m=+0.457828064 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 6 03:45:15 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:45:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:45:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:45:19 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Dec 6 03:45:19 localhost recover_tripleo_nova_virtqemud[71528]: 51836 Dec 6 03:45:19 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:45:19 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:45:19 localhost systemd[1]: tmp-crun.uJDTVR.mount: Deactivated successfully. Dec 6 03:45:19 localhost podman[71516]: 2025-12-06 08:45:19.536949484 +0000 UTC m=+0.071863843 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 
ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:45:19 localhost podman[71516]: 2025-12-06 08:45:19.5510541 +0000 UTC m=+0.085968449 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z) Dec 6 03:45:19 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:45:19 localhost systemd[1]: tmp-crun.FxVlx8.mount: Deactivated successfully. 
Dec 6 03:45:19 localhost podman[71515]: 2025-12-06 08:45:19.595716991 +0000 UTC m=+0.128255336 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}) Dec 6 03:45:19 localhost podman[71515]: 2025-12-06 08:45:19.659314737 +0000 UTC m=+0.191853152 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, release=1761123044, version=17.1.12, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 
17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64) Dec 6 03:45:19 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:45:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:45:21 localhost podman[71565]: 2025-12-06 08:45:21.54844351 +0000 UTC m=+0.083976727 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=) Dec 6 03:45:21 localhost podman[71565]: 2025-12-06 08:45:21.56104936 +0000 UTC m=+0.096582587 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, 
version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, tcib_managed=true, release=1761123044, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Dec 6 03:45:21 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:45:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:45:23 localhost systemd[1]: tmp-crun.0o0jOt.mount: Deactivated successfully. Dec 6 03:45:23 localhost podman[71588]: 2025-12-06 08:45:23.534677355 +0000 UTC m=+0.072398640 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, architecture=x86_64, io.buildah.version=1.41.4, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:44:13Z) Dec 6 03:45:23 localhost podman[71588]: 2025-12-06 08:45:23.569010986 +0000 UTC m=+0.106732261 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, architecture=x86_64, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 6 03:45:23 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:45:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:45:26 localhost podman[71607]: 2025-12-06 08:45:26.543024879 +0000 UTC m=+0.078505788 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 
'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:45:26 localhost podman[71607]: 2025-12-06 08:45:26.574324396 +0000 UTC m=+0.109805295 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:45:26 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:45:44 localhost systemd[1]: tmp-crun.XAFqFV.mount: Deactivated successfully. 
Dec 6 03:45:44 localhost podman[71637]: 2025-12-06 08:45:44.615446143 +0000 UTC m=+0.133700234 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:45:44 localhost podman[71634]: 2025-12-06 08:45:44.581459602 +0000 UTC m=+0.108073751 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container) Dec 6 03:45:44 localhost podman[71634]: 2025-12-06 08:45:44.660702302 +0000 UTC m=+0.187316441 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, release=1761123044, container_name=logrotate_crond, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Dec 6 03:45:44 localhost podman[71636]: 2025-12-06 08:45:44.672064973 +0000 UTC m=+0.193093990 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z) Dec 6 03:45:44 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:45:44 localhost podman[71636]: 2025-12-06 08:45:44.695225 +0000 UTC m=+0.216254007 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z) Dec 6 03:45:44 localhost podman[71635]: 2025-12-06 08:45:44.703699712 +0000 UTC m=+0.227561356 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:45:44 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:45:44 localhost podman[71635]: 2025-12-06 08:45:44.725814425 +0000 UTC m=+0.249676069 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:45:44 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:45:44 localhost podman[71637]: 2025-12-06 08:45:44.804063745 +0000 UTC m=+0.322317816 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, container_name=metrics_qdr, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1) Dec 6 03:45:44 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:45:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:45:46 localhost podman[71734]: 2025-12-06 08:45:46.545230613 +0000 UTC m=+0.078635892 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, build-date=2025-11-19T00:36:58Z, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:45:46 localhost podman[71734]: 2025-12-06 08:45:46.919262027 +0000 UTC m=+0.452667316 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_migration_target, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:45:46 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:45:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:45:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:45:50 localhost systemd[1]: tmp-crun.WkR9YZ.mount: Deactivated successfully. 
Dec 6 03:45:50 localhost podman[71771]: 2025-12-06 08:45:50.554305575 +0000 UTC m=+0.086540266 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vcs-type=git, container_name=ovn_controller, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:45:50 localhost systemd[1]: tmp-crun.hjwThO.mount: Deactivated successfully. Dec 6 03:45:50 localhost podman[71770]: 2025-12-06 08:45:50.584236271 +0000 UTC m=+0.119250428 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z) Dec 6 03:45:50 localhost podman[71771]: 2025-12-06 08:45:50.607289973 +0000 UTC m=+0.139524684 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vendor=Red Hat, Inc., 
konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z) Dec 6 03:45:50 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:45:50 localhost podman[71770]: 2025-12-06 08:45:50.641331575 +0000 UTC m=+0.176345742 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public) Dec 6 03:45:50 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:45:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:45:52 localhost podman[71816]: 2025-12-06 08:45:52.52727865 +0000 UTC m=+0.061009587 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Dec 6 03:45:52 localhost podman[71816]: 2025-12-06 08:45:52.539148357 +0000 UTC m=+0.072879314 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, architecture=x86_64) Dec 6 03:45:52 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:45:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:45:54 localhost podman[71837]: 2025-12-06 08:45:54.545468364 +0000 UTC m=+0.080933933 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid)
Dec 6 03:45:54 localhost podman[71837]: 2025-12-06 08:45:54.555424941 +0000 UTC m=+0.090890520 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 6 03:45:54 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully.
Dec 6 03:45:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.
Dec 6 03:45:57 localhost podman[71856]: 2025-12-06 08:45:57.54250225 +0000 UTC m=+0.078703194 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, config_id=tripleo_step5, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 6 03:45:57 localhost podman[71856]: 2025-12-06 08:45:57.596177049 +0000 UTC m=+0.132377983 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 6 03:45:57 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully.
Dec 6 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.
Dec 6 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.
Dec 6 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.
Dec 6 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.
Dec 6 03:46:15 localhost podman[71885]: 2025-12-06 08:46:15.551816853 +0000 UTC m=+0.072540232 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 6 03:46:15 localhost podman[71886]: 2025-12-06 08:46:15.575481156 +0000 UTC m=+0.093496462 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr)
Dec 6 03:46:15 localhost podman[71883]: 2025-12-06 08:46:15.609024023 +0000 UTC m=+0.135276853 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, tcib_managed=true, name=rhosp17/openstack-cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Dec 6 03:46:15 localhost podman[71885]: 2025-12-06 08:46:15.63773041 +0000 UTC m=+0.158453759 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, release=1761123044, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 6 03:46:15 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully.
Dec 6 03:46:15 localhost podman[71883]: 2025-12-06 08:46:15.692822483 +0000 UTC m=+0.219075393 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Dec 6 03:46:15 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully.
Dec 6 03:46:15 localhost podman[71886]: 2025-12-06 08:46:15.761501556 +0000 UTC m=+0.279516912 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, release=1761123044, distribution-scope=public)
Dec 6 03:46:15 localhost podman[71884]: 2025-12-06 08:46:15.771996751 +0000 UTC m=+0.297376105 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com)
Dec 6 03:46:15 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully.
Dec 6 03:46:15 localhost podman[71884]: 2025-12-06 08:46:15.827356682 +0000 UTC m=+0.352735976 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, release=1761123044, container_name=ceilometer_agent_ipmi, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 6 03:46:15 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully.
Dec 6 03:46:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.
Dec 6 03:46:17 localhost systemd[1]: tmp-crun.EOYNfh.mount: Deactivated successfully.
Dec 6 03:46:17 localhost podman[71985]: 2025-12-06 08:46:17.554984802 +0000 UTC m=+0.081859112 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 6 03:46:17 localhost podman[71985]: 2025-12-06 08:46:17.921340928 +0000 UTC m=+0.448215208 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.12, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044)
Dec 6 03:46:17 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully.
Dec 6 03:46:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.
Dec 6 03:46:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.
Dec 6 03:46:21 localhost systemd[1]: tmp-crun.pKqU1E.mount: Deactivated successfully.
Dec 6 03:46:21 localhost podman[72009]: 2025-12-06 08:46:21.567136159 +0000 UTC m=+0.094889024 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true) Dec 6 03:46:21 localhost podman[72010]: 2025-12-06 08:46:21.621471179 +0000 UTC m=+0.143784186 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team) Dec 6 03:46:21 localhost podman[72009]: 2025-12-06 08:46:21.626012169 +0000 UTC m=+0.153765034 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044) Dec 6 03:46:21 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:46:21 localhost podman[72010]: 2025-12-06 08:46:21.667777071 +0000 UTC m=+0.190090058 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=ovn_controller, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container) Dec 6 03:46:21 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:46:22 localhost systemd[1]: tmp-crun.22WBZc.mount: Deactivated successfully. Dec 6 03:46:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:46:22 localhost systemd[1]: tmp-crun.CL9RPg.mount: Deactivated successfully. Dec 6 03:46:22 localhost podman[72056]: 2025-12-06 08:46:22.654115924 +0000 UTC m=+0.070692777 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-collectd, vcs-type=git) Dec 6 03:46:22 localhost podman[72056]: 2025-12-06 08:46:22.661384588 +0000 UTC m=+0.077961441 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:46:22 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:46:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:46:25 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:46:25 localhost recover_tripleo_nova_virtqemud[72077]: 51836 Dec 6 03:46:25 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:46:25 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 6 03:46:25 localhost podman[72075]: 2025-12-06 08:46:25.550582798 +0000 UTC m=+0.077904459 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z) Dec 6 03:46:25 localhost podman[72075]: 2025-12-06 08:46:25.56133174 +0000 UTC m=+0.088653401 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 6 03:46:25 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:46:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:46:28 localhost systemd[1]: tmp-crun.rJpudi.mount: Deactivated successfully. 
Dec 6 03:46:28 localhost podman[72096]: 2025-12-06 08:46:28.552055319 +0000 UTC m=+0.082217832 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, name=rhosp17/openstack-nova-compute, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step5, vendor=Red Hat, Inc., version=17.1.12, container_name=nova_compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Dec 6 03:46:28 localhost podman[72096]: 2025-12-06 08:46:28.597524914 +0000 UTC m=+0.127687407 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:46:28 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:46:46 localhost systemd[1]: tmp-crun.S9Gg2r.mount: Deactivated successfully. 
Dec 6 03:46:46 localhost podman[72124]: 2025-12-06 08:46:46.571113627 +0000 UTC m=+0.094843113 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, vcs-type=git, name=rhosp17/openstack-cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, version=17.1.12, container_name=logrotate_crond, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Dec 6 03:46:46 localhost podman[72124]: 2025-12-06 08:46:46.605750149 +0000 UTC m=+0.129479615 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12) Dec 6 03:46:46 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:46:46 localhost podman[72126]: 2025-12-06 08:46:46.626125968 +0000 UTC m=+0.147506371 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container) Dec 6 03:46:46 localhost podman[72126]: 2025-12-06 08:46:46.659227762 +0000 UTC m=+0.180608205 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z) Dec 6 03:46:46 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:46:46 localhost podman[72125]: 2025-12-06 08:46:46.669986004 +0000 UTC m=+0.190699077 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true) Dec 6 03:46:46 localhost podman[72127]: 2025-12-06 08:46:46.589381592 +0000 UTC m=+0.103436909 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc.) 
Dec 6 03:46:46 localhost podman[72125]: 2025-12-06 08:46:46.696176214 +0000 UTC m=+0.216889267 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true) Dec 6 03:46:46 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:46:46 localhost podman[72127]: 2025-12-06 08:46:46.795465803 +0000 UTC m=+0.309521180 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 
'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Dec 6 03:46:46 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:46:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:46:48 localhost systemd[1]: tmp-crun.CAvm4s.mount: Deactivated successfully. 
Dec 6 03:46:48 localhost podman[72224]: 2025-12-06 08:46:48.562943075 +0000 UTC m=+0.095383550 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public, name=rhosp17/openstack-nova-compute, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Dec 6 03:46:48 localhost podman[72224]: 2025-12-06 08:46:48.961090754 +0000 UTC m=+0.493531239 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc.) Dec 6 03:46:48 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:46:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:46:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:46:52 localhost podman[72260]: 2025-12-06 08:46:52.548749478 +0000 UTC m=+0.083396279 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:46:52 localhost systemd[1]: tmp-crun.6oe20x.mount: Deactivated successfully. 
Dec 6 03:46:52 localhost podman[72260]: 2025-12-06 08:46:52.616994647 +0000 UTC m=+0.151641448 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:46:52 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:46:52 localhost podman[72261]: 2025-12-06 08:46:52.618877076 +0000 UTC m=+0.150819844 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', 
'/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 03:46:52 localhost podman[72261]: 2025-12-06 08:46:52.699240441 +0000 UTC m=+0.231183219 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:46:52 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:46:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:46:52 localhost podman[72308]: 2025-12-06 08:46:52.836730761 +0000 UTC m=+0.089569090 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, maintainer=OpenStack TripleO 
Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Dec 6 03:46:52 localhost podman[72308]: 2025-12-06 08:46:52.869655639 +0000 UTC m=+0.122493888 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container) Dec 6 03:46:52 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:46:53 localhost systemd[1]: tmp-crun.2tJxiI.mount: Deactivated successfully. 
Dec 6 03:46:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:46:56 localhost podman[72328]: 2025-12-06 08:46:56.546968543 +0000 UTC m=+0.078564390 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid) Dec 6 03:46:56 localhost podman[72328]: 2025-12-06 08:46:56.559614694 +0000 UTC m=+0.091210571 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, url=https://www.redhat.com) Dec 6 03:46:56 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:46:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:46:59 localhost podman[72346]: 2025-12-06 08:46:59.553030087 +0000 UTC m=+0.086407263 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, version=17.1.12, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.) 
Dec 6 03:46:59 localhost podman[72346]: 2025-12-06 08:46:59.585305905 +0000 UTC m=+0.118683081 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:46:59 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:47:17 localhost systemd[1]: tmp-crun.SXPUFe.mount: Deactivated successfully. Dec 6 03:47:17 localhost podman[72371]: 2025-12-06 08:47:17.573530574 +0000 UTC m=+0.104955005 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=logrotate_crond, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z) Dec 6 03:47:17 localhost podman[72371]: 2025-12-06 08:47:17.57955257 +0000 UTC m=+0.110977061 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, batch=17.1_20251118.1) Dec 6 03:47:17 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:47:17 localhost podman[72378]: 2025-12-06 08:47:17.667305393 +0000 UTC m=+0.189483428 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git) Dec 6 03:47:17 localhost podman[72373]: 2025-12-06 08:47:17.722774558 +0000 UTC m=+0.246880703 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Dec 6 03:47:17 localhost podman[72373]: 2025-12-06 08:47:17.76132043 +0000 UTC m=+0.285426565 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:47:17 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:47:17 localhost podman[72372]: 2025-12-06 08:47:17.76873777 +0000 UTC m=+0.295644792 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step4) Dec 6 03:47:17 localhost podman[72372]: 2025-12-06 08:47:17.849483466 +0000 UTC m=+0.376390498 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044) Dec 6 03:47:17 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:47:17 localhost podman[72378]: 2025-12-06 08:47:17.885280822 +0000 UTC m=+0.407458837 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:47:17 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:47:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:47:19 localhost systemd[1]: tmp-crun.aPivY0.mount: Deactivated successfully. 
Dec 6 03:47:19 localhost podman[72473]: 2025-12-06 08:47:19.554356012 +0000 UTC m=+0.089247920 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, release=1761123044, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4) Dec 6 03:47:19 localhost podman[72473]: 2025-12-06 08:47:19.930225442 +0000 UTC m=+0.465117320 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:47:19 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:47:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:47:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:47:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:47:23 localhost podman[72497]: 2025-12-06 08:47:23.554342493 +0000 UTC m=+0.084593676 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, tcib_managed=true, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:47:23 localhost podman[72497]: 2025-12-06 08:47:23.591157811 +0000 UTC m=+0.121409024 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git) Dec 6 03:47:23 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:47:23 localhost podman[72499]: 2025-12-06 08:47:23.621071307 +0000 UTC m=+0.145920793 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 03:47:23 localhost podman[72498]: 2025-12-06 08:47:23.573990051 +0000 UTC m=+0.100900201 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, config_id=tripleo_step4, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 6 03:47:23 localhost podman[72498]: 2025-12-06 08:47:23.655122479 +0000 UTC m=+0.182032539 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true) Dec 6 03:47:23 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:47:23 localhost podman[72499]: 2025-12-06 08:47:23.668624657 +0000 UTC m=+0.193474103 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container) Dec 6 03:47:23 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:47:24 localhost systemd[1]: tmp-crun.WjdNcu.mount: Deactivated successfully. Dec 6 03:47:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:47:27 localhost podman[72563]: 2025-12-06 08:47:27.516451654 +0000 UTC m=+0.058559142 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-iscsid-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, batch=17.1_20251118.1, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team) Dec 6 03:47:27 localhost podman[72563]: 2025-12-06 08:47:27.55123677 +0000 UTC m=+0.093344208 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 
17.1 iscsid, release=1761123044) Dec 6 03:47:27 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:47:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:47:30 localhost podman[72582]: 2025-12-06 08:47:30.548827322 +0000 UTC m=+0.080636904 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:47:30 localhost podman[72582]: 2025-12-06 08:47:30.604364549 +0000 UTC m=+0.136174071 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 
nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, config_id=tripleo_step5, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:47:30 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:47:48 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:47:48 localhost recover_tripleo_nova_virtqemud[72629]: 51836 Dec 6 03:47:48 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Dec 6 03:47:48 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:47:48 localhost podman[72611]: 2025-12-06 08:47:48.562011135 +0000 UTC m=+0.082257385 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute) Dec 6 03:47:48 localhost systemd[1]: tmp-crun.MC8yrS.mount: Deactivated successfully. Dec 6 03:47:48 localhost podman[72610]: 2025-12-06 08:47:48.62364237 +0000 UTC m=+0.150466463 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:47:48 localhost podman[72617]: 2025-12-06 08:47:48.587221775 +0000 UTC m=+0.105847484 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, vcs-type=git, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T22:49:46Z, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:47:48 localhost podman[72611]: 2025-12-06 08:47:48.651262034 +0000 UTC m=+0.171508224 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=) Dec 6 03:47:48 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:47:48 localhost podman[72609]: 2025-12-06 08:47:48.66699726 +0000 UTC m=+0.197382582 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, tcib_managed=true) Dec 6 03:47:48 localhost podman[72609]: 2025-12-06 08:47:48.67377507 +0000 UTC m=+0.204160372 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, release=1761123044, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public) Dec 6 03:47:48 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:47:48 localhost podman[72610]: 2025-12-06 08:47:48.707216754 +0000 UTC m=+0.234040797 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true) Dec 6 03:47:48 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:47:48 localhost podman[72617]: 2025-12-06 08:47:48.76626618 +0000 UTC m=+0.284891899 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 
qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, container_name=metrics_qdr, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, vcs-type=git) Dec 6 03:47:48 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:47:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:47:50 localhost podman[72711]: 2025-12-06 08:47:50.545469804 +0000 UTC m=+0.077373734 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:47:50 localhost podman[72711]: 2025-12-06 08:47:50.922271692 +0000 UTC m=+0.454175562 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, release=1761123044, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:47:50 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:47:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:47:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:47:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:47:54 localhost podman[72746]: 2025-12-06 08:47:54.557344033 +0000 UTC m=+0.079553350 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step3, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git) Dec 6 03:47:54 localhost podman[72746]: 2025-12-06 08:47:54.569153448 +0000 UTC m=+0.091362735 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:47:54 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:47:54 localhost systemd[1]: tmp-crun.6gA0M6.mount: Deactivated successfully. 
Dec 6 03:47:54 localhost systemd[1]: tmp-crun.i9ac8F.mount: Deactivated successfully. Dec 6 03:47:54 localhost podman[72747]: 2025-12-06 08:47:54.722068866 +0000 UTC m=+0.241544468 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, release=1761123044) Dec 6 03:47:54 localhost podman[72748]: 2025-12-06 08:47:54.686442744 +0000 UTC m=+0.201834450 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044) Dec 6 03:47:54 localhost podman[72748]: 2025-12-06 08:47:54.765968713 +0000 UTC m=+0.281360389 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., container_name=ovn_controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, 
maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 03:47:54 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:47:54 localhost podman[72747]: 2025-12-06 08:47:54.798455517 +0000 UTC m=+0.317931179 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Dec 6 03:47:54 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:47:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:47:58 localhost podman[72815]: 2025-12-06 08:47:58.551040131 +0000 UTC m=+0.084885646 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:47:58 localhost podman[72815]: 2025-12-06 08:47:58.560173933 +0000 UTC m=+0.094019448 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid) Dec 6 03:47:58 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:48:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:48:01 localhost podman[72834]: 2025-12-06 08:48:01.585579904 +0000 UTC m=+0.114310905 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, io.buildah.version=1.41.4, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:48:01 localhost podman[72834]: 2025-12-06 08:48:01.616242182 +0000 UTC m=+0.144973133 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 
'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team) Dec 6 03:48:01 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:48:19 localhost podman[72860]: 2025-12-06 08:48:19.533447997 +0000 UTC m=+0.070238533 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, version=17.1.12, container_name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 6 03:48:19 localhost podman[72861]: 2025-12-06 08:48:19.544815319 +0000 UTC m=+0.076601980 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, container_name=ceilometer_agent_ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true) Dec 6 03:48:19 localhost podman[72860]: 2025-12-06 08:48:19.571220085 +0000 UTC m=+0.108010551 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, name=rhosp17/openstack-cron) Dec 6 03:48:19 localhost systemd[1]: 
23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:48:19 localhost podman[72864]: 2025-12-06 08:48:19.58337378 +0000 UTC m=+0.113799418 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:48:19 localhost podman[72861]: 2025-12-06 08:48:19.593250216 +0000 UTC m=+0.125036837 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, url=https://www.redhat.com) Dec 6 03:48:19 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:48:19 localhost podman[72862]: 2025-12-06 08:48:19.65677192 +0000 UTC m=+0.186126866 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, url=https://www.redhat.com, 
vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:48:19 localhost podman[72862]: 2025-12-06 08:48:19.675264881 +0000 UTC m=+0.204619817 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team) Dec 6 03:48:19 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:48:19 localhost podman[72864]: 2025-12-06 08:48:19.827826118 +0000 UTC m=+0.358251746 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, release=1761123044, 
com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:48:19 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:48:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:48:21 localhost podman[72960]: 2025-12-06 08:48:21.548323898 +0000 UTC m=+0.084003168 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc.) Dec 6 03:48:21 localhost podman[72960]: 2025-12-06 08:48:21.935888569 +0000 UTC m=+0.471567819 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:48:21 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:48:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:48:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:48:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:48:25 localhost podman[72982]: 2025-12-06 08:48:25.553242913 +0000 UTC m=+0.087601549 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, 
build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, container_name=collectd, tcib_managed=true, vcs-type=git) Dec 6 03:48:25 localhost podman[72982]: 2025-12-06 08:48:25.592420754 +0000 UTC m=+0.126779370 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible) Dec 6 03:48:25 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:48:25 localhost systemd[1]: tmp-crun.xWjimT.mount: Deactivated successfully. 
Dec 6 03:48:25 localhost podman[72984]: 2025-12-06 08:48:25.657236768 +0000 UTC m=+0.186792117 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public) Dec 6 03:48:25 localhost podman[72983]: 2025-12-06 08:48:25.628968354 +0000 UTC m=+0.159210543 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, distribution-scope=public) Dec 6 03:48:25 localhost podman[72984]: 2025-12-06 08:48:25.70613868 +0000 UTC m=+0.235694028 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) 
Dec 6 03:48:25 localhost podman[72983]: 2025-12-06 08:48:25.715322434 +0000 UTC m=+0.245564583 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:48:25 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:48:25 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:48:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:48:29 localhost podman[73047]: 2025-12-06 08:48:29.547784526 +0000 UTC m=+0.083253724 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044) Dec 6 03:48:29 localhost podman[73047]: 2025-12-06 08:48:29.556191376 +0000 UTC m=+0.091660574 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Dec 6 03:48:29 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:48:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:48:32 localhost podman[73067]: 2025-12-06 08:48:32.53505976 +0000 UTC m=+0.070402338 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, vcs-type=git, container_name=nova_compute, version=17.1.12, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com) Dec 6 03:48:32 localhost podman[73067]: 2025-12-06 08:48:32.567228314 +0000 UTC m=+0.102570882 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step5, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:48:32 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:48:50 localhost systemd[1]: tmp-crun.74qZmy.mount: Deactivated successfully. 
Dec 6 03:48:50 localhost podman[73093]: 2025-12-06 08:48:50.554380573 +0000 UTC m=+0.079842820 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, container_name=logrotate_crond, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1) Dec 6 03:48:50 localhost podman[73093]: 2025-12-06 08:48:50.593569974 +0000 UTC m=+0.119032241 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, release=1761123044, version=17.1.12, container_name=logrotate_crond, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 6 03:48:50 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:48:50 localhost systemd[1]: tmp-crun.kcNBuv.mount: Deactivated successfully. 
Dec 6 03:48:50 localhost podman[73095]: 2025-12-06 08:48:50.676033653 +0000 UTC m=+0.192285165 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute) Dec 6 03:48:50 localhost podman[73101]: 2025-12-06 08:48:50.633904111 +0000 UTC m=+0.145461398 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible) Dec 6 03:48:50 localhost podman[73094]: 2025-12-06 08:48:50.717372762 +0000 UTC m=+0.235775230 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:48:50 localhost podman[73095]: 2025-12-06 08:48:50.739328531 +0000 UTC m=+0.255580053 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=) Dec 6 03:48:50 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:48:50 localhost podman[73094]: 2025-12-06 08:48:50.80466582 +0000 UTC m=+0.323068248 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, tcib_managed=true, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Dec 6 03:48:50 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:48:50 localhost podman[73101]: 2025-12-06 08:48:50.830272742 +0000 UTC m=+0.341830069 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, config_id=tripleo_step1, distribution-scope=public, container_name=metrics_qdr, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:48:50 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:48:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:48:52 localhost podman[73195]: 2025-12-06 08:48:52.545780397 +0000 UTC m=+0.081475179 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Dec 6 03:48:52 localhost podman[73195]: 2025-12-06 08:48:52.915185618 +0000 UTC m=+0.450880340 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, release=1761123044, vcs-type=git, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com) Dec 6 03:48:52 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:48:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:48:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:48:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:48:56 localhost podman[73220]: 2025-12-06 08:48:56.553657246 +0000 UTC m=+0.078097516 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Dec 6 03:48:56 localhost podman[73220]: 2025-12-06 08:48:56.596211423 +0000 UTC m=+0.120651683 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, container_name=ovn_metadata_agent, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1) Dec 6 03:48:56 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:48:56 localhost podman[73219]: 2025-12-06 08:48:56.615932903 +0000 UTC m=+0.141631773 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, container_name=collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:48:56 localhost podman[73219]: 2025-12-06 08:48:56.624388145 +0000 UTC m=+0.150087055 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:48:56 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:48:56 localhost podman[73221]: 2025-12-06 08:48:56.719722624 +0000 UTC m=+0.237459427 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container) Dec 6 03:48:56 localhost podman[73221]: 2025-12-06 08:48:56.769336739 +0000 UTC m=+0.287073532 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, 
release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:48:56 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:49:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:49:00 localhost podman[73287]: 2025-12-06 08:49:00.550520911 +0000 UTC m=+0.085092053 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.buildah.version=1.41.4, container_name=iscsid, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true) Dec 6 03:49:00 localhost podman[73287]: 2025-12-06 08:49:00.557874839 +0000 UTC m=+0.092445941 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com) Dec 6 03:49:00 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:49:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:49:03 localhost podman[73306]: 2025-12-06 08:49:03.551997091 +0000 UTC m=+0.083102182 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_compute, release=1761123044, com.redhat.component=openstack-nova-compute-container) Dec 6 03:49:03 localhost podman[73306]: 2025-12-06 08:49:03.586448587 +0000 UTC m=+0.117553678 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, 
batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64) Dec 6 03:49:03 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:49:21 localhost systemd[1]: tmp-crun.0Fit14.mount: Deactivated successfully. 
Dec 6 03:49:21 localhost podman[73333]: 2025-12-06 08:49:21.601295533 +0000 UTC m=+0.132224803 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc.) Dec 6 03:49:21 localhost podman[73334]: 2025-12-06 08:49:21.562489912 +0000 UTC m=+0.087464928 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044) Dec 6 03:49:21 localhost podman[73336]: 2025-12-06 08:49:21.580315723 +0000 UTC m=+0.101338556 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, release=1761123044, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z) Dec 6 03:49:21 localhost podman[73333]: 2025-12-06 08:49:21.646489461 +0000 UTC m=+0.177418721 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4) Dec 6 03:49:21 localhost podman[73334]: 2025-12-06 08:49:21.646305315 +0000 UTC m=+0.171280301 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., 
com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:49:21 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:49:21 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:49:21 localhost podman[73332]: 2025-12-06 08:49:21.7014329 +0000 UTC m=+0.234167265 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container) Dec 6 03:49:21 localhost podman[73332]: 2025-12-06 08:49:21.781707914 +0000 UTC m=+0.314442309 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team) Dec 6 03:49:21 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:49:21 localhost podman[73336]: 2025-12-06 08:49:21.81035034 +0000 UTC m=+0.331373213 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044) Dec 6 03:49:21 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:49:22 localhost systemd[1]: tmp-crun.7JPaPc.mount: Deactivated successfully. Dec 6 03:49:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:49:23 localhost podman[73433]: 2025-12-06 08:49:23.55092361 +0000 UTC m=+0.085925779 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12) Dec 6 03:49:23 localhost podman[73433]: 2025-12-06 08:49:23.939927755 +0000 UTC m=+0.474929964 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:36:58Z) Dec 6 03:49:23 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:49:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:49:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:49:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:49:27 localhost systemd[1]: tmp-crun.LVkZbU.mount: Deactivated successfully. 
Dec 6 03:49:27 localhost podman[73457]: 2025-12-06 08:49:27.569107426 +0000 UTC m=+0.091975867 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=) Dec 6 03:49:27 localhost podman[73458]: 2025-12-06 08:49:27.623950052 +0000 UTC m=+0.139794085 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, version=17.1.12, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:49:27 localhost podman[73457]: 2025-12-06 08:49:27.634160058 +0000 UTC m=+0.157028439 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:49:27 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:49:27 localhost podman[73458]: 2025-12-06 08:49:27.652107403 +0000 UTC m=+0.167951426 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, tcib_managed=true, name=rhosp17/openstack-ovn-controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 6 03:49:27 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:49:27 localhost podman[73456]: 2025-12-06 08:49:27.710160709 +0000 UTC m=+0.236211308 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, version=17.1.12, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1) Dec 6 03:49:27 localhost podman[73456]: 2025-12-06 08:49:27.719269671 +0000 UTC m=+0.245320270 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, 
description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, tcib_managed=true, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1) Dec 6 03:49:27 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:49:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:49:31 localhost podman[73523]: 2025-12-06 08:49:31.549274655 +0000 UTC m=+0.080807931 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, config_id=tripleo_step3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:49:31 localhost podman[73523]: 2025-12-06 08:49:31.587382613 +0000 UTC m=+0.118915889 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, 
name=rhosp17/openstack-iscsid, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:49:31 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:49:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:49:34 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:49:34 localhost recover_tripleo_nova_virtqemud[73549]: 51836 Dec 6 03:49:34 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:49:34 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:49:34 localhost podman[73542]: 2025-12-06 08:49:34.542812988 +0000 UTC m=+0.079166740 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:49:34 localhost podman[73542]: 2025-12-06 08:49:34.573369904 +0000 UTC m=+0.109723696 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, 
build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, container_name=nova_compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12) Dec 6 03:49:34 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:49:52 localhost podman[73574]: 2025-12-06 08:49:52.554730224 +0000 UTC m=+0.076201829 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd) Dec 6 03:49:52 localhost systemd[1]: tmp-crun.XgeqqL.mount: Deactivated successfully. Dec 6 03:49:52 localhost podman[73572]: 2025-12-06 08:49:52.602724958 +0000 UTC m=+0.125545845 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Dec 6 03:49:52 localhost podman[73571]: 2025-12-06 08:49:52.66452869 +0000 UTC m=+0.190229746 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 
17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:49:52 localhost podman[73572]: 2025-12-06 08:49:52.676545452 +0000 UTC m=+0.199366429 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 6 03:49:52 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:49:52 localhost podman[73571]: 2025-12-06 08:49:52.696510519 +0000 UTC m=+0.222211555 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Dec 6 03:49:52 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:49:52 localhost podman[73573]: 2025-12-06 08:49:52.748682764 +0000 UTC m=+0.268563560 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.openshift.expose-services=) Dec 6 03:49:52 localhost podman[73574]: 2025-12-06 08:49:52.755944739 +0000 UTC m=+0.277416354 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc.) Dec 6 03:49:52 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:49:52 localhost podman[73573]: 2025-12-06 08:49:52.816744709 +0000 UTC m=+0.336625495 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, 
version=17.1.12, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute) Dec 6 03:49:52 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:49:53 localhost systemd[1]: tmp-crun.ScmQnL.mount: Deactivated successfully. Dec 6 03:49:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:49:54 localhost podman[73666]: 2025-12-06 08:49:54.548124555 +0000 UTC m=+0.083769362 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4) Dec 6 03:49:54 localhost podman[73666]: 2025-12-06 08:49:54.92160719 +0000 UTC m=+0.457252027 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-nova-compute, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4) Dec 6 03:49:54 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:49:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:49:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:49:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:49:58 localhost podman[73690]: 2025-12-06 08:49:58.551068798 +0000 UTC m=+0.081271325 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, version=17.1.12) Dec 6 03:49:58 localhost podman[73690]: 2025-12-06 08:49:58.597561087 +0000 UTC m=+0.127763634 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, release=1761123044) Dec 6 03:49:58 localhost systemd[1]: tmp-crun.LG6VH6.mount: Deactivated successfully. Dec 6 03:49:58 localhost podman[73689]: 2025-12-06 08:49:58.617924357 +0000 UTC m=+0.151130117 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, release=1761123044, version=17.1.12) Dec 6 03:49:58 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:49:58 localhost podman[73691]: 2025-12-06 08:49:58.662991761 +0000 UTC m=+0.188510073 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, vcs-type=git, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', 
'/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 03:49:58 localhost podman[73689]: 2025-12-06 08:49:58.68559136 +0000 UTC m=+0.218797150 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=) Dec 6 03:49:58 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:49:58 localhost podman[73691]: 2025-12-06 08:49:58.715265828 +0000 UTC m=+0.240784110 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:49:58 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:50:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:50:02 localhost systemd[1]: tmp-crun.m9IPh7.mount: Deactivated successfully. Dec 6 03:50:02 localhost podman[73756]: 2025-12-06 08:50:02.541840085 +0000 UTC m=+0.076929402 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git) Dec 6 03:50:02 localhost podman[73756]: 2025-12-06 08:50:02.575249918 +0000 UTC m=+0.110339265 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-iscsid, container_name=iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 6 
03:50:02 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:50:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:50:05 localhost podman[73775]: 2025-12-06 08:50:05.538602709 +0000 UTC m=+0.074427334 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, container_name=nova_compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public) Dec 6 03:50:05 localhost podman[73775]: 2025-12-06 08:50:05.567320397 +0000 UTC m=+0.103145052 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 
17.1_20251118.1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git) Dec 6 03:50:05 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:50:23 localhost systemd[1]: tmp-crun.MaYAO3.mount: Deactivated successfully. 
Dec 6 03:50:23 localhost podman[73802]: 2025-12-06 08:50:23.567096057 +0000 UTC m=+0.094326539 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=ceilometer_agent_ipmi, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:50:23 localhost podman[73802]: 2025-12-06 08:50:23.59307443 +0000 UTC m=+0.120304912 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Dec 6 03:50:23 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:50:23 localhost podman[73803]: 2025-12-06 08:50:23.540170344 +0000 UTC m=+0.067930593 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12, release=1761123044, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container) Dec 6 03:50:23 localhost podman[73801]: 2025-12-06 08:50:23.653553462 +0000 UTC m=+0.184493579 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com) Dec 6 03:50:23 localhost podman[73804]: 2025-12-06 08:50:23.670937499 +0000 UTC m=+0.192406143 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, 
name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:50:23 localhost podman[73803]: 2025-12-06 08:50:23.675422939 +0000 UTC m=+0.203183258 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ceilometer_agent_compute, distribution-scope=public, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:50:23 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:50:23 localhost podman[73801]: 2025-12-06 08:50:23.688535054 +0000 UTC m=+0.219475101 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044) Dec 6 03:50:23 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:50:23 localhost podman[73804]: 2025-12-06 08:50:23.894355162 +0000 UTC m=+0.415823786 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:50:23 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:50:24 localhost systemd[1]: tmp-crun.WpKLvb.mount: Deactivated successfully. Dec 6 03:50:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:50:25 localhost systemd[1]: tmp-crun.IAP0R8.mount: Deactivated successfully. 
Dec 6 03:50:25 localhost podman[73900]: 2025-12-06 08:50:25.556934808 +0000 UTC m=+0.091621315 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, 
distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=nova_migration_target, release=1761123044, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) Dec 6 03:50:25 localhost podman[73900]: 2025-12-06 08:50:25.947577294 +0000 UTC m=+0.482263851 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, vcs-type=git, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=) Dec 6 03:50:25 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:50:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:50:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:50:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:50:29 localhost systemd[1]: tmp-crun.0PMDgx.mount: Deactivated successfully. 
Dec 6 03:50:29 localhost podman[73925]: 2025-12-06 08:50:29.57142631 +0000 UTC m=+0.088587182 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', 
'/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Dec 6 03:50:29 localhost systemd[1]: tmp-crun.Lo0bB7.mount: Deactivated successfully. Dec 6 03:50:29 localhost podman[73924]: 2025-12-06 08:50:29.628369831 +0000 UTC m=+0.149357431 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Dec 6 03:50:29 localhost podman[73923]: 2025-12-06 08:50:29.660810054 +0000 UTC m=+0.182489096 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, 
name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step3, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Dec 6 03:50:29 localhost podman[73923]: 2025-12-06 08:50:29.667904144 +0000 UTC m=+0.189583186 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step3, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, build-date=2025-11-18T22:51:28Z) Dec 6 03:50:29 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:50:29 localhost podman[73925]: 2025-12-06 08:50:29.69589088 +0000 UTC m=+0.213051772 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Dec 6 03:50:29 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:50:29 localhost podman[73924]: 2025-12-06 08:50:29.714769624 +0000 UTC m=+0.235757164 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:50:29 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:50:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:50:33 localhost systemd[1]: tmp-crun.y3sJRs.mount: Deactivated successfully. Dec 6 03:50:33 localhost podman[73990]: 2025-12-06 08:50:33.547182172 +0000 UTC m=+0.079444400 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1761123044, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible) Dec 6 03:50:33 localhost podman[73990]: 2025-12-06 08:50:33.55715688 +0000 UTC m=+0.089419038 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:50:33 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:50:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:50:36 localhost podman[74009]: 2025-12-06 08:50:36.549608039 +0000 UTC m=+0.085890747 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container) Dec 6 03:50:36 localhost podman[74009]: 2025-12-06 08:50:36.577262775 +0000 UTC m=+0.113545543 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, 
url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:50:36 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:50:48 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:50:48 localhost recover_tripleo_nova_virtqemud[74036]: 51836 Dec 6 03:50:48 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:50:48 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:50:54 localhost systemd[1]: tmp-crun.95gbsy.mount: Deactivated successfully. 
Dec 6 03:50:54 localhost podman[74037]: 2025-12-06 08:50:54.564173278 +0000 UTC m=+0.091998728 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, 
container_name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron) Dec 6 03:50:54 localhost podman[74037]: 2025-12-06 08:50:54.597312903 +0000 UTC m=+0.125138273 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-cron, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:50:54 localhost systemd[1]: tmp-crun.KLzBRR.mount: Deactivated successfully. 
Dec 6 03:50:54 localhost podman[74039]: 2025-12-06 08:50:54.610628535 +0000 UTC m=+0.133470250 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Dec 6 03:50:54 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:50:54 localhost podman[74040]: 2025-12-06 08:50:54.665310987 +0000 UTC m=+0.183880820 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:50:54 localhost podman[74039]: 2025-12-06 08:50:54.717507252 +0000 UTC m=+0.240348887 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12) Dec 6 03:50:54 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:50:54 localhost podman[74038]: 2025-12-06 08:50:54.720476524 +0000 UTC m=+0.246959592 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:50:54 localhost podman[74038]: 2025-12-06 08:50:54.806349911 +0000 UTC m=+0.332833029 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:50:54 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:50:54 localhost podman[74040]: 2025-12-06 08:50:54.875229301 +0000 UTC m=+0.393799184 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}) Dec 6 03:50:54 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:50:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:50:56 localhost podman[74139]: 2025-12-06 08:50:56.555385201 +0000 UTC m=+0.082968188 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:50:56 localhost podman[74139]: 2025-12-06 08:50:56.973306901 +0000 UTC m=+0.500889878 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Dec 6 03:50:56 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:51:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:51:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:51:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:51:00 localhost podman[74163]: 2025-12-06 08:51:00.549795502 +0000 UTC m=+0.084546787 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 
collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step3, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd) Dec 6 03:51:00 localhost podman[74163]: 2025-12-06 08:51:00.586157427 +0000 UTC m=+0.120908642 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64) Dec 6 03:51:00 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:51:00 localhost podman[74165]: 2025-12-06 08:51:00.605047771 +0000 UTC m=+0.131049625 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, tcib_managed=true, 
release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Dec 6 03:51:00 localhost podman[74164]: 2025-12-06 08:51:00.654006396 +0000 UTC m=+0.183024873 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, container_name=ovn_metadata_agent, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:51:00 localhost podman[74165]: 2025-12-06 08:51:00.682444285 +0000 UTC m=+0.208446069 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, 
com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller) Dec 6 03:51:00 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:51:00 localhost podman[74164]: 2025-12-06 08:51:00.724151076 +0000 UTC m=+0.253169583 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 6 03:51:00 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:51:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:51:04 localhost podman[74230]: 2025-12-06 08:51:04.541948232 +0000 UTC m=+0.077245631 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, url=https://www.redhat.com, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack 
Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container) Dec 6 03:51:04 localhost podman[74230]: 2025-12-06 08:51:04.554141069 +0000 UTC m=+0.089438448 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Dec 6 03:51:04 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:51:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:51:07 localhost systemd[1]: tmp-crun.SvQsAE.mount: Deactivated successfully. 
Dec 6 03:51:07 localhost podman[74249]: 2025-12-06 08:51:07.553614928 +0000 UTC m=+0.085549958 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=nova_compute, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step5, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:51:07 localhost podman[74249]: 2025-12-06 08:51:07.588974211 +0000 UTC m=+0.120909221 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:51:07 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:51:25 localhost systemd[1]: tmp-crun.1JXid7.mount: Deactivated successfully. 
Dec 6 03:51:25 localhost podman[74278]: 2025-12-06 08:51:25.577835607 +0000 UTC m=+0.099301194 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git) Dec 6 03:51:25 localhost podman[74277]: 2025-12-06 08:51:25.62060081 +0000 UTC m=+0.144440489 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Dec 6 03:51:25 localhost podman[74278]: 2025-12-06 08:51:25.638258076 +0000 UTC m=+0.159723653 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true) Dec 6 03:51:25 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:51:25 localhost podman[74277]: 2025-12-06 08:51:25.659422531 +0000 UTC m=+0.183262240 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, version=17.1.12, distribution-scope=public, vcs-type=git, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, 
url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=) Dec 6 03:51:25 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:51:25 localhost podman[74283]: 2025-12-06 08:51:25.739560521 +0000 UTC m=+0.255547378 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:51:25 localhost podman[74276]: 2025-12-06 08:51:25.785889173 +0000 UTC m=+0.313305844 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, vcs-type=git, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T22:49:32Z) Dec 6 03:51:25 localhost podman[74276]: 2025-12-06 08:51:25.797282286 +0000 UTC m=+0.324698907 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, container_name=logrotate_crond, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, version=17.1.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 6 03:51:25 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:51:25 localhost podman[74283]: 2025-12-06 08:51:25.942350004 +0000 UTC m=+0.458336811 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:51:25 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:51:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:51:27 localhost systemd[1]: tmp-crun.eOaI6C.mount: Deactivated successfully. 
Dec 6 03:51:27 localhost podman[74378]: 2025-12-06 08:51:27.541267861 +0000 UTC m=+0.076486288 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, release=1761123044, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1) Dec 6 03:51:27 localhost podman[74378]: 2025-12-06 08:51:27.913738335 +0000 UTC m=+0.448956742 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, release=1761123044, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 6 03:51:27 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:51:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:51:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:51:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:51:31 localhost podman[74401]: 2025-12-06 08:51:31.559540299 +0000 UTC m=+0.093174973 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:51:31 localhost podman[74402]: 2025-12-06 08:51:31.604332515 +0000 UTC m=+0.132263153 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, 
Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Dec 6 03:51:31 localhost podman[74401]: 2025-12-06 08:51:31.62288571 +0000 UTC m=+0.156520404 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T22:51:28Z, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 6 03:51:31 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:51:31 localhost podman[74402]: 2025-12-06 08:51:31.650186283 +0000 UTC m=+0.178116911 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 6 03:51:31 localhost systemd[1]: tmp-crun.0cHBdA.mount: Deactivated successfully. Dec 6 03:51:31 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. 
Dec 6 03:51:31 localhost podman[74403]: 2025-12-06 08:51:31.673133394 +0000 UTC m=+0.199658259 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ovn_controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, url=https://www.redhat.com, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.) Dec 6 03:51:31 localhost podman[74403]: 2025-12-06 08:51:31.724164943 +0000 UTC m=+0.250689748 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-ovn-controller) Dec 6 03:51:31 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:51:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:51:35 localhost systemd[1]: tmp-crun.z1r1ZW.mount: Deactivated successfully. 
Dec 6 03:51:35 localhost podman[74470]: 2025-12-06 08:51:35.545074884 +0000 UTC m=+0.079620845 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, distribution-scope=public) Dec 6 03:51:35 localhost podman[74470]: 2025-12-06 08:51:35.554107953 +0000 UTC m=+0.088653914 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:51:35 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:51:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:51:38 localhost podman[74490]: 2025-12-06 08:51:38.527275237 +0000 UTC m=+0.067570742 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:51:38 localhost podman[74490]: 2025-12-06 08:51:38.577293214 +0000 UTC m=+0.117588699 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, config_id=tripleo_step5, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:51:38 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:51:56 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:51:56 localhost recover_tripleo_nova_virtqemud[74536]: 51836 Dec 6 03:51:56 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:51:56 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:51:56 localhost systemd[1]: tmp-crun.u969Iz.mount: Deactivated successfully. 
Dec 6 03:51:56 localhost podman[74517]: 2025-12-06 08:51:56.58402879 +0000 UTC m=+0.103530084 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:51:56 localhost podman[74517]: 2025-12-06 08:51:56.634302745 +0000 UTC m=+0.153804029 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 6 03:51:56 localhost podman[74516]: 2025-12-06 08:51:56.663594571 +0000 UTC m=+0.185518190 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:51:56 localhost podman[74519]: 2025-12-06 08:51:56.709698298 +0000 UTC m=+0.222653890 container health_status 
f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1761123044, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) Dec 6 03:51:56 localhost podman[74516]: 2025-12-06 08:51:56.744444272 +0000 UTC m=+0.266367861 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:51:56 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:51:56 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:51:56 localhost podman[74518]: 2025-12-06 08:51:56.8293614 +0000 UTC m=+0.344266322 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1) Dec 6 03:51:56 localhost podman[74518]: 2025-12-06 08:51:56.882285917 +0000 UTC m=+0.397190859 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Dec 6 03:51:56 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:51:56 localhost podman[74519]: 2025-12-06 08:51:56.913248885 +0000 UTC m=+0.426204487 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step1, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:51:56 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:51:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:51:58 localhost systemd[1]: tmp-crun.q9ffai.mount: Deactivated successfully. 
Dec 6 03:51:58 localhost podman[74619]: 2025-12-06 08:51:58.546255837 +0000 UTC m=+0.079607024 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, 
build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 6 03:51:58 localhost podman[74619]: 2025-12-06 08:51:58.94328405 +0000 UTC m=+0.476635217 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target) Dec 6 03:51:58 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:52:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:52:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:52:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:52:02 localhost podman[74646]: 2025-12-06 08:52:02.561887093 +0000 UTC m=+0.085689332 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, version=17.1.12, url=https://www.redhat.com, container_name=ovn_metadata_agent, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:52:02 localhost podman[74646]: 2025-12-06 08:52:02.61317836 +0000 UTC m=+0.136980599 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:14:25Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:52:02 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Deactivated successfully. Dec 6 03:52:02 localhost podman[74645]: 2025-12-06 08:52:02.616219504 +0000 UTC m=+0.142413857 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.) 
Dec 6 03:52:02 localhost podman[74647]: 2025-12-06 08:52:02.669278715 +0000 UTC m=+0.188209403 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vendor=Red Hat, Inc., container_name=ovn_controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:52:02 localhost podman[74645]: 2025-12-06 08:52:02.7023806 +0000 UTC m=+0.228574963 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, release=1761123044, vcs-type=git, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container) Dec 6 03:52:02 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:52:02 localhost podman[74647]: 2025-12-06 08:52:02.750406745 +0000 UTC m=+0.269337433 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Dec 6 03:52:02 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Deactivated successfully. Dec 6 03:52:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:52:06 localhost podman[74715]: 2025-12-06 08:52:06.550948418 +0000 UTC m=+0.080119860 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, 
name=rhosp17/openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Dec 6 03:52:06 localhost podman[74715]: 2025-12-06 08:52:06.590322756 +0000 UTC m=+0.119494208 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, release=1761123044, com.redhat.component=openstack-iscsid-container, container_name=iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:52:06 localhost systemd[1]: 
3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:52:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:52:09 localhost podman[74734]: 2025-12-06 08:52:09.547791606 +0000 UTC m=+0.083864766 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, container_name=nova_compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:52:09 localhost podman[74734]: 2025-12-06 08:52:09.579417844 +0000 UTC m=+0.115490974 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, container_name=nova_compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute) Dec 6 03:52:09 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:52:27 localhost podman[74760]: 2025-12-06 08:52:27.559484163 +0000 UTC m=+0.085745634 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, release=1761123044, maintainer=OpenStack TripleO Team) Dec 6 03:52:27 localhost podman[74761]: 2025-12-06 08:52:27.618404545 +0000 UTC m=+0.141239240 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z) Dec 6 03:52:27 localhost podman[74762]: 2025-12-06 08:52:27.673454899 +0000 UTC m=+0.194050164 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, architecture=x86_64) Dec 6 03:52:27 
localhost podman[74760]: 2025-12-06 08:52:27.698359159 +0000 UTC m=+0.224620620 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=logrotate_crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z) Dec 6 03:52:27 localhost podman[74762]: 2025-12-06 08:52:27.698117812 +0000 UTC m=+0.218713087 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, architecture=x86_64, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible) Dec 6 03:52:27 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:52:27 localhost podman[74763]: 2025-12-06 08:52:27.714167488 +0000 UTC m=+0.231507163 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1) Dec 6 03:52:27 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:52:27 localhost podman[74761]: 2025-12-06 08:52:27.802177281 +0000 UTC m=+0.325011986 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible) Dec 6 03:52:27 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:52:27 localhost podman[74763]: 2025-12-06 08:52:27.929317054 +0000 UTC m=+0.446656729 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=) Dec 6 03:52:27 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:52:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:52:29 localhost podman[74861]: 2025-12-06 08:52:29.540879263 +0000 UTC m=+0.076372074 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044) Dec 6 03:52:30 localhost podman[74861]: 2025-12-06 08:52:30.022335578 +0000 UTC m=+0.557828359 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, container_name=nova_migration_target, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git) Dec 6 03:52:30 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:52:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:52:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:52:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:52:33 localhost podman[74889]: 2025-12-06 08:52:33.581034259 +0000 UTC m=+0.106338821 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, container_name=ovn_controller, io.buildah.version=1.41.4, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, vcs-type=git) Dec 6 03:52:33 localhost podman[74889]: 2025-12-06 08:52:33.628359142 +0000 UTC m=+0.153663704 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 03:52:33 localhost podman[74889]: unhealthy Dec 6 03:52:33 localhost podman[74887]: 2025-12-06 08:52:33.640455687 +0000 UTC m=+0.172848229 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container) Dec 6 03:52:33 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:52:33 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 03:52:33 localhost podman[74887]: 2025-12-06 08:52:33.654228773 +0000 UTC m=+0.186621355 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, build-date=2025-11-18T22:51:28Z, 
url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:52:33 localhost podman[74888]: 2025-12-06 08:52:33.610381666 +0000 UTC m=+0.141585141 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, distribution-scope=public, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 6 03:52:33 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:52:33 localhost podman[74888]: 2025-12-06 08:52:33.693030844 +0000 UTC m=+0.224234279 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 6 03:52:33 localhost podman[74888]: unhealthy Dec 6 03:52:33 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:52:33 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 03:52:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:52:37 localhost systemd[1]: tmp-crun.5Dn3n3.mount: Deactivated successfully. 
Dec 6 03:52:37 localhost podman[74945]: 2025-12-06 08:52:37.547945177 +0000 UTC m=+0.083212365 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, com.redhat.component=openstack-iscsid-container, container_name=iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, 
url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:52:37 localhost podman[74945]: 2025-12-06 08:52:37.558896866 +0000 UTC m=+0.094164054 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team) Dec 6 03:52:37 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:52:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:52:40 localhost systemd[1]: tmp-crun.OfyzSE.mount: Deactivated successfully. 
Dec 6 03:52:40 localhost podman[74964]: 2025-12-06 08:52:40.544477863 +0000 UTC m=+0.075068562 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z) Dec 6 03:52:40 localhost podman[74964]: 2025-12-06 08:52:40.579411214 +0000 UTC m=+0.110001923 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=nova_compute, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-nova-compute-container) Dec 6 03:52:40 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:52:43 localhost sshd[74990]: main: sshd: ssh-rsa algorithm is disabled Dec 6 03:52:43 localhost sshd[74992]: main: sshd: ssh-rsa algorithm is disabled Dec 6 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:52:58 localhost podman[74993]: 2025-12-06 08:52:58.551918338 +0000 UTC m=+0.085114314 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 6 03:52:58 localhost podman[74993]: 2025-12-06 08:52:58.584647911 +0000 UTC m=+0.117843897 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, architecture=x86_64, distribution-scope=public, 
name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12) Dec 6 03:52:58 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:52:58 localhost podman[74996]: 2025-12-06 08:52:58.664833192 +0000 UTC m=+0.193162907 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, vendor=Red Hat, Inc.) Dec 6 03:52:58 localhost podman[74995]: 2025-12-06 08:52:58.723700943 +0000 UTC m=+0.256052373 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:52:58 localhost podman[74994]: 2025-12-06 08:52:58.772491463 +0000 UTC m=+0.304389849 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z) Dec 6 03:52:58 localhost podman[74994]: 2025-12-06 08:52:58.798115695 +0000 UTC m=+0.330014071 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, 
tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, version=17.1.12, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4) Dec 6 03:52:58 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:52:58 localhost podman[74995]: 2025-12-06 08:52:58.852757266 +0000 UTC m=+0.385108696 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, version=17.1.12, vcs-type=git, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:52:58 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:52:58 localhost podman[74996]: 2025-12-06 08:52:58.894451266 +0000 UTC m=+0.422780941 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.12) Dec 6 03:52:58 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:52:59 localhost systemd[1]: tmp-crun.d43YVH.mount: Deactivated successfully. Dec 6 03:53:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:53:00 localhost systemd[1]: tmp-crun.AnM8La.mount: Deactivated successfully. Dec 6 03:53:00 localhost podman[75091]: 2025-12-06 08:53:00.552951557 +0000 UTC m=+0.082559056 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z) Dec 6 03:53:00 localhost podman[75091]: 2025-12-06 08:53:00.92995065 +0000 UTC m=+0.459558159 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:53:00 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:53:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:53:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:53:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:53:04 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:53:04 localhost recover_tripleo_nova_virtqemud[75128]: 51836 Dec 6 03:53:04 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:53:04 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:53:04 localhost podman[75114]: 2025-12-06 08:53:04.560511354 +0000 UTC m=+0.088201739 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:53:04 localhost podman[75114]: 2025-12-06 08:53:04.569219613 +0000 UTC m=+0.096909958 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, 
name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:53:04 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:53:04 localhost podman[75115]: 2025-12-06 08:53:04.61466506 +0000 UTC m=+0.138450195 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public) Dec 6 03:53:04 localhost podman[75116]: 2025-12-06 08:53:04.662597483 +0000 UTC m=+0.183024334 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 
17.1_20251118.1, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:53:04 localhost podman[75115]: 2025-12-06 08:53:04.678562316 +0000 UTC m=+0.202347461 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, batch=17.1_20251118.1) Dec 6 03:53:04 localhost podman[75115]: unhealthy Dec 6 03:53:04 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:53:04 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 03:53:04 localhost podman[75116]: 2025-12-06 08:53:04.731547785 +0000 UTC m=+0.251974616 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, distribution-scope=public, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', 
'/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git) Dec 6 03:53:04 localhost podman[75116]: unhealthy Dec 6 03:53:04 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:53:04 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 03:53:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:53:08 localhost systemd[1]: tmp-crun.XplkAc.mount: Deactivated successfully. Dec 6 03:53:08 localhost podman[75176]: 2025-12-06 08:53:08.552042777 +0000 UTC m=+0.086837648 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, name=rhosp17/openstack-iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:53:08 localhost podman[75176]: 2025-12-06 08:53:08.565268536 +0000 UTC m=+0.100063417 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, container_name=iscsid, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:53:08 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:53:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:53:11 localhost systemd[1]: tmp-crun.KVsi9F.mount: Deactivated successfully. Dec 6 03:53:11 localhost podman[75196]: 2025-12-06 08:53:11.5500614 +0000 UTC m=+0.085939559 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, maintainer=OpenStack TripleO Team) Dec 6 03:53:11 localhost podman[75196]: 2025-12-06 
08:53:11.581233325 +0000 UTC m=+0.117111484 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, tcib_managed=true, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git) Dec 6 03:53:11 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:53:29 localhost systemd[1]: tmp-crun.75Sc0k.mount: Deactivated successfully. Dec 6 03:53:29 localhost podman[75224]: 2025-12-06 08:53:29.544153538 +0000 UTC m=+0.072655434 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.12, tcib_managed=true, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com) Dec 6 03:53:29 localhost podman[75222]: 2025-12-06 08:53:29.595820724 +0000 UTC m=+0.125986671 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Dec 6 03:53:29 localhost systemd[1]: tmp-crun.4yXdYE.mount: Deactivated successfully. 
Dec 6 03:53:29 localhost podman[75224]: 2025-12-06 08:53:29.600694734 +0000 UTC m=+0.129196640 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible) Dec 6 03:53:29 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:53:29 localhost podman[75225]: 2025-12-06 08:53:29.571178293 +0000 UTC m=+0.090797194 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr) Dec 6 03:53:29 localhost podman[75222]: 2025-12-06 08:53:29.635162348 +0000 UTC m=+0.165328285 container exec_died 
23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4) Dec 6 03:53:29 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:53:29 localhost podman[75223]: 2025-12-06 08:53:29.708819022 +0000 UTC m=+0.234403507 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12) Dec 6 03:53:29 localhost podman[75223]: 2025-12-06 08:53:29.738359564 +0000 UTC m=+0.263944089 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, container_name=ceilometer_agent_ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team) Dec 6 03:53:29 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated 
successfully. Dec 6 03:53:29 localhost podman[75225]: 2025-12-06 08:53:29.76220745 +0000 UTC m=+0.281826351 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:53:29 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:53:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:53:31 localhost systemd[1]: tmp-crun.5YvVWR.mount: Deactivated successfully. 
Dec 6 03:53:31 localhost podman[75319]: 2025-12-06 08:53:31.540935047 +0000 UTC m=+0.078310628 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute, tcib_managed=true, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4) Dec 6 03:53:31 localhost podman[75319]: 2025-12-06 08:53:31.906611018 +0000 UTC m=+0.443986649 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:53:31 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:53:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:53:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:53:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:53:35 localhost systemd[1]: tmp-crun.cNrRPV.mount: Deactivated successfully. Dec 6 03:53:35 localhost podman[75342]: 2025-12-06 08:53:35.541613886 +0000 UTC m=+0.074449059 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com) Dec 6 03:53:35 localhost podman[75342]: 2025-12-06 08:53:35.551055898 +0000 UTC m=+0.083891101 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64) Dec 6 03:53:35 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:53:35 localhost systemd[1]: tmp-crun.plXztP.mount: Deactivated successfully. 
Dec 6 03:53:35 localhost podman[75344]: 2025-12-06 08:53:35.592065684 +0000 UTC m=+0.118993685 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ovn_controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 6 03:53:35 localhost podman[75344]: 2025-12-06 08:53:35.628084996 +0000 UTC m=+0.155013007 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:53:35 localhost podman[75343]: 2025-12-06 08:53:35.643670378 +0000 UTC m=+0.170705132 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 6 03:53:35 localhost podman[75343]: 2025-12-06 08:53:35.660181838 +0000 UTC m=+0.187216622 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, url=https://www.redhat.com, config_id=tripleo_step4, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:53:35 localhost podman[75343]: unhealthy Dec 6 03:53:35 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:53:35 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 03:53:35 localhost podman[75344]: unhealthy Dec 6 03:53:35 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:53:35 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 03:53:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:53:39 localhost podman[75401]: 2025-12-06 08:53:39.53498263 +0000 UTC m=+0.067214846 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:53:39 localhost podman[75401]: 2025-12-06 08:53:39.567933288 +0000 UTC m=+0.100165474 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=iscsid, config_id=tripleo_step3, architecture=x86_64) Dec 6 03:53:39 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:53:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:53:42 localhost podman[75421]: 2025-12-06 08:53:42.545756886 +0000 UTC m=+0.076428521 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step5, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12) Dec 6 03:53:42 localhost podman[75421]: 2025-12-06 08:53:42.597297297 +0000 UTC m=+0.127968962 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, container_name=nova_compute) Dec 6 03:53:42 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:54:00 localhost podman[75448]: 2025-12-06 08:54:00.547305554 +0000 UTC m=+0.074162991 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:54:00 localhost podman[75449]: 2025-12-06 08:54:00.608285626 +0000 UTC m=+0.134971998 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4) Dec 6 03:54:00 localhost podman[75455]: 2025-12-06 08:54:00.576878346 +0000 UTC m=+0.096811059 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
name=rhosp17/openstack-qdrouterd, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, tcib_managed=true, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4) Dec 6 03:54:00 localhost podman[75449]: 2025-12-06 08:54:00.632226275 +0000 UTC m=+0.158912727 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:11:48Z, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:54:00 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:54:00 localhost podman[75448]: 2025-12-06 08:54:00.682725495 +0000 UTC m=+0.209582982 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64) Dec 6 03:54:00 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:54:00 localhost podman[75447]: 2025-12-06 08:54:00.761271709 +0000 UTC m=+0.293988677 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, vendor=Red Hat, Inc.) 
Dec 6 03:54:00 localhost podman[75447]: 2025-12-06 08:54:00.774314533 +0000 UTC m=+0.307031481 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z) Dec 6 03:54:00 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:54:00 localhost podman[75455]: 2025-12-06 08:54:00.82704816 +0000 UTC m=+0.346980813 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, container_name=metrics_qdr, version=17.1.12) Dec 6 03:54:00 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:54:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:54:02 localhost podman[75544]: 2025-12-06 08:54:02.557244189 +0000 UTC m=+0.091090294 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Dec 6 03:54:02 localhost podman[75544]: 2025-12-06 08:54:02.954185585 +0000 UTC m=+0.488031690 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team) Dec 6 03:54:02 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:54:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:54:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:54:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:54:06 localhost systemd[1]: tmp-crun.pVgaac.mount: Deactivated successfully. Dec 6 03:54:06 localhost podman[75567]: 2025-12-06 08:54:06.569122073 +0000 UTC m=+0.098124600 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp17/openstack-collectd) Dec 6 03:54:06 localhost podman[75568]: 2025-12-06 08:54:06.61110972 +0000 UTC m=+0.134877285 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, 
architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:54:06 localhost podman[75567]: 2025-12-06 08:54:06.616325161 +0000 UTC m=+0.145327698 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, tcib_managed=true, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step3) Dec 6 03:54:06 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:54:06 localhost podman[75568]: 2025-12-06 08:54:06.695556747 +0000 UTC m=+0.219324312 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true) Dec 6 03:54:06 localhost podman[75568]: unhealthy Dec 6 03:54:06 localhost podman[75569]: 2025-12-06 08:54:06.705436812 +0000 UTC m=+0.227538656 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, version=17.1.12, url=https://www.redhat.com, release=1761123044, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:54:06 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:54:06 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 03:54:06 localhost podman[75569]: 2025-12-06 08:54:06.770017726 +0000 UTC m=+0.292119530 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, release=1761123044, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:54:06 localhost podman[75569]: unhealthy Dec 6 03:54:06 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:54:06 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 03:54:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:54:10 localhost podman[75625]: 2025-12-06 08:54:10.539010851 +0000 UTC m=+0.074449649 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, architecture=x86_64) Dec 6 03:54:10 localhost podman[75625]: 2025-12-06 08:54:10.579404559 +0000 UTC m=+0.114843377 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64) Dec 6 03:54:10 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:54:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:54:13 localhost systemd[1]: tmp-crun.ycDTWm.mount: Deactivated successfully. Dec 6 03:54:13 localhost podman[75645]: 2025-12-06 08:54:13.560749106 +0000 UTC m=+0.091737444 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Dec 6 03:54:13 localhost podman[75645]: 2025-12-06 08:54:13.614727492 +0000 UTC m=+0.145715810 container exec_died 
3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, url=https://www.redhat.com, container_name=nova_compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc.) Dec 6 03:54:13 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:54:27 localhost sshd[75671]: main: sshd: ssh-rsa algorithm is disabled Dec 6 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:54:31 localhost podman[75673]: 2025-12-06 08:54:31.537076345 +0000 UTC m=+0.068156086 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 
17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, tcib_managed=true) Dec 6 03:54:31 localhost systemd[1]: tmp-crun.oFPReM.mount: Deactivated successfully. Dec 6 03:54:31 localhost podman[75681]: 2025-12-06 08:54:31.567915107 +0000 UTC m=+0.087160242 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible) Dec 6 03:54:31 localhost podman[75673]: 2025-12-06 08:54:31.618501079 +0000 UTC m=+0.149580840 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, 
io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, url=https://www.redhat.com) Dec 6 03:54:31 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:54:31 localhost podman[75680]: 2025-12-06 08:54:31.639973811 +0000 UTC m=+0.158794843 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:11:48Z, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:54:31 localhost podman[75680]: 2025-12-06 08:54:31.694427293 +0000 UTC m=+0.213248355 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git) Dec 6 03:54:31 localhost podman[75674]: 2025-12-06 08:54:31.723103218 +0000 UTC m=+0.246557243 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, 
konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:54:31 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:54:31 localhost podman[75674]: 2025-12-06 08:54:31.777238469 +0000 UTC m=+0.300692484 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, vcs-type=git, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:54:31 localhost podman[75681]: 2025-12-06 08:54:31.784086711 +0000 UTC m=+0.303331836 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, config_id=tripleo_step1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 
qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:54:31 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:54:31 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:54:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:54:33 localhost podman[75773]: 2025-12-06 08:54:33.54734661 +0000 UTC m=+0.078707441 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:54:33 localhost podman[75773]: 2025-12-06 08:54:33.913263988 +0000 UTC m=+0.444624759 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_migration_target) Dec 6 03:54:33 localhost systemd[1]: 
d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:54:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:54:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:54:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:54:37 localhost systemd[1]: tmp-crun.7g7p0l.mount: Deactivated successfully. Dec 6 03:54:37 localhost podman[75798]: 2025-12-06 08:54:37.606079132 +0000 UTC m=+0.113368411 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:54:37 localhost podman[75797]: 2025-12-06 08:54:37.691386536 +0000 UTC m=+0.202628388 
container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.expose-services=, container_name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step3, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-collectd-container) Dec 6 03:54:37 localhost podman[75798]: 2025-12-06 08:54:37.70285679 +0000 UTC m=+0.210146069 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, container_name=ovn_metadata_agent, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true) Dec 6 03:54:37 localhost 
podman[75798]: unhealthy Dec 6 03:54:37 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:54:37 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 03:54:37 localhost podman[75797]: 2025-12-06 08:54:37.729613816 +0000 UTC m=+0.240855668 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:54:37 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:54:37 localhost podman[75799]: 2025-12-06 08:54:37.788641468 +0000 UTC m=+0.291651575 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, distribution-scope=public, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, 
architecture=x86_64, container_name=ovn_controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container) Dec 6 03:54:37 localhost podman[75799]: 2025-12-06 08:54:37.802329911 +0000 UTC m=+0.305340038 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, version=17.1.12, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, architecture=x86_64, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}) Dec 6 03:54:37 localhost podman[75799]: unhealthy Dec 6 03:54:37 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:54:37 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 03:54:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:54:41 localhost podman[75858]: 2025-12-06 08:54:41.553794115 +0000 UTC m=+0.088055959 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Dec 6 03:54:41 localhost podman[75858]: 2025-12-06 08:54:41.561172753 +0000 UTC m=+0.095434587 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3) Dec 6 03:54:41 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:54:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:54:44 localhost podman[75877]: 2025-12-06 08:54:44.548840946 +0000 UTC m=+0.079618089 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, architecture=x86_64) Dec 6 03:54:44 localhost podman[75877]: 2025-12-06 08:54:44.583344601 +0000 UTC m=+0.114121734 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, tcib_managed=true, batch=17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, version=17.1.12, name=rhosp17/openstack-nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, release=1761123044) Dec 6 03:54:44 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:54:58 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:54:58 localhost recover_tripleo_nova_virtqemud[75902]: 51836 Dec 6 03:54:58 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:54:58 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:55:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:55:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:55:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:55:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:55:02 localhost podman[75903]: 2025-12-06 08:55:02.558744161 +0000 UTC m=+0.090249677 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, container_name=logrotate_crond, batch=17.1_20251118.1, name=rhosp17/openstack-cron, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron) Dec 6 03:55:02 localhost podman[75903]: 2025-12-06 08:55:02.567597426 +0000 UTC m=+0.099102992 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=) Dec 6 03:55:02 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:55:02 localhost podman[75905]: 2025-12-06 08:55:02.630945561 +0000 UTC m=+0.154041727 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, container_name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc.) Dec 6 03:55:02 localhost podman[75905]: 2025-12-06 08:55:02.660215315 +0000 UTC m=+0.183311461 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:55:02 localhost systemd[1]: tmp-crun.MtSgen.mount: Deactivated successfully. Dec 6 03:55:02 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:55:02 localhost podman[75911]: 2025-12-06 08:55:02.673765063 +0000 UTC m=+0.193877967 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, vcs-type=git, vendor=Red Hat, Inc.) Dec 6 03:55:02 localhost podman[75904]: 2025-12-06 08:55:02.715049908 +0000 UTC m=+0.242851979 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:55:02 localhost podman[75904]: 2025-12-06 08:55:02.7433112 +0000 UTC m=+0.271113241 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=ceilometer_agent_ipmi, version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:55:02 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:55:02 localhost podman[75911]: 2025-12-06 08:55:02.856213647 +0000 UTC m=+0.376326531 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Dec 6 03:55:02 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:55:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:55:04 localhost podman[76003]: 2025-12-06 08:55:04.560277928 +0000 UTC m=+0.088731111 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=nova_migration_target, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:55:04 localhost podman[76003]: 2025-12-06 08:55:04.932224772 +0000 UTC m=+0.460677965 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:55:04 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:55:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:55:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:55:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:55:08 localhost podman[76028]: 2025-12-06 08:55:08.566953252 +0000 UTC m=+0.089731721 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git) Dec 6 03:55:08 localhost podman[76028]: 2025-12-06 08:55:08.585322519 +0000 UTC m=+0.108100989 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', 
'/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044) Dec 6 03:55:08 localhost podman[76028]: unhealthy Dec 6 03:55:08 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:55:08 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 03:55:08 localhost podman[76029]: 2025-12-06 08:55:08.672337716 +0000 UTC m=+0.189798941 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, build-date=2025-11-18T23:34:05Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc.) 
Dec 6 03:55:08 localhost podman[76027]: 2025-12-06 08:55:08.541577049 +0000 UTC m=+0.069485617 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, container_name=collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:55:08 localhost podman[76029]: 2025-12-06 08:55:08.716256341 +0000 UTC m=+0.233717556 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Dec 6 03:55:08 localhost podman[76029]: unhealthy Dec 6 03:55:08 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:55:08 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 03:55:08 localhost podman[76027]: 2025-12-06 08:55:08.729265443 +0000 UTC m=+0.257174011 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, 
build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true) Dec 6 03:55:08 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:55:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:55:12 localhost podman[76086]: 2025-12-06 08:55:12.55626911 +0000 UTC m=+0.083207561 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, release=1761123044, io.openshift.expose-services=, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:55:12 localhost podman[76086]: 2025-12-06 08:55:12.564891176 +0000 UTC m=+0.091829637 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Dec 6 03:55:12 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:55:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:55:15 localhost podman[76104]: 2025-12-06 08:55:15.545671286 +0000 UTC m=+0.079399322 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, container_name=nova_compute, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Dec 6 03:55:15 localhost podman[76104]: 2025-12-06 08:55:15.577168769 +0000 UTC m=+0.110896875 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 6 03:55:15 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:55:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:55:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:55:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:55:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:55:33 localhost systemd[1]: tmp-crun.CYZ65R.mount: Deactivated successfully. 
Dec 6 03:55:33 localhost podman[76132]: 2025-12-06 08:55:33.536524204 +0000 UTC m=+0.064118580 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute) Dec 6 03:55:33 localhost podman[76135]: 2025-12-06 08:55:33.593368529 +0000 UTC m=+0.114250398 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step1) Dec 6 03:55:33 localhost podman[76130]: 2025-12-06 08:55:33.614991386 +0000 UTC m=+0.145972527 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, 
build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public) Dec 6 03:55:33 localhost podman[76132]: 2025-12-06 08:55:33.623180849 +0000 UTC m=+0.150775185 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, architecture=x86_64, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12) Dec 6 03:55:33 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:55:33 localhost podman[76130]: 2025-12-06 08:55:33.677946231 +0000 UTC m=+0.208927382 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, tcib_managed=true, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1761123044, com.redhat.component=openstack-cron-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:55:33 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:55:33 localhost podman[76131]: 2025-12-06 08:55:33.765210495 +0000 UTC m=+0.291771320 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1) Dec 6 03:55:33 localhost podman[76131]: 2025-12-06 08:55:33.812656709 +0000 UTC m=+0.339217544 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc.) 
Dec 6 03:55:33 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:55:33 localhost podman[76135]: 2025-12-06 08:55:33.843250504 +0000 UTC m=+0.364132403 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=metrics_qdr, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=) Dec 6 03:55:33 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:55:34 localhost systemd[1]: tmp-crun.yD6gnb.mount: Deactivated successfully. Dec 6 03:55:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:55:35 localhost podman[76230]: 2025-12-06 08:55:35.544411316 +0000 UTC m=+0.075577674 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, architecture=x86_64, release=1761123044) Dec 6 03:55:35 localhost podman[76230]: 2025-12-06 08:55:35.930790086 +0000 UTC m=+0.461956414 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-nova-compute-container) Dec 6 03:55:35 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:55:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:55:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:55:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:55:39 localhost systemd[1]: tmp-crun.7YBPyj.mount: Deactivated successfully. 
Dec 6 03:55:39 localhost podman[76255]: 2025-12-06 08:55:39.561076409 +0000 UTC m=+0.090779733 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', 
'/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64) Dec 6 03:55:39 localhost podman[76255]: 2025-12-06 08:55:39.580105337 +0000 UTC m=+0.109808671 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', 
'/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12) Dec 6 03:55:39 localhost podman[76255]: unhealthy Dec 6 03:55:39 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:55:39 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 03:55:39 localhost podman[76254]: 2025-12-06 08:55:39.670317312 +0000 UTC m=+0.203773383 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Dec 6 03:55:39 localhost podman[76254]: 2025-12-06 08:55:39.683258952 +0000 UTC m=+0.216714993 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, 
name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, managed_by=tripleo_ansible) Dec 6 03:55:39 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:55:39 localhost podman[76256]: 2025-12-06 08:55:39.77198228 +0000 UTC m=+0.297142265 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:55:39 localhost podman[76256]: 2025-12-06 08:55:39.813206913 +0000 UTC m=+0.338366858 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=) Dec 6 03:55:39 localhost podman[76256]: unhealthy Dec 6 03:55:39 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:55:39 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 03:55:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:55:43 localhost systemd[1]: tmp-crun.sLLWe9.mount: Deactivated successfully. Dec 6 03:55:43 localhost podman[76316]: 2025-12-06 08:55:43.561046666 +0000 UTC m=+0.093902801 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:55:43 localhost podman[76316]: 2025-12-06 08:55:43.599284626 +0000 UTC m=+0.132140731 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, architecture=x86_64, io.buildah.version=1.41.4, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true) Dec 6 03:55:43 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:55:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:55:46 localhost podman[76333]: 2025-12-06 08:55:46.545610063 +0000 UTC m=+0.080223468 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:55:46 localhost podman[76333]: 2025-12-06 08:55:46.575349791 +0000 UTC m=+0.109963216 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=nova_compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64) Dec 6 03:55:46 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:56:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:56:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:56:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:56:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:56:04 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:56:04 localhost recover_tripleo_nova_virtqemud[76385]: 51836 Dec 6 03:56:04 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:56:04 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:56:04 localhost systemd[1]: tmp-crun.NcYgig.mount: Deactivated successfully. 
Dec 6 03:56:04 localhost podman[76361]: 2025-12-06 08:56:04.620587557 +0000 UTC m=+0.143014596 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Dec 6 03:56:04 localhost podman[76360]: 2025-12-06 08:56:04.586479624 +0000 UTC m=+0.110267165 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Dec 6 03:56:04 localhost podman[76360]: 2025-12-06 08:56:04.671357835 +0000 UTC m=+0.195145346 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi) Dec 6 03:56:04 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:56:04 localhost podman[76367]: 2025-12-06 08:56:04.685910465 +0000 UTC m=+0.203973529 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, container_name=metrics_qdr, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true) Dec 6 03:56:04 localhost podman[76361]: 2025-12-06 08:56:04.709360778 +0000 UTC m=+0.231787817 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, version=17.1.12, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1) Dec 6 03:56:04 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:56:04 localhost podman[76359]: 2025-12-06 08:56:04.767412281 +0000 UTC m=+0.296581638 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Dec 6 03:56:04 localhost podman[76359]: 2025-12-06 08:56:04.781139945 +0000 UTC m=+0.310309282 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64) Dec 6 03:56:04 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:56:04 localhost podman[76367]: 2025-12-06 08:56:04.922486298 +0000 UTC m=+0.440549362 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, version=17.1.12, vendor=Red Hat, 
Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, release=1761123044, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true) Dec 6 03:56:04 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:56:05 localhost systemd[1]: tmp-crun.Wjt3ln.mount: Deactivated successfully. Dec 6 03:56:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:56:06 localhost podman[76463]: 2025-12-06 08:56:06.550127071 +0000 UTC m=+0.081339952 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, container_name=nova_migration_target, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4) Dec 6 03:56:06 localhost podman[76463]: 2025-12-06 08:56:06.941328009 +0000 UTC m=+0.472540890 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, 
container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:56:06 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:56:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:56:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:56:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:56:10 localhost podman[76487]: 2025-12-06 08:56:10.562254373 +0000 UTC m=+0.090423342 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 6 03:56:10 localhost podman[76487]: 2025-12-06 08:56:10.606306364 +0000 UTC m=+0.134475323 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, version=17.1.12, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:56:10 localhost podman[76487]: unhealthy Dec 6 03:56:10 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:56:10 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 03:56:10 localhost systemd[1]: tmp-crun.qfT3ex.mount: Deactivated successfully. 
Dec 6 03:56:10 localhost podman[76488]: 2025-12-06 08:56:10.723545193 +0000 UTC m=+0.247734320 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 6 03:56:10 localhost podman[76488]: 2025-12-06 08:56:10.768205912 +0000 UTC m=+0.292395099 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, release=1761123044, url=https://www.redhat.com, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.expose-services=) Dec 6 03:56:10 localhost podman[76488]: unhealthy Dec 6 03:56:10 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:56:10 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 03:56:10 localhost podman[76486]: 2025-12-06 08:56:10.825719807 +0000 UTC m=+0.354348491 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step3) Dec 6 03:56:10 localhost podman[76486]: 2025-12-06 08:56:10.836370627 +0000 UTC m=+0.364999310 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
container_name=collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:56:10 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:56:11 localhost systemd[1]: tmp-crun.EdZX4T.mount: Deactivated successfully. 
Dec 6 03:56:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:56:14 localhost podman[76546]: 2025-12-06 08:56:14.541105969 +0000 UTC m=+0.073754099 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, version=17.1.12, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:56:14 localhost podman[76546]: 2025-12-06 08:56:14.548808966 +0000 UTC m=+0.081457076 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=iscsid, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step3, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, 
com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:56:14 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:56:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:56:17 localhost systemd[1]: tmp-crun.xSP9xn.mount: Deactivated successfully. 
Dec 6 03:56:17 localhost podman[76565]: 2025-12-06 08:56:17.559555561 +0000 UTC m=+0.094968853 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.openshift.expose-services=) Dec 6 03:56:17 localhost podman[76565]: 2025-12-06 08:56:17.59220605 +0000 UTC m=+0.127619382 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, 
vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, container_name=nova_compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:56:17 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:56:35 localhost podman[76592]: 2025-12-06 08:56:35.567232889 +0000 UTC m=+0.090768163 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:56:35 localhost systemd[1]: tmp-crun.fvsnrf.mount: Deactivated successfully. Dec 6 03:56:35 localhost podman[76593]: 2025-12-06 08:56:35.616605543 +0000 UTC m=+0.136874597 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:56:35 localhost podman[76596]: 2025-12-06 08:56:35.669028912 +0000 UTC m=+0.186665614 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, 
version=17.1.12, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', 
'/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:56:35 localhost podman[76593]: 2025-12-06 08:56:35.692371712 +0000 UTC m=+0.212640746 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Dec 6 03:56:35 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:56:35 localhost podman[76591]: 2025-12-06 08:56:35.776785779 +0000 UTC m=+0.304018148 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=logrotate_crond, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 6 03:56:35 localhost podman[76591]: 2025-12-06 08:56:35.783827235 +0000 UTC m=+0.311059675 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.) 
Dec 6 03:56:35 localhost podman[76592]: 2025-12-06 08:56:35.794943789 +0000 UTC m=+0.318479023 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 
17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:56:35 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:56:35 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:56:35 localhost podman[76596]: 2025-12-06 08:56:35.874717882 +0000 UTC m=+0.392354604 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
com.redhat.component=openstack-qdrouterd-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.12, url=https://www.redhat.com) Dec 6 03:56:35 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 03:56:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:56:37 localhost systemd[1]: tmp-crun.41oWCB.mount: Deactivated successfully. Dec 6 03:56:37 localhost podman[76690]: 2025-12-06 08:56:37.556373502 +0000 UTC m=+0.090366591 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, container_name=nova_migration_target, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044) Dec 6 03:56:37 localhost podman[76690]: 2025-12-06 08:56:37.935304552 +0000 UTC m=+0.469297601 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, container_name=nova_migration_target, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1) Dec 6 03:56:37 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:56:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 03:56:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:56:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:56:41 localhost podman[76715]: 2025-12-06 08:56:41.544307728 +0000 UTC m=+0.078854766 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1) Dec 6 03:56:41 localhost systemd[1]: tmp-crun.DP5nIq.mount: Deactivated successfully. 
Dec 6 03:56:41 localhost podman[76716]: 2025-12-06 08:56:41.611115471 +0000 UTC m=+0.141972955 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public) Dec 6 03:56:41 localhost podman[76716]: 2025-12-06 08:56:41.651330552 +0000 UTC m=+0.182188086 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Dec 6 03:56:41 localhost podman[76716]: unhealthy Dec 6 03:56:41 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:56:41 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 03:56:41 localhost podman[76715]: 2025-12-06 08:56:41.683590878 +0000 UTC m=+0.218137956 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 
'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Dec 6 03:56:41 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:56:41 localhost podman[76717]: 2025-12-06 08:56:41.660775313 +0000 UTC m=+0.187399457 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, 
io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Dec 6 03:56:41 localhost podman[76717]: 2025-12-06 08:56:41.741626189 +0000 UTC m=+0.268250293 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, distribution-scope=public) Dec 6 03:56:41 localhost podman[76717]: unhealthy Dec 6 03:56:41 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:56:41 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 03:56:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:56:45 localhost systemd[1]: tmp-crun.ZeM7Kp.mount: Deactivated successfully. 
Dec 6 03:56:45 localhost podman[76778]: 2025-12-06 08:56:45.534466601 +0000 UTC m=+0.071486197 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, tcib_managed=true, config_id=tripleo_step3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team) Dec 6 03:56:45 localhost podman[76778]: 2025-12-06 08:56:45.569323168 +0000 UTC m=+0.106342704 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team) Dec 6 03:56:45 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:56:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:56:48 localhost podman[76799]: 2025-12-06 08:56:48.558045182 +0000 UTC m=+0.092971421 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 6 03:56:48 localhost podman[76799]: 2025-12-06 08:56:48.588796192 +0000 UTC m=+0.123722411 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, 
architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:56:48 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:57:06 localhost systemd[1]: tmp-crun.Izq195.mount: Deactivated successfully. 
Dec 6 03:57:06 localhost podman[76825]: 2025-12-06 08:57:06.56117035 +0000 UTC m=+0.093883520 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, container_name=logrotate_crond, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Dec 6 03:57:06 localhost systemd[1]: tmp-crun.OSYkS7.mount: Deactivated successfully. Dec 6 03:57:06 localhost podman[76825]: 2025-12-06 08:57:06.623901406 +0000 UTC m=+0.156614566 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=logrotate_crond, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:57:06 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:57:06 localhost podman[76826]: 2025-12-06 08:57:06.603457745 +0000 UTC m=+0.129069176 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12) Dec 6 03:57:06 localhost podman[76833]: 2025-12-06 08:57:06.672681803 +0000 UTC m=+0.191732782 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, config_id=tripleo_step1, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044) Dec 6 03:57:06 localhost podman[76826]: 2025-12-06 08:57:06.686101127 +0000 UTC m=+0.211712548 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, distribution-scope=public) Dec 6 03:57:06 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:57:06 localhost podman[76827]: 2025-12-06 08:57:06.628945012 +0000 UTC m=+0.152016614 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute) Dec 6 03:57:06 localhost podman[76827]: 2025-12-06 08:57:06.764983962 +0000 UTC m=+0.288055604 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4) Dec 6 03:57:06 localhost systemd[1]: 
b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:57:06 localhost podman[76833]: 2025-12-06 08:57:06.893639225 +0000 UTC m=+0.412690264 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, distribution-scope=public) Dec 6 03:57:06 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:57:07 localhost systemd[1]: tmp-crun.yyFue7.mount: Deactivated successfully. Dec 6 03:57:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:57:08 localhost podman[76926]: 2025-12-06 08:57:08.541640876 +0000 UTC m=+0.072730707 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, container_name=nova_migration_target, batch=17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 03:57:08 localhost podman[76926]: 2025-12-06 08:57:08.933309368 +0000 UTC m=+0.464399169 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Dec 6 03:57:08 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:57:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:57:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. 
Dec 6 03:57:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:57:12 localhost systemd[1]: tmp-crun.fgBgVG.mount: Deactivated successfully. Dec 6 03:57:12 localhost podman[76947]: 2025-12-06 08:57:12.56129016 +0000 UTC m=+0.091830126 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, distribution-scope=public) Dec 6 03:57:12 localhost podman[76947]: 2025-12-06 08:57:12.56842666 +0000 UTC m=+0.098966706 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, name=rhosp17/openstack-collectd, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044) Dec 6 03:57:12 localhost systemd[1]: 
01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:57:12 localhost podman[76948]: 2025-12-06 08:57:12.643024324 +0000 UTC m=+0.172198668 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 6 03:57:12 localhost podman[76948]: 2025-12-06 08:57:12.660115251 +0000 UTC m=+0.189289605 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, build-date=2025-11-19T00:14:25Z, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 6 03:57:12 localhost podman[76948]: unhealthy Dec 6 03:57:12 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:57:12 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 03:57:12 localhost podman[76949]: 2025-12-06 08:57:12.710308121 +0000 UTC m=+0.238956938 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, name=rhosp17/openstack-ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, build-date=2025-11-18T23:34:05Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Dec 6 03:57:12 localhost podman[76949]: 2025-12-06 08:57:12.746587831 +0000 UTC m=+0.275236648 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vendor=Red Hat, Inc.) 
Dec 6 03:57:12 localhost podman[76949]: unhealthy Dec 6 03:57:12 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:57:12 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 03:57:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:57:16 localhost podman[77008]: 2025-12-06 08:57:16.544911913 +0000 UTC m=+0.080354363 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 6 03:57:16 localhost podman[77008]: 2025-12-06 08:57:16.57852394 +0000 UTC m=+0.113966420 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:57:16 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. 
Dec 6 03:57:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:57:19 localhost podman[77028]: 2025-12-06 08:57:19.553968065 +0000 UTC m=+0.088264026 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z) Dec 6 03:57:19 localhost podman[77028]: 2025-12-06 08:57:19.603710191 +0000 UTC m=+0.138006102 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, description=Red Hat OpenStack 
Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute) Dec 6 03:57:19 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:57:37 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:57:37 localhost recover_tripleo_nova_virtqemud[77080]: 51836 Dec 6 03:57:37 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:57:37 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 6 03:57:37 localhost podman[77055]: 2025-12-06 08:57:37.563289204 +0000 UTC m=+0.093825158 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64) Dec 6 03:57:37 localhost podman[77055]: 2025-12-06 08:57:37.598295764 +0000 UTC m=+0.128831678 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4) Dec 6 03:57:37 localhost systemd[1]: tmp-crun.MZ8OVl.mount: Deactivated successfully. Dec 6 03:57:37 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:57:37 localhost podman[77058]: 2025-12-06 08:57:37.666710857 +0000 UTC m=+0.188690137 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_id=tripleo_step1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, container_name=metrics_qdr, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible) Dec 6 03:57:37 localhost podman[77056]: 2025-12-06 08:57:37.724065537 +0000 UTC m=+0.251635860 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi) Dec 6 03:57:37 localhost podman[77057]: 2025-12-06 08:57:37.635191183 +0000 UTC m=+0.160102094 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, architecture=x86_64, distribution-scope=public, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.openshift.expose-services=, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, 
config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:57:37 localhost podman[77057]: 2025-12-06 08:57:37.764252538 +0000 UTC m=+0.289163369 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true) Dec 6 03:57:37 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:57:37 localhost podman[77056]: 2025-12-06 08:57:37.774912027 +0000 UTC m=+0.302482330 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, release=1761123044) Dec 6 03:57:37 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:57:37 localhost podman[77058]: 2025-12-06 08:57:37.880460326 +0000 UTC m=+0.402439586 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64) Dec 6 03:57:37 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:57:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:57:39 localhost podman[77159]: 2025-12-06 08:57:39.557179964 +0000 UTC m=+0.090727912 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git) Dec 6 03:57:39 localhost podman[77159]: 2025-12-06 08:57:39.90720336 +0000 UTC m=+0.440751288 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Dec 6 03:57:39 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:57:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:57:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:57:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:57:43 localhost podman[77183]: 2025-12-06 08:57:43.564718335 +0000 UTC m=+0.087469642 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc.) 
Dec 6 03:57:43 localhost podman[77183]: 2025-12-06 08:57:43.585330951 +0000 UTC m=+0.108082268 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:57:43 localhost podman[77183]: unhealthy Dec 6 03:57:43 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:57:43 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 03:57:43 localhost podman[77184]: 2025-12-06 08:57:43.667531759 +0000 UTC m=+0.187773498 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, architecture=x86_64, 
batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 03:57:43 localhost podman[77184]: 2025-12-06 08:57:43.681671856 +0000 UTC m=+0.201913635 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:57:43 localhost podman[77184]: unhealthy Dec 6 03:57:43 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:57:43 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 03:57:43 localhost systemd[1]: tmp-crun.DGZDeA.mount: Deactivated successfully. 
Dec 6 03:57:43 localhost podman[77182]: 2025-12-06 08:57:43.778942348 +0000 UTC m=+0.304486921 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, container_name=collectd, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible) Dec 6 03:57:43 localhost podman[77182]: 2025-12-06 08:57:43.792253589 +0000 UTC m=+0.317798192 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-collectd, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1761123044, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 6 03:57:43 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:57:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:57:47 localhost podman[77242]: 2025-12-06 08:57:47.543002172 +0000 UTC m=+0.078107493 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 03:57:47 localhost podman[77242]: 2025-12-06 08:57:47.557489639 +0000 UTC m=+0.092594960 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, container_name=iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:57:47 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:57:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:57:50 localhost podman[77261]: 2025-12-06 08:57:50.535198094 +0000 UTC m=+0.073095707 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, container_name=nova_compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64) Dec 6 03:57:50 localhost podman[77261]: 2025-12-06 08:57:50.586182588 +0000 UTC m=+0.124080181 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': 
{'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, release=1761123044, 
com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Dec 6 03:57:50 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:58:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:58:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:58:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:58:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:58:08 localhost systemd[1]: tmp-crun.IRWOV0.mount: Deactivated successfully. 
Dec 6 03:58:08 localhost podman[77287]: 2025-12-06 08:58:08.553105598 +0000 UTC m=+0.083617983 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, config_id=tripleo_step4, vcs-type=git, container_name=logrotate_crond, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64) Dec 6 03:58:08 localhost podman[77287]: 2025-12-06 08:58:08.583772205 +0000 UTC m=+0.114284570 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step4) Dec 6 03:58:08 localhost podman[77289]: 2025-12-06 08:58:08.59429837 +0000 UTC m=+0.118187361 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.expose-services=) Dec 6 03:58:08 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:58:08 localhost podman[77289]: 2025-12-06 08:58:08.61635836 +0000 UTC m=+0.140247341 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.) Dec 6 03:58:08 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:58:08 localhost podman[77291]: 2025-12-06 08:58:08.629237228 +0000 UTC m=+0.155067818 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 03:58:08 localhost podman[77288]: 2025-12-06 08:58:08.71578085 +0000 UTC m=+0.242829458 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:12:45Z) Dec 6 03:58:08 localhost podman[77288]: 2025-12-06 08:58:08.765262258 +0000 UTC m=+0.292310856 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 03:58:08 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:58:08 localhost podman[77291]: 2025-12-06 08:58:08.829225833 +0000 UTC m=+0.355056423 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 03:58:08 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:58:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:58:10 localhost podman[77386]: 2025-12-06 08:58:10.54671577 +0000 UTC m=+0.084024505 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:58:10 localhost podman[77386]: 2025-12-06 08:58:10.899241893 +0000 UTC m=+0.436550568 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_migration_target) Dec 6 03:58:10 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:58:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:58:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:58:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:58:14 localhost systemd[1]: tmp-crun.T9wdai.mount: Deactivated successfully. Dec 6 03:58:14 localhost podman[77408]: 2025-12-06 08:58:14.567566651 +0000 UTC m=+0.099382700 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com) Dec 6 03:58:14 localhost podman[77408]: 2025-12-06 08:58:14.576783266 +0000 UTC m=+0.108599365 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, container_name=collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com) Dec 6 03:58:14 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: 
Deactivated successfully. Dec 6 03:58:14 localhost podman[77409]: 2025-12-06 08:58:14.657269021 +0000 UTC m=+0.185398265 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1) Dec 6 03:58:14 localhost podman[77409]: 2025-12-06 08:58:14.700049421 +0000 UTC m=+0.228178705 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:58:14 localhost podman[77409]: unhealthy Dec 6 03:58:14 localhost podman[77410]: 2025-12-06 08:58:14.710917087 +0000 UTC m=+0.236452831 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 03:58:14 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:58:14 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 03:58:14 localhost podman[77410]: 2025-12-06 08:58:14.727176739 +0000 UTC m=+0.252712533 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, url=https://www.redhat.com, container_name=ovn_controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git) Dec 6 03:58:14 localhost podman[77410]: unhealthy Dec 6 03:58:14 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:58:14 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 03:58:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:58:18 localhost systemd[1]: tmp-crun.IiO0Lg.mount: Deactivated successfully. 
Dec 6 03:58:18 localhost podman[77468]: 2025-12-06 08:58:18.544866299 +0000 UTC m=+0.079563048 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, 
vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12) Dec 6 03:58:18 localhost podman[77468]: 2025-12-06 08:58:18.556077585 +0000 UTC m=+0.090774334 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, container_name=iscsid, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team) Dec 6 03:58:18 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:58:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:58:21 localhost podman[77487]: 2025-12-06 08:58:21.534493562 +0000 UTC m=+0.070318682 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Dec 6 03:58:21 localhost podman[77487]: 2025-12-06 08:58:21.591297995 +0000 UTC m=+0.127123165 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Dec 6 03:58:21 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:58:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:58:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:58:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:58:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:58:39 localhost podman[77522]: 2025-12-06 08:58:39.535969717 +0000 UTC m=+0.060510299 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, architecture=x86_64) Dec 6 03:58:39 localhost podman[77514]: 2025-12-06 08:58:39.601363286 +0000 UTC m=+0.132848022 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git) Dec 6 03:58:39 localhost podman[77514]: 2025-12-06 08:58:39.608158875 +0000 UTC m=+0.139643601 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Dec 6 03:58:39 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:58:39 localhost podman[77516]: 2025-12-06 08:58:39.661943797 +0000 UTC m=+0.186375066 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:58:39 localhost podman[77515]: 2025-12-06 08:58:39.711928609 +0000 UTC m=+0.239256728 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 03:58:39 localhost podman[77522]: 2025-12-06 08:58:39.746433115 +0000 UTC m=+0.270973637 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 
(image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, release=1761123044, container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible) Dec 6 03:58:39 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:58:39 localhost podman[77515]: 2025-12-06 08:58:39.759437346 +0000 UTC m=+0.286765415 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi) Dec 6 03:58:39 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:58:39 localhost podman[77516]: 2025-12-06 08:58:39.768187817 +0000 UTC m=+0.292619066 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:58:39 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:58:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 03:58:41 localhost podman[77618]: 2025-12-06 08:58:41.537877514 +0000 UTC m=+0.073528390 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, container_name=nova_migration_target, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Dec 6 03:58:41 localhost podman[77618]: 2025-12-06 08:58:41.927795883 +0000 UTC m=+0.463446789 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z) Dec 6 03:58:41 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:58:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:58:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:58:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 03:58:45 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 03:58:45 localhost recover_tripleo_nova_virtqemud[77656]: 51836 Dec 6 03:58:45 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 03:58:45 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 03:58:45 localhost systemd[1]: tmp-crun.eEr6oA.mount: Deactivated successfully. Dec 6 03:58:45 localhost podman[77642]: 2025-12-06 08:58:45.578995352 +0000 UTC m=+0.103514207 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, container_name=collectd, vcs-type=git, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible) Dec 6 03:58:45 localhost podman[77642]: 2025-12-06 08:58:45.587356631 +0000 UTC m=+0.111875466 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, architecture=x86_64, name=rhosp17/openstack-collectd, container_name=collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:58:45 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:58:45 localhost podman[77644]: 2025-12-06 08:58:45.601289201 +0000 UTC m=+0.113025602 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, distribution-scope=public) Dec 6 03:58:45 localhost podman[77643]: 2025-12-06 08:58:45.666072321 +0000 UTC m=+0.187969925 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', 
'/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, architecture=x86_64, maintainer=OpenStack TripleO Team) Dec 6 03:58:45 localhost podman[77643]: 2025-12-06 08:58:45.67965017 +0000 UTC m=+0.201547804 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4) Dec 6 03:58:45 localhost podman[77643]: unhealthy Dec 6 03:58:45 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:58:45 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 03:58:45 localhost podman[77644]: 2025-12-06 08:58:45.693613921 +0000 UTC m=+0.205350342 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 6 03:58:45 localhost podman[77644]: unhealthy Dec 6 03:58:45 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:58:45 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 03:58:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 03:58:49 localhost podman[77703]: 2025-12-06 08:58:49.528806231 +0000 UTC m=+0.065917577 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc.) 
Dec 6 03:58:49 localhost podman[77703]: 2025-12-06 08:58:49.53818016 +0000 UTC m=+0.075291506 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, container_name=iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 6 03:58:49 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:58:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:58:52 localhost podman[77722]: 2025-12-06 08:58:52.521978143 +0000 UTC m=+0.056877847 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, managed_by=tripleo_ansible, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, 
batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:58:52 localhost podman[77722]: 2025-12-06 08:58:52.56820326 +0000 UTC m=+0.103102984 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-type=git, container_name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z) Dec 6 03:58:52 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:59:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:59:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. 
Dec 6 03:59:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:59:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 03:59:10 localhost systemd[1]: tmp-crun.ji57dt.mount: Deactivated successfully. Dec 6 03:59:10 localhost podman[77746]: 2025-12-06 08:59:10.577537588 +0000 UTC m=+0.097656255 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron) Dec 6 03:59:10 localhost podman[77746]: 2025-12-06 08:59:10.614241632 +0000 UTC m=+0.134360299 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1) Dec 6 03:59:10 localhost podman[77749]: 2025-12-06 08:59:10.628866224 +0000 UTC m=+0.140725526 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, config_id=tripleo_step1, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, version=17.1.12) Dec 6 03:59:10 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 03:59:10 localhost podman[77748]: 2025-12-06 08:59:10.677189155 +0000 UTC m=+0.189654897 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, 
tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible) Dec 6 03:59:10 localhost podman[77748]: 2025-12-06 08:59:10.69810365 +0000 UTC m=+0.210569412 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container) Dec 6 03:59:10 localhost podman[77747]: 2025-12-06 08:59:10.722483033 +0000 UTC m=+0.238471023 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12) Dec 6 03:59:10 localhost podman[77747]: 2025-12-06 08:59:10.748655722 +0000 UTC m=+0.264643702 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z) Dec 6 03:59:10 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 03:59:10 localhost podman[77749]: 2025-12-06 08:59:10.809416127 +0000 UTC m=+0.321275429 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, container_name=metrics_qdr, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, release=1761123044) Dec 6 03:59:10 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 03:59:10 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:59:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:59:12 localhost systemd[1]: tmp-crun.yV3r9I.mount: Deactivated successfully. 
Dec 6 03:59:12 localhost podman[77842]: 2025-12-06 08:59:12.545071375 +0000 UTC m=+0.080149716 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z) Dec 6 03:59:12 localhost podman[77842]: 2025-12-06 08:59:12.872447793 +0000 UTC m=+0.407526234 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:59:12 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:59:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:59:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 03:59:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:59:16 localhost systemd[1]: tmp-crun.mD9Lyf.mount: Deactivated successfully. 
Dec 6 03:59:16 localhost systemd[1]: tmp-crun.qRQFtn.mount: Deactivated successfully. Dec 6 03:59:16 localhost podman[77867]: 2025-12-06 08:59:16.553550055 +0000 UTC m=+0.083460378 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, release=1761123044, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, 
io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Dec 6 03:59:16 localhost podman[77866]: 2025-12-06 08:59:16.530980268 +0000 UTC m=+0.066555676 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., 
name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=) Dec 6 03:59:16 localhost podman[77865]: 2025-12-06 08:59:16.59322458 +0000 UTC m=+0.127646523 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true) Dec 6 03:59:16 localhost podman[77867]: 2025-12-06 08:59:16.596581233 +0000 UTC m=+0.126491556 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:59:16 localhost podman[77867]: unhealthy Dec 6 03:59:16 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:59:16 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 03:59:16 localhost podman[77866]: 2025-12-06 08:59:16.61105603 +0000 UTC m=+0.146631468 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:59:16 localhost podman[77866]: unhealthy Dec 6 03:59:16 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:59:16 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 03:59:16 localhost podman[77865]: 2025-12-06 08:59:16.631315605 +0000 UTC m=+0.165737598 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, architecture=x86_64, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, container_name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com) Dec 6 03:59:16 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 03:59:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:59:20 localhost podman[77925]: 2025-12-06 08:59:20.543569755 +0000 UTC m=+0.080013212 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:59:20 localhost podman[77925]: 2025-12-06 08:59:20.577355878 +0000 UTC m=+0.113799305 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, container_name=iscsid, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-iscsid) Dec 6 03:59:20 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:59:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 03:59:23 localhost podman[77945]: 2025-12-06 08:59:23.542015149 +0000 UTC m=+0.070753675 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step5, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, vcs-type=git, container_name=nova_compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 03:59:23 localhost podman[77945]: 2025-12-06 08:59:23.568794047 +0000 UTC m=+0.097532603 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO 
Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step5, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, build-date=2025-11-19T00:36:58Z) Dec 6 03:59:23 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 03:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 03:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 03:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 03:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 03:59:41 localhost podman[77973]: 2025-12-06 08:59:41.562310606 +0000 UTC m=+0.085953685 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com) Dec 6 03:59:41 localhost podman[77974]: 2025-12-06 08:59:41.61297639 +0000 UTC m=+0.135502724 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public) Dec 6 03:59:41 localhost podman[77973]: 2025-12-06 08:59:41.619329246 +0000 UTC m=+0.142972345 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, 
tcib_managed=true, version=17.1.12, container_name=ceilometer_agent_compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 03:59:41 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 03:59:41 localhost systemd[1]: tmp-crun.11OZXv.mount: Deactivated successfully. Dec 6 03:59:41 localhost podman[77971]: 2025-12-06 08:59:41.662986105 +0000 UTC m=+0.194243909 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1) Dec 6 03:59:41 localhost podman[77971]: 2025-12-06 08:59:41.676308046 +0000 UTC m=+0.207565890 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, release=1761123044, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4) Dec 6 03:59:41 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 03:59:41 localhost podman[77972]: 2025-12-06 08:59:41.753660414 +0000 UTC m=+0.281560464 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 03:59:41 localhost podman[77972]: 2025-12-06 08:59:41.80923515 +0000 UTC m=+0.337135190 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, architecture=x86_64, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 03:59:41 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 03:59:41 localhost podman[77974]: 2025-12-06 08:59:41.84033321 +0000 UTC m=+0.362859564 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', 
'/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.buildah.version=1.41.4, version=17.1.12) Dec 6 03:59:41 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 03:59:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 03:59:43 localhost systemd[1]: tmp-crun.JFCHkh.mount: Deactivated successfully. 
Dec 6 03:59:43 localhost podman[78070]: 2025-12-06 08:59:43.544186465 +0000 UTC m=+0.078763913 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, container_name=nova_migration_target, release=1761123044, io.buildah.version=1.41.4) Dec 6 03:59:43 localhost podman[78070]: 2025-12-06 08:59:43.907275586 +0000 UTC m=+0.441853064 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute) Dec 6 03:59:43 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 03:59:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 03:59:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. 
Dec 6 03:59:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 03:59:47 localhost podman[78094]: 2025-12-06 08:59:47.540131228 +0000 UTC m=+0.070840719 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4) Dec 6 03:59:47 localhost podman[78093]: 2025-12-06 08:59:47.602765912 +0000 UTC m=+0.136884458 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, 
tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 03:59:47 localhost podman[78093]: 2025-12-06 08:59:47.611400288 +0000 UTC m=+0.145518814 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container) Dec 6 03:59:47 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 03:59:47 localhost podman[78094]: 2025-12-06 08:59:47.631197479 +0000 UTC m=+0.161906990 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 03:59:47 localhost podman[78094]: unhealthy Dec 6 03:59:47 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:59:47 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 03:59:47 localhost podman[78095]: 2025-12-06 08:59:47.711351635 +0000 UTC m=+0.237598477 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 
17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, url=https://www.redhat.com) Dec 6 03:59:47 localhost podman[78095]: 2025-12-06 08:59:47.725295095 +0000 UTC m=+0.251541937 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, version=17.1.12, vcs-type=git) Dec 6 03:59:47 localhost podman[78095]: unhealthy Dec 6 03:59:47 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 03:59:47 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 03:59:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 03:59:51 localhost podman[78153]: 2025-12-06 08:59:51.552084295 +0000 UTC m=+0.087639917 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, version=17.1.12, container_name=iscsid, io.openshift.expose-services=, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public) Dec 6 03:59:51 localhost podman[78153]: 2025-12-06 08:59:51.564336583 +0000 UTC m=+0.099892205 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, tcib_managed=true, url=https://www.redhat.com, vcs-type=git) Dec 6 03:59:51 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 03:59:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 03:59:54 localhost systemd[1]: tmp-crun.GaKKWB.mount: Deactivated successfully. 
Dec 6 03:59:54 localhost podman[78172]: 2025-12-06 08:59:54.548121345 +0000 UTC m=+0.082953421 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 03:59:54 localhost podman[78172]: 2025-12-06 08:59:54.57709029 +0000 UTC m=+0.111922366 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step5, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vcs-type=git) Dec 6 03:59:54 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:00:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:00:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:00:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:00:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 04:00:12 localhost systemd[1]: tmp-crun.nC4oVa.mount: Deactivated successfully. Dec 6 04:00:12 localhost systemd[1]: tmp-crun.Ap3yUu.mount: Deactivated successfully. 
Dec 6 04:00:12 localhost podman[78205]: 2025-12-06 09:00:12.586993437 +0000 UTC m=+0.104422426 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible) Dec 6 04:00:12 localhost podman[78201]: 2025-12-06 09:00:12.549157618 +0000 UTC m=+0.073156180 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 6 04:00:12 localhost podman[78202]: 2025-12-06 09:00:12.569013671 +0000 UTC m=+0.086820932 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi) Dec 6 04:00:12 localhost podman[78201]: 2025-12-06 09:00:12.633078079 +0000 UTC m=+0.157076661 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container) Dec 6 04:00:12 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 04:00:12 localhost podman[78202]: 2025-12-06 09:00:12.651161538 +0000 UTC m=+0.168968789 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc.) Dec 6 04:00:12 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:00:12 localhost podman[78209]: 2025-12-06 09:00:12.70662935 +0000 UTC m=+0.220926863 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 04:00:12 localhost podman[78205]: 2025-12-06 09:00:12.737000407 +0000 UTC m=+0.254429416 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-type=git) Dec 6 04:00:12 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 04:00:12 localhost podman[78209]: 2025-12-06 09:00:12.882339235 +0000 UTC m=+0.396636798 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 
qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 6 04:00:12 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:00:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:00:14 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 04:00:14 localhost recover_tripleo_nova_virtqemud[78307]: 51836 Dec 6 04:00:14 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 04:00:14 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 6 04:00:14 localhost podman[78300]: 2025-12-06 09:00:14.551948583 +0000 UTC m=+0.072594542 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:00:14 localhost podman[78300]: 2025-12-06 09:00:14.943276695 +0000 UTC m=+0.463922724 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-19T00:36:58Z, release=1761123044, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true) Dec 6 04:00:14 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:00:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:00:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:00:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:00:18 localhost podman[78325]: 2025-12-06 09:00:18.548944259 +0000 UTC m=+0.079531678 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12) Dec 6 04:00:18 localhost podman[78325]: 2025-12-06 09:00:18.563146647 +0000 UTC m=+0.093734046 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, 
description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 04:00:18 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:00:18 localhost podman[78326]: 2025-12-06 09:00:18.616762462 +0000 UTC m=+0.145599466 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 6 04:00:18 localhost podman[78327]: 2025-12-06 09:00:18.566879612 +0000 UTC m=+0.091239868 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, vcs-type=git, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 04:00:18 localhost podman[78327]: 2025-12-06 09:00:18.6503626 +0000 UTC m=+0.174722886 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step4, 
tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 04:00:18 localhost podman[78327]: unhealthy Dec 6 04:00:18 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:00:18 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:00:18 localhost podman[78326]: 2025-12-06 09:00:18.699224848 +0000 UTC m=+0.228061832 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 6 04:00:18 localhost podman[78326]: unhealthy Dec 6 04:00:18 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:00:18 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with 
result 'exit-code'. Dec 6 04:00:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:00:22 localhost podman[78382]: 2025-12-06 09:00:22.544678454 +0000 UTC m=+0.072977574 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3) Dec 6 04:00:22 localhost podman[78382]: 2025-12-06 09:00:22.582213804 +0000 UTC m=+0.110512894 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3) Dec 6 04:00:22 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:00:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:00:25 localhost systemd[1]: tmp-crun.AyyKI6.mount: Deactivated successfully. 
Dec 6 04:00:25 localhost podman[78399]: 2025-12-06 09:00:25.554421509 +0000 UTC m=+0.089320069 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step5, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Dec 6 04:00:25 localhost podman[78399]: 2025-12-06 09:00:25.583125905 +0000 UTC m=+0.118024475 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1) Dec 6 04:00:25 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:00:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:00:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:00:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:00:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:00:43 localhost podman[78427]: 2025-12-06 09:00:43.578381418 +0000 UTC m=+0.099098770 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, version=17.1.12, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:00:43 localhost podman[78427]: 2025-12-06 09:00:43.602515844 +0000 UTC m=+0.123233196 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:00:43 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 04:00:43 localhost podman[78428]: 2025-12-06 09:00:43.641684813 +0000 UTC m=+0.163009284 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=metrics_qdr, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, batch=17.1_20251118.1) Dec 6 04:00:43 localhost systemd[1]: tmp-crun.u2xoTD.mount: Deactivated successfully. 
Dec 6 04:00:43 localhost podman[78426]: 2025-12-06 09:00:43.69079052 +0000 UTC m=+0.218279401 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, version=17.1.12) Dec 6 04:00:43 localhost podman[78425]: 2025-12-06 09:00:43.74781911 +0000 UTC m=+0.277415156 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:00:43 localhost podman[78426]: 2025-12-06 09:00:43.773967837 +0000 UTC m=+0.301456738 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container) Dec 6 04:00:43 localhost podman[78425]: 2025-12-06 09:00:43.783241104 +0000 UTC m=+0.312837150 container 
exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, distribution-scope=public) Dec 6 04:00:43 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 04:00:43 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 04:00:43 localhost podman[78428]: 2025-12-06 09:00:43.819178403 +0000 UTC m=+0.340502884 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=) Dec 6 04:00:43 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:00:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:00:45 localhost systemd[1]: tmp-crun.HaNPxB.mount: Deactivated successfully. 
Dec 6 04:00:45 localhost podman[78530]: 2025-12-06 09:00:45.55613392 +0000 UTC m=+0.090893897 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 04:00:45 localhost podman[78530]: 2025-12-06 09:00:45.918271562 +0000 UTC m=+0.453031579 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:00:45 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:00:47 localhost systemd[1]: session-15.scope: Deactivated successfully. Dec 6 04:00:47 localhost systemd[1]: session-15.scope: Consumed 8min 1.170s CPU time. Dec 6 04:00:47 localhost systemd-logind[760]: Session 15 logged out. Waiting for processes to exit. Dec 6 04:00:47 localhost systemd-logind[760]: Removed session 15. Dec 6 04:00:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 04:00:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:00:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:00:49 localhost podman[78555]: 2025-12-06 09:00:49.552375242 +0000 UTC m=+0.082735685 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 6 04:00:49 localhost podman[78555]: 2025-12-06 09:00:49.591270333 +0000 UTC m=+0.121630766 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, 
container_name=ovn_metadata_agent, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:00:49 localhost podman[78555]: unhealthy Dec 6 04:00:49 localhost podman[78556]: 2025-12-06 09:00:49.606082671 +0000 UTC m=+0.133793762 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, distribution-scope=public, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible) Dec 6 04:00:49 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:00:49 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:00:49 localhost podman[78556]: 2025-12-06 09:00:49.648377936 +0000 UTC m=+0.176088997 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T23:34:05Z, release=1761123044) Dec 6 04:00:49 localhost podman[78556]: unhealthy Dec 6 04:00:49 localhost podman[78554]: 2025-12-06 09:00:49.660091918 +0000 UTC m=+0.193131814 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team) Dec 6 04:00:49 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:00:49 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 04:00:49 localhost podman[78554]: 2025-12-06 09:00:49.675211785 +0000 UTC m=+0.208251731 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64) Dec 6 04:00:49 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 04:00:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 04:00:53 localhost podman[78611]: 2025-12-06 09:00:53.530425712 +0000 UTC m=+0.070101605 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, 
build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:00:53 localhost podman[78611]: 2025-12-06 09:00:53.545292591 +0000 UTC m=+0.084968554 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, 
Inc., distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12) Dec 6 04:00:53 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:00:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:00:56 localhost systemd[1]: tmp-crun.vFrk1m.mount: Deactivated successfully. 
Dec 6 04:00:56 localhost podman[78629]: 2025-12-06 09:00:56.547040089 +0000 UTC m=+0.082319073 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step5, release=1761123044, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4) Dec 6 04:00:56 localhost podman[78629]: 2025-12-06 09:00:56.577266753 +0000 UTC m=+0.112545777 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 
'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, container_name=nova_compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step5, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Dec 6 04:00:56 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:00:57 localhost systemd[1]: Stopping User Manager for UID 1002... Dec 6 04:00:57 localhost systemd[26313]: Activating special unit Exit the Session... Dec 6 04:00:57 localhost systemd[26313]: Removed slice User Background Tasks Slice. Dec 6 04:00:57 localhost systemd[26313]: Stopped target Main User Target. Dec 6 04:00:57 localhost systemd[26313]: Stopped target Basic System. Dec 6 04:00:57 localhost systemd[26313]: Stopped target Paths. Dec 6 04:00:57 localhost systemd[26313]: Stopped target Sockets. Dec 6 04:00:57 localhost systemd[26313]: Stopped target Timers. Dec 6 04:00:57 localhost systemd[26313]: Stopped Mark boot as successful after the user session has run 2 minutes. Dec 6 04:00:57 localhost systemd[26313]: Stopped Daily Cleanup of User's Temporary Directories. Dec 6 04:00:57 localhost systemd[26313]: Closed D-Bus User Message Bus Socket. Dec 6 04:00:57 localhost systemd[26313]: Stopped Create User's Volatile Files and Directories. Dec 6 04:00:57 localhost systemd[26313]: Removed slice User Application Slice. Dec 6 04:00:57 localhost systemd[26313]: Reached target Shutdown. Dec 6 04:00:57 localhost systemd[26313]: Finished Exit the Session. 
Dec 6 04:00:57 localhost systemd[26313]: Reached target Exit the Session. Dec 6 04:00:57 localhost systemd[1]: user@1002.service: Deactivated successfully. Dec 6 04:00:57 localhost systemd[1]: Stopped User Manager for UID 1002. Dec 6 04:00:57 localhost systemd[1]: user@1002.service: Consumed 4.277s CPU time, read 0B from disk, written 7.0K to disk. Dec 6 04:00:57 localhost systemd[1]: Stopping User Runtime Directory /run/user/1002... Dec 6 04:00:57 localhost systemd[1]: run-user-1002.mount: Deactivated successfully. Dec 6 04:00:57 localhost systemd[1]: user-runtime-dir@1002.service: Deactivated successfully. Dec 6 04:00:57 localhost systemd[1]: Stopped User Runtime Directory /run/user/1002. Dec 6 04:00:57 localhost systemd[1]: Removed slice User Slice of UID 1002. Dec 6 04:00:57 localhost systemd[1]: user-1002.slice: Consumed 8min 5.479s CPU time. Dec 6 04:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:01:14 localhost podman[78684]: 2025-12-06 09:01:14.565704805 +0000 UTC m=+0.089409852 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, io.openshift.expose-services=, container_name=logrotate_crond, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z) Dec 6 04:01:14 localhost systemd[1]: tmp-crun.K8ihMq.mount: Deactivated successfully. Dec 6 04:01:14 localhost podman[78684]: 2025-12-06 09:01:14.602383387 +0000 UTC m=+0.126088394 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, tcib_managed=true) Dec 6 04:01:14 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:01:14 localhost podman[78685]: 2025-12-06 09:01:14.649939036 +0000 UTC m=+0.174860781 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 04:01:14 localhost podman[78687]: 2025-12-06 09:01:14.655444966 +0000 UTC m=+0.170406393 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12) Dec 6 04:01:14 localhost podman[78685]: 2025-12-06 09:01:14.675086762 +0000 UTC m=+0.200008477 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container) Dec 6 04:01:14 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 04:01:14 localhost podman[78686]: 2025-12-06 09:01:14.608488106 +0000 UTC m=+0.129955413 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team) Dec 6 04:01:14 localhost podman[78686]: 2025-12-06 09:01:14.743209615 +0000 UTC m=+0.264676882 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible) Dec 6 04:01:14 localhost systemd[1]: 
b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:01:14 localhost podman[78687]: 2025-12-06 09:01:14.894444844 +0000 UTC m=+0.409406301 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible) Dec 6 04:01:14 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:01:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 04:01:16 localhost podman[78783]: 2025-12-06 09:01:16.55812616 +0000 UTC m=+0.088645249 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:01:16 localhost podman[78783]: 2025-12-06 09:01:16.92543208 +0000 UTC m=+0.455951199 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4) Dec 6 04:01:16 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:01:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:01:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:01:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:01:20 localhost systemd[1]: tmp-crun.khVTe6.mount: Deactivated successfully. Dec 6 04:01:20 localhost podman[78809]: 2025-12-06 09:01:20.550147711 +0000 UTC m=+0.081703203 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller) Dec 6 04:01:20 localhost podman[78809]: 2025-12-06 09:01:20.59348455 +0000 UTC m=+0.125040072 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, 
url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=) Dec 6 04:01:20 localhost podman[78809]: unhealthy Dec 6 04:01:20 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:01:20 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 04:01:20 localhost podman[78808]: 2025-12-06 09:01:20.641550194 +0000 UTC m=+0.172926131 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., 
name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:01:20 localhost podman[78808]: 2025-12-06 09:01:20.685423987 +0000 UTC m=+0.216799954 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_id=tripleo_step4, 
maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 6 04:01:20 localhost podman[78808]: unhealthy Dec 6 04:01:20 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:01:20 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:01:20 localhost podman[78807]: 2025-12-06 09:01:20.593227481 +0000 UTC m=+0.126911889 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, container_name=collectd, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) 
Dec 6 04:01:20 localhost podman[78807]: 2025-12-06 09:01:20.729356754 +0000 UTC m=+0.263041222 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, 
tcib_managed=true, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 6 04:01:20 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 04:01:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 04:01:24 localhost podman[78867]: 2025-12-06 09:01:24.548010213 +0000 UTC m=+0.084035556 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044) Dec 6 04:01:24 localhost podman[78867]: 2025-12-06 09:01:24.554777552 +0000 UTC m=+0.090802935 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com) Dec 6 04:01:24 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:01:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:01:27 localhost systemd[1]: tmp-crun.qivgnv.mount: Deactivated successfully. 
Dec 6 04:01:27 localhost podman[78886]: 2025-12-06 09:01:27.564227947 +0000 UTC m=+0.092530108 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Dec 6 04:01:27 localhost podman[78886]: 2025-12-06 09:01:27.586065612 +0000 UTC m=+0.114367853 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, 
tcib_managed=true, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack 
Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc.) Dec 6 04:01:27 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 04:01:45 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 04:01:45 localhost recover_tripleo_nova_virtqemud[78938]: 51836 Dec 6 04:01:45 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 04:01:45 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 04:01:45 localhost systemd[1]: tmp-crun.pG10uM.mount: Deactivated successfully. 
Dec 6 04:01:45 localhost podman[78912]: 2025-12-06 09:01:45.577408755 +0000 UTC m=+0.114144876 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, container_name=logrotate_crond, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 6 04:01:45 localhost podman[78920]: 2025-12-06 09:01:45.623674602 +0000 UTC m=+0.149058532 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 04:01:45 localhost podman[78913]: 2025-12-06 09:01:45.67896725 +0000 UTC m=+0.209385296 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, 
container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1) Dec 6 04:01:45 localhost podman[78912]: 2025-12-06 09:01:45.689044721 +0000 UTC m=+0.225780872 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron) Dec 6 04:01:45 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 04:01:45 localhost podman[78913]: 2025-12-06 09:01:45.732215864 +0000 UTC m=+0.262633920 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12) Dec 6 04:01:45 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:01:45 localhost podman[78914]: 2025-12-06 09:01:45.772602651 +0000 UTC m=+0.299392104 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible) Dec 6 04:01:45 localhost podman[78914]: 2025-12-06 09:01:45.802304078 +0000 UTC m=+0.329093621 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc.) Dec 6 04:01:45 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 04:01:45 localhost podman[78920]: 2025-12-06 09:01:45.822212502 +0000 UTC m=+0.347596352 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vcs-type=git, version=17.1.12, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:01:45 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:01:46 localhost systemd[1]: tmp-crun.wnYHBE.mount: Deactivated successfully. Dec 6 04:01:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 04:01:47 localhost podman[79014]: 2025-12-06 09:01:47.546637704 +0000 UTC m=+0.079065142 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, config_id=tripleo_step4, release=1761123044, build-date=2025-11-19T00:36:58Z) Dec 6 04:01:47 localhost podman[79014]: 2025-12-06 09:01:47.936439168 +0000 UTC m=+0.468866526 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_id=tripleo_step4) Dec 6 04:01:47 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:01:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:01:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:01:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:01:51 localhost podman[79039]: 2025-12-06 09:01:51.539925513 +0000 UTC m=+0.065616206 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', 
'/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:01:51 localhost podman[79039]: 2025-12-06 09:01:51.584325844 +0000 UTC m=+0.110016557 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', 
'/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, container_name=ovn_controller) Dec 6 04:01:51 localhost podman[79039]: unhealthy Dec 6 04:01:51 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:01:51 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 04:01:51 localhost podman[79038]: 2025-12-06 09:01:51.652302823 +0000 UTC m=+0.179002247 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z) Dec 6 04:01:51 localhost podman[79038]: 2025-12-06 09:01:51.692897907 +0000 UTC m=+0.219597321 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, url=https://www.redhat.com, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', 
'/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, vcs-type=git) Dec 6 04:01:51 localhost podman[79038]: unhealthy Dec 6 04:01:51 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:01:51 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:01:51 localhost podman[79037]: 2025-12-06 09:01:51.711007255 +0000 UTC m=+0.241228638 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:01:51 localhost podman[79037]: 2025-12-06 09:01:51.723194232 +0000 UTC m=+0.253415605 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, 
io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd) Dec 6 04:01:51 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 04:01:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:01:55 localhost systemd[1]: tmp-crun.5QkdvN.mount: Deactivated successfully. 
Dec 6 04:01:55 localhost podman[79097]: 2025-12-06 09:01:55.539686104 +0000 UTC m=+0.073999645 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, container_name=iscsid, version=17.1.12, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 04:01:55 localhost podman[79097]: 2025-12-06 09:01:55.549288581 +0000 UTC m=+0.083602152 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, container_name=iscsid, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 6 04:01:55 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:01:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 04:01:58 localhost podman[79114]: 2025-12-06 09:01:58.545411522 +0000 UTC m=+0.080683981 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 04:01:58 localhost podman[79114]: 2025-12-06 09:01:58.575298554 +0000 UTC m=+0.110571033 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step5, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-nova-compute, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=) Dec 6 04:01:58 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:02:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:02:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:02:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:02:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 04:02:16 localhost systemd[1]: tmp-crun.BZhlaf.mount: Deactivated successfully. 
Dec 6 04:02:16 localhost podman[79144]: 2025-12-06 09:02:16.569895747 +0000 UTC m=+0.090173233 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 04:02:16 localhost podman[79150]: 2025-12-06 09:02:16.62700229 +0000 UTC m=+0.142306323 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, release=1761123044, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com) Dec 6 04:02:16 localhost podman[79144]: 2025-12-06 09:02:16.654234039 +0000 UTC m=+0.174511535 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 6 04:02:16 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:02:16 localhost podman[79142]: 2025-12-06 09:02:16.578873674 +0000 UTC m=+0.107249330 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4) Dec 6 04:02:16 localhost podman[79143]: 2025-12-06 09:02:16.605222527 +0000 UTC m=+0.129918820 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Dec 6 04:02:16 localhost podman[79142]: 2025-12-06 
09:02:16.711222599 +0000 UTC m=+0.239598275 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git) Dec 6 04:02:16 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 04:02:16 localhost podman[79143]: 2025-12-06 09:02:16.733686902 +0000 UTC m=+0.258383105 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, version=17.1.12, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi) Dec 6 04:02:16 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:02:16 localhost podman[79150]: 2025-12-06 09:02:16.843262993 +0000 UTC m=+0.358567006 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044) Dec 6 04:02:16 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:02:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 04:02:18 localhost podman[79244]: 2025-12-06 09:02:18.55688026 +0000 UTC m=+0.088437199 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4) Dec 6 04:02:18 localhost podman[79244]: 2025-12-06 09:02:18.939404035 +0000 UTC m=+0.470961054 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Dec 6 04:02:18 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:02:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:02:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:02:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:02:22 localhost podman[79272]: 2025-12-06 09:02:22.541565399 +0000 UTC m=+0.069576608 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:02:22 localhost podman[79271]: 2025-12-06 09:02:22.600616611 +0000 UTC m=+0.131116267 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64) Dec 6 04:02:22 localhost podman[79271]: 2025-12-06 09:02:22.642181534 +0000 UTC m=+0.172681180 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1) Dec 6 04:02:22 localhost podman[79271]: unhealthy Dec 6 04:02:22 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:02:22 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:02:22 localhost podman[79270]: 2025-12-06 09:02:22.658142656 +0000 UTC m=+0.192030406 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd) Dec 6 04:02:22 localhost podman[79270]: 2025-12-06 09:02:22.665364809 +0000 UTC m=+0.199252499 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack 
Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com) Dec 6 04:02:22 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:02:22 localhost podman[79272]: 2025-12-06 09:02:22.682542669 +0000 UTC m=+0.210553918 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true) Dec 6 04:02:22 localhost podman[79272]: unhealthy Dec 6 04:02:22 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:02:22 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:02:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:02:26 localhost podman[79329]: 2025-12-06 09:02:26.539127705 +0000 UTC m=+0.074562042 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, tcib_managed=true, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, url=https://www.redhat.com) Dec 6 04:02:26 localhost podman[79329]: 2025-12-06 09:02:26.577303603 +0000 UTC m=+0.112737910 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1) Dec 6 04:02:26 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:02:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:02:29 localhost podman[79349]: 2025-12-06 09:02:29.533045932 +0000 UTC m=+0.069842496 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 04:02:29 localhost podman[79349]: 2025-12-06 09:02:29.558058333 +0000 UTC m=+0.094854837 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_compute, maintainer=OpenStack TripleO Team) Dec 6 04:02:29 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:02:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:02:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:02:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:02:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:02:47 localhost podman[79376]: 2025-12-06 09:02:47.556783342 +0000 UTC m=+0.082500606 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.openshift.expose-services=) Dec 6 04:02:47 localhost podman[79376]: 2025-12-06 09:02:47.570196126 +0000 UTC m=+0.095913430 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, version=17.1.12, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, build-date=2025-11-18T22:49:32Z) Dec 6 04:02:47 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 04:02:47 localhost systemd[1]: tmp-crun.WJ7AhH.mount: Deactivated successfully. 
Dec 6 04:02:47 localhost podman[79377]: 2025-12-06 09:02:47.660867405 +0000 UTC m=+0.182922297 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1) Dec 6 04:02:47 localhost podman[79378]: 2025-12-06 09:02:47.628912758 +0000 UTC m=+0.147477081 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:02:47 localhost podman[79377]: 2025-12-06 09:02:47.689822668 +0000 UTC m=+0.211877560 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, container_name=ceilometer_agent_ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4) Dec 6 04:02:47 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 04:02:47 localhost podman[79378]: 2025-12-06 09:02:47.712351903 +0000 UTC m=+0.230916226 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 04:02:47 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 04:02:47 localhost podman[79380]: 2025-12-06 09:02:47.768995851 +0000 UTC m=+0.284462059 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-qdrouterd, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 04:02:47 localhost podman[79380]: 2025-12-06 09:02:47.971072946 +0000 UTC m=+0.486539204 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 04:02:47 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 04:02:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:02:49 localhost podman[79475]: 2025-12-06 09:02:49.588121656 +0000 UTC m=+0.083552079 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.buildah.version=1.41.4) Dec 6 04:02:49 localhost podman[79475]: 2025-12-06 09:02:49.953949204 +0000 UTC m=+0.449379567 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, tcib_managed=true) Dec 6 04:02:49 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:02:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:02:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. 
Dec 6 04:02:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:02:53 localhost systemd[1]: tmp-crun.CENKiS.mount: Deactivated successfully. Dec 6 04:02:53 localhost podman[79500]: 2025-12-06 09:02:53.566281343 +0000 UTC m=+0.089019438 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, distribution-scope=public, vcs-type=git, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 6 04:02:53 localhost podman[79500]: 2025-12-06 09:02:53.581352178 +0000 UTC m=+0.104090273 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, tcib_managed=true, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}) Dec 6 04:02:53 localhost podman[79500]: unhealthy Dec 6 04:02:53 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:02:53 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:02:53 localhost podman[79499]: 2025-12-06 09:02:53.669643331 +0000 UTC m=+0.196064541 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 04:02:53 localhost podman[79499]: 2025-12-06 09:02:53.676544115 +0000 UTC m=+0.202965395 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, config_id=tripleo_step3, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:02:53 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:02:53 localhost podman[79501]: 2025-12-06 09:02:53.716136497 +0000 UTC m=+0.235073035 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T23:34:05Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 6 04:02:53 localhost podman[79501]: 2025-12-06 09:02:53.726770975 +0000 UTC m=+0.245707533 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 6 04:02:53 localhost podman[79501]: unhealthy
Dec 6 04:02:53 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE
Dec 6 04:02:53 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'.
Dec 6 04:02:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.
Dec 6 04:02:57 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 6 04:02:57 localhost recover_tripleo_nova_virtqemud[79560]: 51836
Dec 6 04:02:57 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 6 04:02:57 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 6 04:02:57 localhost podman[79558]: 2025-12-06 09:02:57.546168402 +0000 UTC m=+0.081017621 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12) Dec 6 04:02:57 localhost podman[79558]: 2025-12-06 09:02:57.584211667 +0000 UTC m=+0.119060846 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public) Dec 6 04:02:57 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:03:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 04:03:00 localhost podman[79579]: 2025-12-06 09:03:00.546098513 +0000 UTC m=+0.081278749 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step5, vcs-type=git, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 6 04:03:00 localhost podman[79579]: 2025-12-06 09:03:00.601250605 +0000 UTC m=+0.136430801 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.expose-services=, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, container_name=nova_compute)
Dec 6 04:03:00 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully.
Dec 6 04:03:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.
Dec 6 04:03:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.
Dec 6 04:03:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.
Dec 6 04:03:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.
Dec 6 04:03:18 localhost podman[79609]: 2025-12-06 09:03:18.574078796 +0000 UTC m=+0.086647454 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container) Dec 6 04:03:18 localhost podman[79606]: 2025-12-06 09:03:18.620443787 +0000 UTC m=+0.141523148 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Dec 6 04:03:18 localhost systemd[1]: tmp-crun.nRdNSX.mount: Deactivated successfully. 
Dec 6 04:03:18 localhost podman[79607]: 2025-12-06 09:03:18.682748689 +0000 UTC m=+0.200427865 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team) Dec 6 04:03:18 localhost podman[79608]: 2025-12-06 09:03:18.730158752 +0000 UTC m=+0.245073743 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:03:18 localhost podman[79607]: 2025-12-06 09:03:18.753314196 +0000 UTC m=+0.270993372 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, 
release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:03:18 localhost podman[79608]: 2025-12-06 09:03:18.756351201 +0000 UTC m=+0.271266192 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, distribution-scope=public, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git) Dec 6 04:03:18 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 04:03:18 localhost podman[79609]: 2025-12-06 09:03:18.77025347 +0000 UTC m=+0.282822178 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=metrics_qdr, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', 
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 6 04:03:18 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:03:18 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:03:18 localhost podman[79606]: 2025-12-06 09:03:18.859493033 +0000 UTC m=+0.380572494 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, 
summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 6 04:03:18 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 04:03:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:03:20 localhost systemd[1]: tmp-crun.NTyTG7.mount: Deactivated successfully. Dec 6 04:03:20 localhost podman[79705]: 2025-12-06 09:03:20.554426145 +0000 UTC m=+0.085539940 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:36:58Z, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 6 04:03:20 localhost podman[79705]: 2025-12-06 09:03:20.963287751 +0000 UTC m=+0.494401596 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat 
OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, container_name=nova_migration_target, version=17.1.12) Dec 6 04:03:20 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:03:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:03:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:03:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:03:24 localhost systemd[1]: tmp-crun.W2R9id.mount: Deactivated successfully. Dec 6 04:03:24 localhost podman[79732]: 2025-12-06 09:03:24.545177191 +0000 UTC m=+0.074441038 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, distribution-scope=public, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, version=17.1.12, name=rhosp17/openstack-ovn-controller, vcs-type=git, architecture=x86_64) Dec 6 04:03:24 localhost podman[79730]: 2025-12-06 09:03:24.607246626 +0000 UTC m=+0.142280281 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, 
name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible) Dec 6 04:03:24 localhost podman[79730]: 2025-12-06 09:03:24.619315679 +0000 UTC m=+0.154349374 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=collectd, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 
'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible) Dec 6 04:03:24 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:03:24 localhost podman[79732]: 2025-12-06 09:03:24.638493471 +0000 UTC m=+0.167757308 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64) Dec 6 04:03:24 localhost podman[79732]: unhealthy Dec 6 04:03:24 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:03:24 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:03:24 localhost podman[79731]: 2025-12-06 09:03:24.747612558 +0000 UTC m=+0.278220106 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', 
'/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}) Dec 6 04:03:24 localhost podman[79731]: 2025-12-06 09:03:24.765371336 +0000 UTC m=+0.295978884 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, tcib_managed=true, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Dec 6 04:03:24 localhost podman[79731]: unhealthy Dec 6 04:03:24 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:03:24 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:03:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 04:03:28 localhost podman[79790]: 2025-12-06 09:03:28.545076759 +0000 UTC m=+0.079360890 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid) Dec 6 04:03:28 localhost podman[79790]: 2025-12-06 09:03:28.560193996 +0000 UTC m=+0.094478137 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=) Dec 6 04:03:28 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:03:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 04:03:32 localhost podman[79809]: 2025-12-06 09:03:32.447822698 +0000 UTC m=+0.983235941 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 04:03:32 localhost podman[79809]: 2025-12-06 09:03:32.474204852 +0000 UTC m=+1.009618065 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:03:32 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:03:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:03:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:03:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:03:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 04:03:49 localhost systemd[1]: tmp-crun.JC4rUH.mount: Deactivated successfully. 
Dec 6 04:03:49 localhost podman[79844]: 2025-12-06 09:03:49.568105841 +0000 UTC m=+0.088772440 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, release=1761123044, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 04:03:49 localhost podman[79836]: 2025-12-06 09:03:49.541906963 +0000 UTC m=+0.072343823 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red 
Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, container_name=ceilometer_agent_ipmi, tcib_managed=true, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com) Dec 6 04:03:49 localhost podman[79835]: 2025-12-06 09:03:49.606751174 +0000 UTC m=+0.139416313 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack 
TripleO Team, vcs-type=git, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 04:03:49 localhost podman[79835]: 2025-12-06 09:03:49.614676108 +0000 UTC m=+0.147341197 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, tcib_managed=true) Dec 6 04:03:49 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 04:03:49 localhost podman[79837]: 2025-12-06 09:03:49.654613051 +0000 UTC m=+0.179101568 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:03:49 localhost podman[79836]: 2025-12-06 09:03:49.674871045 +0000 UTC m=+0.205307915 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 04:03:49 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 04:03:49 localhost podman[79837]: 2025-12-06 09:03:49.729980916 +0000 UTC m=+0.254469443 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vcs-type=git) Dec 6 04:03:49 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 04:03:49 localhost podman[79844]: 2025-12-06 09:03:49.748002572 +0000 UTC m=+0.268669161 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, tcib_managed=true, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 6 04:03:49 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:03:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 04:03:51 localhost podman[79937]: 2025-12-06 09:03:51.543423595 +0000 UTC m=+0.077360237 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_migration_target, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Dec 6 04:03:51 localhost podman[79937]: 2025-12-06 09:03:51.912733412 +0000 UTC m=+0.446670004 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc.) Dec 6 04:03:51 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:03:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:03:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:03:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:03:55 localhost podman[79960]: 2025-12-06 09:03:55.563736964 +0000 UTC m=+0.093406644 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12) Dec 6 04:03:55 localhost podman[79960]: 2025-12-06 09:03:55.596497275 +0000 UTC m=+0.126166884 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, distribution-scope=public, container_name=collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-collectd-container, release=1761123044, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true) Dec 6 04:03:55 localhost podman[79961]: 2025-12-06 09:03:55.604970897 +0000 UTC m=+0.132516940 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Dec 6 04:03:55 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:03:55 localhost podman[79961]: 2025-12-06 09:03:55.621560579 +0000 UTC m=+0.149106592 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64) Dec 6 04:03:55 localhost podman[79961]: unhealthy Dec 6 04:03:55 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:03:55 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:03:55 localhost podman[79962]: 2025-12-06 09:03:55.67215805 +0000 UTC m=+0.194320167 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=ovn_controller, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git) Dec 6 04:03:55 localhost podman[79962]: 2025-12-06 09:03:55.692145217 +0000 UTC m=+0.214307304 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', 
'/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64) Dec 6 04:03:55 localhost podman[79962]: unhealthy Dec 6 04:03:55 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:03:55 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:03:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:03:59 localhost systemd[1]: tmp-crun.frMmLk.mount: Deactivated successfully. 
Dec 6 04:03:59 localhost podman[80018]: 2025-12-06 09:03:59.546884005 +0000 UTC m=+0.082263509 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git) Dec 6 04:03:59 localhost podman[80018]: 2025-12-06 09:03:59.58430985 +0000 UTC m=+0.119689384 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, architecture=x86_64, container_name=iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:03:59 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:04:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:04:03 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 04:04:03 localhost recover_tripleo_nova_virtqemud[80040]: 51836 Dec 6 04:04:03 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 04:04:03 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 6 04:04:03 localhost podman[80038]: 2025-12-06 09:04:03.563081235 +0000 UTC m=+0.089546364 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, distribution-scope=public, io.openshift.expose-services=) Dec 6 04:04:03 localhost podman[80038]: 2025-12-06 09:04:03.593606927 +0000 UTC m=+0.120072006 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 04:04:03 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:04:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:04:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:04:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:04:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:04:20 localhost podman[80066]: 2025-12-06 09:04:20.57806072 +0000 UTC m=+0.109545252 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Dec 6 04:04:20 localhost systemd[1]: tmp-crun.DyhrOz.mount: Deactivated successfully. Dec 6 04:04:20 localhost podman[80068]: 2025-12-06 09:04:20.627122543 +0000 UTC m=+0.151074473 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:04:20 localhost podman[80067]: 2025-12-06 09:04:20.67206143 +0000 UTC m=+0.196051281 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO 
Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.) Dec 6 04:04:20 localhost podman[80074]: 2025-12-06 09:04:20.722269509 +0000 UTC m=+0.241666208 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 6 04:04:20 localhost podman[80066]: 2025-12-06 09:04:20.741700039 +0000 UTC m=+0.273184601 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step4) Dec 6 04:04:20 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:04:20 localhost podman[80068]: 2025-12-06 09:04:20.775976966 +0000 UTC m=+0.299928896 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1761123044, 
vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 04:04:20 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:04:20 localhost podman[80067]: 2025-12-06 09:04:20.798262484 +0000 UTC m=+0.322252365 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public) Dec 6 04:04:20 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:04:20 localhost podman[80074]: 2025-12-06 09:04:20.993024904 +0000 UTC m=+0.512421563 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, architecture=x86_64) Dec 6 04:04:21 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:04:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 04:04:22 localhost podman[80169]: 2025-12-06 09:04:22.563273758 +0000 UTC m=+0.088383168 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, version=17.1.12) Dec 6 04:04:22 localhost podman[80169]: 2025-12-06 09:04:22.926171397 +0000 UTC m=+0.451280797 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true) Dec 6 04:04:22 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:04:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:04:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:04:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:04:26 localhost systemd[1]: tmp-crun.XEw9Qs.mount: Deactivated successfully. Dec 6 04:04:26 localhost podman[80193]: 2025-12-06 09:04:26.567347005 +0000 UTC m=+0.100389069 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, 
com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 04:04:26 localhost podman[80193]: 2025-12-06 09:04:26.609755494 +0000 UTC m=+0.142797538 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 04:04:26 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:04:26 localhost podman[80195]: 2025-12-06 09:04:26.612174668 +0000 UTC m=+0.140724364 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, batch=17.1_20251118.1, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Dec 6 04:04:26 localhost podman[80195]: 2025-12-06 09:04:26.698305415 +0000 UTC m=+0.226855131 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4) Dec 6 04:04:26 localhost podman[80195]: unhealthy Dec 6 04:04:26 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:04:26 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 04:04:26 localhost podman[80194]: 2025-12-06 09:04:26.665444842 +0000 UTC m=+0.195479513 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc.) 
Dec 6 04:04:26 localhost podman[80194]: 2025-12-06 09:04:26.745732239 +0000 UTC m=+0.275766870 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', 
'/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent) Dec 6 04:04:26 localhost podman[80194]: unhealthy Dec 6 04:04:26 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:04:26 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:04:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 04:04:30 localhost podman[80247]: 2025-12-06 09:04:30.53539195 +0000 UTC m=+0.072413076 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, release=1761123044, container_name=iscsid, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 04:04:30 localhost podman[80247]: 2025-12-06 09:04:30.547247836 +0000 UTC m=+0.084268952 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, container_name=iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:04:30 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:04:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 04:04:34 localhost podman[80267]: 2025-12-06 09:04:34.539708844 +0000 UTC m=+0.077422610 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public) Dec 6 04:04:34 localhost podman[80267]: 2025-12-06 09:04:34.593451743 +0000 UTC m=+0.131165569 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=) Dec 6 04:04:34 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:04:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:04:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:04:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:04:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:04:51 localhost podman[80294]: 2025-12-06 09:04:51.569287906 +0000 UTC m=+0.091341660 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, release=1761123044, vcs-type=git, distribution-scope=public, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) Dec 6 04:04:51 localhost podman[80294]: 2025-12-06 09:04:51.59925654 +0000 UTC m=+0.121310264 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 04:04:51 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:04:51 localhost podman[80293]: 2025-12-06 09:04:51.674591244 +0000 UTC m=+0.203872352 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:04:51 localhost podman[80293]: 2025-12-06 09:04:51.682324793 +0000 UTC m=+0.211605931 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z) Dec 6 04:04:51 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:04:51 localhost podman[80295]: 2025-12-06 09:04:51.736424822 +0000 UTC m=+0.255950968 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Dec 6 04:04:51 localhost podman[80299]: 2025-12-06 09:04:51.779333306 +0000 UTC m=+0.295841549 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:04:51 localhost podman[80295]: 2025-12-06 09:04:51.796399434 +0000 UTC m=+0.315925570 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 6 04:04:51 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:04:52 localhost podman[80299]: 2025-12-06 09:04:52.034346226 +0000 UTC m=+0.550854509 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, container_name=metrics_qdr, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}) Dec 6 04:04:52 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:04:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:04:53 localhost systemd[1]: tmp-crun.zfSwsm.mount: Deactivated successfully. 
Dec 6 04:04:53 localhost podman[80395]: 2025-12-06 09:04:53.562775189 +0000 UTC m=+0.091810574 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_migration_target, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true) Dec 6 04:04:53 localhost podman[80395]: 2025-12-06 09:04:53.942187848 +0000 UTC m=+0.471223183 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, version=17.1.12, io.buildah.version=1.41.4) Dec 6 04:04:53 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:04:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:04:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:04:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:04:57 localhost podman[80418]: 2025-12-06 09:04:57.552167824 +0000 UTC m=+0.080998531 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 04:04:57 localhost podman[80418]: 2025-12-06 09:04:57.563283077 +0000 UTC m=+0.092113794 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, version=17.1.12, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1) Dec 6 04:04:57 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:04:57 localhost podman[80419]: 2025-12-06 09:04:57.62428681 +0000 UTC m=+0.142656924 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, container_name=ovn_metadata_agent, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:04:57 localhost podman[80419]: 2025-12-06 09:04:57.64016075 +0000 UTC m=+0.158530864 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, 
distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z) Dec 6 04:04:57 localhost podman[80419]: unhealthy Dec 6 04:04:57 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:04:57 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:04:57 localhost podman[80420]: 2025-12-06 09:04:57.712616745 +0000 UTC m=+0.231403361 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=) Dec 6 04:04:57 localhost podman[80420]: 2025-12-06 09:04:57.732421997 +0000 UTC m=+0.251208653 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12) Dec 6 04:04:57 localhost podman[80420]: unhealthy Dec 6 04:04:57 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:04:57 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with 
result 'exit-code'. Dec 6 04:05:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:05:01 localhost podman[80480]: 2025-12-06 09:05:01.548604695 +0000 UTC m=+0.082832727 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, release=1761123044, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:05:01 localhost podman[80480]: 2025-12-06 09:05:01.562571506 +0000 UTC m=+0.096799538 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Dec 6 04:05:01 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:05:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 04:05:05 localhost podman[80499]: 2025-12-06 09:05:05.54549238 +0000 UTC m=+0.078507453 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_id=tripleo_step5, container_name=nova_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:05:05 localhost podman[80499]: 2025-12-06 09:05:05.598219667 +0000 UTC m=+0.131234720 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, release=1761123044, config_id=tripleo_step5, 
io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, distribution-scope=public, version=17.1.12, container_name=nova_compute, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:05:05 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:05:12 localhost sshd[80525]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:05:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:05:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:05:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:05:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 04:05:22 localhost systemd[1]: tmp-crun.1sSPrv.mount: Deactivated successfully. 
Dec 6 04:05:22 localhost podman[80527]: 2025-12-06 09:05:22.55911037 +0000 UTC m=+0.087734729 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, 
io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z) Dec 6 04:05:22 localhost podman[80527]: 2025-12-06 09:05:22.586146594 +0000 UTC m=+0.114770943 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:05:22 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 04:05:22 localhost systemd[1]: tmp-crun.ojyyQY.mount: Deactivated successfully. 
Dec 6 04:05:22 localhost podman[80526]: 2025-12-06 09:05:22.663417899 +0000 UTC m=+0.195584157 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=logrotate_crond, maintainer=OpenStack TripleO Team) Dec 6 04:05:22 localhost podman[80526]: 2025-12-06 09:05:22.668813075 +0000 UTC m=+0.200979343 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, release=1761123044, batch=17.1_20251118.1, 
build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:05:22 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:05:22 localhost podman[80530]: 2025-12-06 09:05:22.717682273 +0000 UTC m=+0.240458200 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public) Dec 6 04:05:22 localhost podman[80528]: 2025-12-06 09:05:22.759824483 +0000 UTC m=+0.285728198 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO 
Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute) Dec 6 04:05:22 localhost podman[80528]: 2025-12-06 09:05:22.804502472 +0000 UTC m=+0.330406187 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, 
batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, tcib_managed=true, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, version=17.1.12) Dec 6 04:05:22 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:05:22 localhost podman[80530]: 2025-12-06 09:05:22.91332697 +0000 UTC m=+0.436102897 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64) Dec 6 04:05:22 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:05:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 04:05:24 localhost podman[80628]: 2025-12-06 09:05:24.545512726 +0000 UTC m=+0.075827260 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, release=1761123044, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 6 04:05:24 localhost podman[80628]: 2025-12-06 09:05:24.920353613 +0000 UTC m=+0.450668177 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=) Dec 6 04:05:24 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:05:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:05:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:05:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:05:28 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 04:05:28 localhost recover_tripleo_nova_virtqemud[80665]: 51836 Dec 6 04:05:28 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 04:05:28 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 04:05:28 localhost podman[80651]: 2025-12-06 09:05:28.548824369 +0000 UTC m=+0.082082363 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, release=1761123044, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 
'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com) Dec 6 04:05:28 localhost podman[80651]: 2025-12-06 09:05:28.559353584 +0000 UTC m=+0.092611668 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, container_name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Dec 6 04:05:28 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 04:05:28 localhost podman[80653]: 2025-12-06 09:05:28.60915655 +0000 UTC m=+0.135808481 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Dec 6 04:05:28 localhost podman[80652]: 2025-12-06 09:05:28.651625802 +0000 UTC m=+0.180282205 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, version=17.1.12, architecture=x86_64, release=1761123044, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:05:28 localhost podman[80652]: 2025-12-06 09:05:28.670174683 +0000 UTC m=+0.198831036 container exec_died 
2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:05:28 localhost podman[80652]: unhealthy Dec 6 04:05:28 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:05:28 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:05:28 localhost podman[80653]: 2025-12-06 09:05:28.703688018 +0000 UTC m=+0.230339979 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}) Dec 6 04:05:28 localhost podman[80653]: unhealthy Dec 6 04:05:28 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:05:28 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:05:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:05:32 localhost systemd[1]: tmp-crun.L0d5Qh.mount: Deactivated successfully. Dec 6 04:05:32 localhost podman[80712]: 2025-12-06 09:05:32.541759571 +0000 UTC m=+0.076797620 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, 
batch=17.1_20251118.1, container_name=iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid) Dec 6 04:05:32 localhost podman[80712]: 2025-12-06 09:05:32.554065032 +0000 UTC m=+0.089103071 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
io.openshift.expose-services=, container_name=iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com) Dec 6 04:05:32 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:05:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:05:36 localhost podman[80730]: 2025-12-06 09:05:36.536868051 +0000 UTC m=+0.071670552 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, architecture=x86_64, container_name=nova_compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Dec 6 04:05:36 localhost podman[80730]: 2025-12-06 09:05:36.573154501 +0000 UTC m=+0.107956972 container exec_died 
3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, container_name=nova_compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Dec 6 04:05:36 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:05:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:05:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:05:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:05:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:05:53 localhost podman[80755]: 2025-12-06 09:05:53.56492 +0000 UTC m=+0.095915061 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-cron, config_id=tripleo_step4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4) Dec 6 04:05:53 localhost podman[80755]: 2025-12-06 09:05:53.601400075 +0000 UTC m=+0.132395116 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, build-date=2025-11-18T22:49:32Z, architecture=x86_64) Dec 6 04:05:53 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:05:53 localhost podman[80756]: 2025-12-06 09:05:53.62489507 +0000 UTC m=+0.150359010 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, 
konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 04:05:53 localhost podman[80756]: 2025-12-06 09:05:53.677287537 +0000 UTC m=+0.202751487 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044) Dec 6 04:05:53 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:05:53 localhost podman[80762]: 2025-12-06 09:05:53.68095816 +0000 UTC m=+0.200162347 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=metrics_qdr, config_id=tripleo_step1, release=1761123044, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 6 04:05:53 localhost podman[80757]: 2025-12-06 09:05:53.731539601 +0000 UTC m=+0.252816872 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:05:53 localhost podman[80757]: 2025-12-06 09:05:53.812109977 +0000 UTC m=+0.333387278 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
container_name=ceilometer_agent_compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z) Dec 6 04:05:53 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:05:53 localhost podman[80762]: 2025-12-06 09:05:53.879346142 +0000 UTC m=+0.398550369 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 04:05:53 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:05:54 localhost systemd[1]: tmp-crun.iWuN3G.mount: Deactivated successfully. Dec 6 04:05:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:05:55 localhost systemd[1]: tmp-crun.rjDebo.mount: Deactivated successfully. 
Dec 6 04:05:55 localhost podman[80859]: 2025-12-06 09:05:55.547741206 +0000 UTC m=+0.083081345 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:36:58Z, version=17.1.12, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=) Dec 6 04:05:55 localhost podman[80859]: 2025-12-06 09:05:55.907292231 +0000 UTC m=+0.442632360 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute) Dec 6 04:05:55 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:05:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:05:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:05:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:05:59 localhost systemd[1]: tmp-crun.fVF4Ra.mount: Deactivated successfully. Dec 6 04:05:59 localhost podman[80882]: 2025-12-06 09:05:59.561963506 +0000 UTC m=+0.088436731 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd) Dec 6 04:05:59 localhost systemd[1]: tmp-crun.D46mGw.mount: Deactivated successfully. 
Dec 6 04:05:59 localhost podman[80883]: 2025-12-06 09:05:59.612400072 +0000 UTC m=+0.134490891 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}) Dec 6 04:05:59 localhost podman[80883]: 2025-12-06 09:05:59.619573103 +0000 UTC m=+0.141663912 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', 
'/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:05:59 localhost podman[80883]: unhealthy Dec 6 04:05:59 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:05:59 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:05:59 localhost podman[80884]: 2025-12-06 09:05:59.664122078 +0000 UTC m=+0.180419748 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, 
version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}) Dec 6 04:05:59 localhost podman[80884]: 2025-12-06 09:05:59.677168971 +0000 UTC m=+0.193466651 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 04:05:59 localhost podman[80884]: unhealthy Dec 6 04:05:59 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:05:59 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 
'exit-code'. Dec 6 04:05:59 localhost podman[80882]: 2025-12-06 09:05:59.694274498 +0000 UTC m=+0.220747723 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, managed_by=tripleo_ansible) Dec 6 04:05:59 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 04:06:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 04:06:03 localhost podman[80941]: 2025-12-06 09:06:03.557659333 +0000 UTC m=+0.085024425 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 
17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, container_name=iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:06:03 localhost podman[80941]: 2025-12-06 09:06:03.58997253 +0000 UTC m=+0.117337702 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, tcib_managed=true, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:06:03 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:06:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 04:06:07 localhost podman[80961]: 2025-12-06 09:06:07.556327553 +0000 UTC m=+0.081630990 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, config_id=tripleo_step5, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Dec 6 04:06:07 localhost podman[80961]: 2025-12-06 09:06:07.613381904 +0000 UTC m=+0.138685341 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, 
architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute) Dec 6 04:06:07 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:06:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:06:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:06:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:06:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:06:24 localhost podman[80987]: 2025-12-06 09:06:24.578900173 +0000 UTC m=+0.101932018 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, container_name=logrotate_crond, tcib_managed=true, 
com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron) Dec 6 04:06:24 localhost podman[80987]: 2025-12-06 09:06:24.587105536 +0000 UTC m=+0.110137331 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-cron, url=https://www.redhat.com, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, 
vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Dec 6 04:06:24 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:06:24 localhost podman[80988]: 2025-12-06 09:06:24.680818097 +0000 UTC m=+0.203048746 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible) Dec 6 04:06:24 localhost podman[80990]: 2025-12-06 09:06:24.72788249 +0000 UTC m=+0.240949737 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, build-date=2025-11-18T22:49:46Z, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr) Dec 6 04:06:24 localhost podman[80988]: 2025-12-06 09:06:24.735001269 +0000 UTC m=+0.257231958 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com) Dec 6 04:06:24 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 04:06:24 localhost podman[80989]: 2025-12-06 09:06:24.643129105 +0000 UTC m=+0.160361341 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) 
Dec 6 04:06:24 localhost podman[80989]: 2025-12-06 09:06:24.777271553 +0000 UTC m=+0.294503779 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:06:24 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:06:24 localhost podman[80990]: 2025-12-06 09:06:24.939339684 +0000 UTC m=+0.452406981 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, container_name=metrics_qdr, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 04:06:24 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 04:06:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:06:26 localhost systemd[1]: tmp-crun.ZtDXVh.mount: Deactivated successfully. Dec 6 04:06:26 localhost podman[81084]: 2025-12-06 09:06:26.562583665 +0000 UTC m=+0.089223294 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, release=1761123044, config_id=tripleo_step4, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=) Dec 6 04:06:26 localhost podman[81084]: 2025-12-06 09:06:26.944377036 +0000 UTC m=+0.471016605 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=nova_migration_target, version=17.1.12, vcs-type=git, build-date=2025-11-19T00:36:58Z, 
description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 04:06:26 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:06:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 04:06:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:06:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:06:30 localhost podman[81108]: 2025-12-06 09:06:30.5290077 +0000 UTC m=+0.061692374 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=ovn_metadata_agent, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, config_id=tripleo_step4, batch=17.1_20251118.1) Dec 6 04:06:30 localhost systemd[1]: tmp-crun.BgfL1r.mount: Deactivated successfully. 
Dec 6 04:06:30 localhost podman[81109]: 2025-12-06 09:06:30.584512492 +0000 UTC m=+0.113047819 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-ovn-controller-container) Dec 6 04:06:30 localhost podman[81109]: 2025-12-06 09:06:30.597166233 +0000 UTC m=+0.125701530 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 04:06:30 localhost podman[81109]: unhealthy Dec 6 04:06:30 localhost podman[81107]: 2025-12-06 09:06:30.554388303 +0000 UTC m=+0.085847410 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=collectd, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3) Dec 6 04:06:30 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:06:30 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 04:06:30 localhost podman[81108]: 2025-12-06 09:06:30.609296937 +0000 UTC m=+0.141981731 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Dec 6 04:06:30 localhost podman[81108]: unhealthy Dec 6 04:06:30 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:06:30 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:06:30 localhost podman[81107]: 2025-12-06 09:06:30.637357253 +0000 UTC m=+0.168816450 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, release=1761123044, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:06:30 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 04:06:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 04:06:34 localhost podman[81168]: 2025-12-06 09:06:34.543407655 +0000 UTC m=+0.078407370 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, 
maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, version=17.1.12, architecture=x86_64) Dec 6 04:06:34 localhost podman[81168]: 2025-12-06 09:06:34.55622096 +0000 UTC m=+0.091220715 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-iscsid-container, distribution-scope=public, version=17.1.12) Dec 6 04:06:34 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:06:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 04:06:38 localhost podman[81187]: 2025-12-06 09:06:38.543981914 +0000 UTC m=+0.080982160 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, container_name=nova_compute, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1) Dec 6 04:06:38 localhost podman[81187]: 2025-12-06 09:06:38.59117519 +0000 UTC m=+0.128175396 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
config_id=tripleo_step5, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, architecture=x86_64) Dec 6 04:06:38 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:06:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:06:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:06:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:06:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 04:06:55 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 04:06:55 localhost recover_tripleo_nova_virtqemud[81240]: 51836 Dec 6 04:06:55 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 04:06:55 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 6 04:06:55 localhost podman[81216]: 2025-12-06 09:06:55.570920708 +0000 UTC m=+0.099370308 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) Dec 6 04:06:55 localhost podman[81216]: 2025-12-06 09:06:55.604905047 +0000 UTC m=+0.133354637 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, architecture=x86_64) Dec 6 04:06:55 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:06:55 localhost podman[81217]: 2025-12-06 09:06:55.622728816 +0000 UTC m=+0.147356357 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12) Dec 6 04:06:55 localhost podman[81217]: 2025-12-06 09:06:55.678501617 +0000 UTC m=+0.203129158 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12) Dec 6 04:06:55 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 04:06:55 localhost podman[81215]: 2025-12-06 09:06:55.67953407 +0000 UTC m=+0.210360473 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12) Dec 6 04:06:55 localhost podman[81215]: 2025-12-06 09:06:55.765315636 +0000 UTC m=+0.296141999 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, release=1761123044, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) 
Dec 6 04:06:55 localhost podman[81221]: 2025-12-06 09:06:55.776594974 +0000 UTC m=+0.296350766 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, release=1761123044, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:06:55 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:06:55 localhost podman[81221]: 2025-12-06 09:06:55.965651369 +0000 UTC m=+0.485407161 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Dec 6 04:06:55 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:06:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:06:57 localhost systemd[1]: tmp-crun.7OaoIz.mount: Deactivated successfully. 
Dec 6 04:06:57 localhost podman[81314]: 2025-12-06 09:06:57.566631251 +0000 UTC m=+0.091403221 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, release=1761123044, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc.) Dec 6 04:06:57 localhost podman[81314]: 2025-12-06 09:06:57.943721267 +0000 UTC m=+0.468493217 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, container_name=nova_migration_target, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible) Dec 6 04:06:57 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:07:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:07:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:07:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:07:01 localhost podman[81340]: 2025-12-06 09:07:01.555709926 +0000 UTC m=+0.083228279 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:07:01 localhost podman[81340]: 2025-12-06 09:07:01.568378348 +0000 UTC m=+0.095896731 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, tcib_managed=true, container_name=collectd, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 04:07:01 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:07:01 localhost podman[81342]: 2025-12-06 09:07:01.677313979 +0000 UTC m=+0.193762791 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.) Dec 6 04:07:01 localhost podman[81341]: 2025-12-06 09:07:01.683658025 +0000 UTC m=+0.205964787 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, architecture=x86_64) Dec 6 04:07:01 localhost podman[81341]: 2025-12-06 09:07:01.702112874 +0000 UTC m=+0.224419646 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, 
io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:07:01 localhost podman[81341]: unhealthy Dec 6 04:07:01 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:07:01 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:07:01 localhost podman[81342]: 2025-12-06 09:07:01.715201908 +0000 UTC m=+0.231650730 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z) Dec 6 04:07:01 localhost podman[81342]: unhealthy Dec 6 04:07:01 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:07:01 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:07:02 localhost systemd[1]: tmp-crun.hDm8bf.mount: Deactivated successfully. Dec 6 04:07:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:07:05 localhost podman[81395]: 2025-12-06 09:07:05.557003757 +0000 UTC m=+0.085372625 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=iscsid, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-iscsid, url=https://www.redhat.com) Dec 6 04:07:05 localhost podman[81395]: 2025-12-06 09:07:05.594200986 +0000 UTC m=+0.122569834 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, 
io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=iscsid, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, name=rhosp17/openstack-iscsid, architecture=x86_64, io.buildah.version=1.41.4, 
url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1) Dec 6 04:07:05 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:07:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:07:09 localhost podman[81415]: 2025-12-06 09:07:09.551059356 +0000 UTC m=+0.082400294 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 04:07:09 localhost podman[81415]: 2025-12-06 09:07:09.586163159 +0000 UTC m=+0.117504087 container exec_died 
3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:07:09 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:07:16 localhost sshd[81439]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:07:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:07:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:07:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:07:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:07:26 localhost systemd[1]: tmp-crun.81NYvK.mount: Deactivated successfully. Dec 6 04:07:26 localhost podman[81444]: 2025-12-06 09:07:26.568189018 +0000 UTC m=+0.091035031 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:07:26 localhost podman[81442]: 2025-12-06 09:07:26.546913591 +0000 UTC m=+0.078147133 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true) Dec 6 04:07:26 localhost podman[81443]: 2025-12-06 09:07:26.596650096 +0000 UTC m=+0.123667967 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, release=1761123044, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible) Dec 6 04:07:26 localhost podman[81444]: 2025-12-06 09:07:26.615927971 +0000 UTC m=+0.138773994 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:07:26 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. 
Dec 6 04:07:26 localhost podman[81442]: 2025-12-06 09:07:26.627669673 +0000 UTC m=+0.158903195 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z) Dec 6 04:07:26 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 04:07:26 localhost podman[81443]: 2025-12-06 09:07:26.644660647 +0000 UTC m=+0.171678538 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com) Dec 6 04:07:26 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:07:26 localhost podman[81450]: 2025-12-06 09:07:26.712551002 +0000 UTC m=+0.231216585 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', 
'/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 04:07:26 localhost podman[81450]: 2025-12-06 09:07:26.884208109 +0000 UTC m=+0.402873752 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, config_id=tripleo_step1) Dec 6 04:07:26 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 04:07:27 localhost systemd[1]: tmp-crun.Ow2knC.mount: Deactivated successfully. Dec 6 04:07:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:07:28 localhost systemd[1]: tmp-crun.zBi81S.mount: Deactivated successfully. Dec 6 04:07:28 localhost podman[81541]: 2025-12-06 09:07:28.561223377 +0000 UTC m=+0.089792001 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1) Dec 6 04:07:28 localhost podman[81541]: 2025-12-06 09:07:28.924390894 +0000 UTC m=+0.452959568 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, container_name=nova_migration_target) Dec 6 04:07:28 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:07:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. 
Dec 6 04:07:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:07:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:07:32 localhost podman[81564]: 2025-12-06 09:07:32.56517595 +0000 UTC m=+0.092953109 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, version=17.1.12, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Dec 6 04:07:32 localhost podman[81564]: 2025-12-06 09:07:32.575252342 +0000 UTC m=+0.103029491 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp17/openstack-collectd, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, vcs-type=git, batch=17.1_20251118.1) Dec 6 04:07:32 localhost systemd[1]: 
01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 04:07:32 localhost podman[81566]: 2025-12-06 09:07:32.615347419 +0000 UTC m=+0.133697406 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, container_name=ovn_controller, vcs-type=git, release=1761123044, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Dec 6 04:07:32 localhost systemd[1]: tmp-crun.CecPZT.mount: Deactivated successfully. Dec 6 04:07:32 localhost podman[81565]: 2025-12-06 09:07:32.660774101 +0000 UTC m=+0.184203925 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:14:25Z, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 6 04:07:32 localhost podman[81565]: 2025-12-06 09:07:32.677312511 +0000 UTC m=+0.200742375 container exec_died 
2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12) Dec 6 04:07:32 localhost podman[81565]: unhealthy Dec 6 04:07:32 localhost podman[81566]: 2025-12-06 09:07:32.685413601 +0000 UTC m=+0.203763658 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ovn_controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T23:34:05Z, version=17.1.12, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 04:07:32 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:07:32 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:07:32 localhost podman[81566]: unhealthy Dec 6 04:07:32 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:07:32 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:07:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:07:36 localhost podman[81624]: 2025-12-06 09:07:36.54147808 +0000 UTC m=+0.077038009 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, config_id=tripleo_step3, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 04:07:36 localhost podman[81624]: 2025-12-06 09:07:36.549532119 +0000 UTC m=+0.085092008 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team) Dec 6 04:07:36 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. 
Dec 6 04:07:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:07:40 localhost podman[81643]: 2025-12-06 09:07:40.541382238 +0000 UTC m=+0.077478723 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:07:40 localhost podman[81643]: 2025-12-06 09:07:40.568273867 +0000 UTC m=+0.104370332 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, release=1761123044, 
tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step5, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 04:07:40 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:07:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:07:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:07:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:07:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 04:07:57 localhost systemd[1]: tmp-crun.Hz2hTM.mount: Deactivated successfully. 
Dec 6 04:07:57 localhost podman[81669]: 2025-12-06 09:07:57.557059692 +0000 UTC m=+0.088109410 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://www.redhat.com, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron) Dec 6 04:07:57 localhost podman[81669]: 2025-12-06 09:07:57.567070851 +0000 UTC m=+0.098120569 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 6 04:07:57 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:07:57 localhost podman[81676]: 2025-12-06 09:07:57.613819364 +0000 UTC m=+0.132656265 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:07:57 localhost podman[81677]: 2025-12-06 09:07:57.570382583 +0000 UTC m=+0.085734116 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:07:57 localhost podman[81670]: 2025-12-06 09:07:57.646056058 +0000 UTC m=+0.172494533 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc.) Dec 6 04:07:57 localhost podman[81676]: 2025-12-06 09:07:57.671196834 +0000 UTC m=+0.190033735 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:07:57 localhost podman[81670]: 2025-12-06 09:07:57.72291164 +0000 UTC m=+0.249350175 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 04:07:57 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:07:57 localhost podman[81677]: 2025-12-06 09:07:57.761123118 +0000 UTC m=+0.276474661 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, config_id=tripleo_step1, batch=17.1_20251118.1) Dec 6 04:07:57 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:07:57 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:07:58 localhost systemd[1]: tmp-crun.Jrx2Uj.mount: Deactivated successfully. Dec 6 04:07:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 04:07:59 localhost podman[81772]: 2025-12-06 09:07:59.54223637 +0000 UTC m=+0.073406506 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:07:59 localhost podman[81772]: 2025-12-06 09:07:59.906747017 +0000 UTC m=+0.437917123 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com) Dec 6 04:07:59 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:08:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:08:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:08:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:08:03 localhost podman[81796]: 2025-12-06 09:08:03.553619501 +0000 UTC m=+0.086210561 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, architecture=x86_64, maintainer=OpenStack TripleO Team) Dec 6 04:08:03 localhost podman[81796]: 2025-12-06 09:08:03.567275153 +0000 UTC m=+0.099866213 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, name=rhosp17/openstack-collectd, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Dec 6 04:08:03 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:08:03 localhost podman[81798]: 2025-12-06 09:08:03.656040612 +0000 UTC m=+0.181748719 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-ovn-controller-container) Dec 6 04:08:03 localhost podman[81798]: 2025-12-06 09:08:03.697597545 +0000 UTC m=+0.223305602 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, batch=17.1_20251118.1) Dec 6 04:08:03 localhost podman[81798]: unhealthy Dec 6 04:08:03 localhost podman[81797]: 2025-12-06 09:08:03.706289203 +0000 UTC m=+0.234613901 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}) Dec 6 04:08:03 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:08:03 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:08:03 localhost podman[81797]: 2025-12-06 09:08:03.745605076 +0000 UTC m=+0.273929744 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 6 04:08:03 localhost podman[81797]: unhealthy Dec 6 04:08:03 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, 
status=1/FAILURE Dec 6 04:08:03 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:08:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:08:07 localhost podman[81857]: 2025-12-06 09:08:07.542309143 +0000 UTC m=+0.076308105 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044) Dec 6 04:08:07 localhost podman[81857]: 2025-12-06 09:08:07.574223578 +0000 UTC m=+0.108222580 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, container_name=iscsid, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 6 04:08:07 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:08:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 04:08:11 localhost podman[81876]: 2025-12-06 09:08:11.537251337 +0000 UTC m=+0.071998843 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=nova_compute, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 6 04:08:11 localhost podman[81876]: 2025-12-06 09:08:11.564354194 +0000 UTC m=+0.099101730 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 04:08:11 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:08:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:08:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:08:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:08:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 04:08:28 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 04:08:28 localhost recover_tripleo_nova_virtqemud[81921]: 51836 Dec 6 04:08:28 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 04:08:28 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 04:08:28 localhost systemd[1]: tmp-crun.MepAds.mount: Deactivated successfully. 
Dec 6 04:08:28 localhost podman[81900]: 2025-12-06 09:08:28.558412221 +0000 UTC m=+0.086947974 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, vcs-type=git, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 6 04:08:28 localhost podman[81902]: 2025-12-06 09:08:28.616716941 +0000 UTC m=+0.138550097 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, container_name=ceilometer_agent_compute, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:08:28 localhost podman[81901]: 2025-12-06 09:08:28.663575817 +0000 UTC m=+0.188422516 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git) Dec 6 04:08:28 localhost podman[81902]: 2025-12-06 09:08:28.668166328 +0000 UTC m=+0.189999524 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_id=tripleo_step4) Dec 6 04:08:28 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:08:28 localhost podman[81904]: 2025-12-06 09:08:28.717435229 +0000 UTC m=+0.234295172 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:08:28 localhost podman[81901]: 
2025-12-06 09:08:28.725790447 +0000 UTC m=+0.250637186 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 04:08:28 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 04:08:28 localhost podman[81900]: 2025-12-06 09:08:28.744653679 +0000 UTC m=+0.273189482 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:08:28 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:08:28 localhost podman[81904]: 2025-12-06 09:08:28.895436091 +0000 UTC m=+0.412296034 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=metrics_qdr, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:08:28 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:08:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 04:08:30 localhost podman[82005]: 2025-12-06 09:08:30.552307659 +0000 UTC m=+0.083256991 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-type=git, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64) Dec 6 04:08:30 localhost podman[82005]: 2025-12-06 09:08:30.912254995 +0000 UTC m=+0.443204317 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, container_name=nova_migration_target, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 6 04:08:30 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:08:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:08:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:08:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:08:34 localhost podman[82030]: 2025-12-06 09:08:34.542195628 +0000 UTC m=+0.068679170 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=) Dec 6 04:08:34 localhost podman[82030]: 2025-12-06 09:08:34.55489584 +0000 UTC m=+0.081379392 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', 
'/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.expose-services=, version=17.1.12, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public) Dec 6 04:08:34 localhost podman[82030]: unhealthy Dec 6 04:08:34 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:08:34 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 04:08:34 localhost podman[82028]: 2025-12-06 09:08:34.60027235 +0000 UTC m=+0.131105067 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, architecture=x86_64) Dec 6 04:08:34 localhost podman[82028]: 2025-12-06 09:08:34.612807466 +0000 UTC m=+0.143640133 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3, container_name=collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container) Dec 6 04:08:34 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:08:34 localhost podman[82029]: 2025-12-06 09:08:34.661297543 +0000 UTC m=+0.187932801 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, batch=17.1_20251118.1, 
build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public) Dec 6 04:08:34 localhost podman[82029]: 2025-12-06 09:08:34.677161272 +0000 UTC m=+0.203796520 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, 
maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, version=17.1.12, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', 
'/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:08:34 localhost podman[82029]: unhealthy Dec 6 04:08:34 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:08:34 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:08:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 04:08:38 localhost podman[82088]: 2025-12-06 09:08:38.568925072 +0000 UTC m=+0.078483003 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true) Dec 6 04:08:38 localhost podman[82088]: 2025-12-06 09:08:38.605232672 +0000 UTC m=+0.114790563 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-type=git, config_id=tripleo_step3, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:08:38 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:08:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 04:08:42 localhost podman[82107]: 2025-12-06 09:08:42.570103309 +0000 UTC m=+0.084206350 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step5, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 6 04:08:42 localhost podman[82107]: 2025-12-06 09:08:42.59833164 +0000 UTC m=+0.112434651 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:36:58Z, distribution-scope=public, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step5, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Dec 6 04:08:42 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:08:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:08:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:08:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:08:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:08:59 localhost podman[82132]: 2025-12-06 09:08:59.576334772 +0000 UTC m=+0.103722932 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.buildah.version=1.41.4) Dec 6 04:08:59 localhost podman[82132]: 2025-12-06 09:08:59.585529876 +0000 UTC m=+0.112918056 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, version=17.1.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 6 04:08:59 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:08:59 localhost podman[82133]: 2025-12-06 09:08:59.670514758 +0000 UTC m=+0.190854300 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1) Dec 6 04:08:59 localhost podman[82133]: 2025-12-06 09:08:59.727445755 +0000 UTC m=+0.247785217 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=) Dec 6 04:08:59 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:08:59 localhost podman[82140]: 2025-12-06 09:08:59.728130637 +0000 UTC m=+0.244735974 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1) Dec 6 04:08:59 localhost podman[82134]: 2025-12-06 09:08:59.780795551 +0000 UTC m=+0.298654216 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Dec 6 04:08:59 localhost podman[82134]: 2025-12-06 09:08:59.864290528 +0000 UTC m=+0.382149223 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:08:59 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:08:59 localhost podman[82140]: 2025-12-06 09:08:59.926103365 +0000 UTC m=+0.442708682 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, container_name=metrics_qdr, version=17.1.12, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 6 04:08:59 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:09:00 localhost systemd[1]: tmp-crun.wQ021n.mount: Deactivated successfully. Dec 6 04:09:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:09:01 localhost systemd[1]: tmp-crun.EOLoTX.mount: Deactivated successfully. 
Dec 6 04:09:01 localhost podman[82233]: 2025-12-06 09:09:01.537324634 +0000 UTC m=+0.071525528 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 
nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:36:58Z, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:09:01 localhost podman[82233]: 2025-12-06 09:09:01.903252526 +0000 UTC m=+0.437453410 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4) Dec 6 04:09:01 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:09:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:09:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:09:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:09:05 localhost systemd[1]: tmp-crun.JsAblI.mount: Deactivated successfully. Dec 6 04:09:05 localhost podman[82256]: 2025-12-06 09:09:05.539244995 +0000 UTC m=+0.073272842 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, version=17.1.12, config_id=tripleo_step3, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, 
tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:09:05 localhost podman[82256]: 2025-12-06 09:09:05.546758877 +0000 UTC m=+0.080786704 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, container_name=collectd, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, name=rhosp17/openstack-collectd, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Dec 6 04:09:05 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:09:05 localhost podman[82263]: 2025-12-06 09:09:05.608623016 +0000 UTC m=+0.130806438 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, container_name=ovn_controller, distribution-scope=public, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container) Dec 6 04:09:05 localhost podman[82257]: 2025-12-06 09:09:05.61915813 +0000 UTC m=+0.143752236 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, config_id=tripleo_step4, release=1761123044, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:09:05 localhost podman[82257]: 2025-12-06 09:09:05.636151375 +0000 UTC m=+0.160745501 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:14:25Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1) Dec 6 04:09:05 localhost podman[82257]: unhealthy Dec 6 04:09:05 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:09:05 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:09:05 localhost podman[82263]: 2025-12-06 09:09:05.648152386 +0000 UTC m=+0.170335808 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:09:05 localhost podman[82263]: unhealthy Dec 6 04:09:05 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:09:05 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:09:06 localhost systemd[1]: tmp-crun.lBDGO6.mount: Deactivated successfully. Dec 6 04:09:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:09:09 localhost podman[82314]: 2025-12-06 09:09:09.544514068 +0000 UTC m=+0.076033948 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid) Dec 6 04:09:09 localhost podman[82314]: 2025-12-06 09:09:09.552522444 +0000 UTC m=+0.084042414 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 6 04:09:09 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:09:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:09:13 localhost systemd[1]: tmp-crun.hzS5Jk.mount: Deactivated successfully. Dec 6 04:09:13 localhost podman[82332]: 2025-12-06 09:09:13.549053259 +0000 UTC m=+0.085243981 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, tcib_managed=true, build-date=2025-11-19T00:36:58Z, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 04:09:13 localhost podman[82332]: 2025-12-06 09:09:13.576356422 +0000 UTC 
m=+0.112547144 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, config_id=tripleo_step5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, container_name=nova_compute, batch=17.1_20251118.1) Dec 6 04:09:13 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Deactivated successfully. Dec 6 04:09:16 localhost sshd[82358]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:09:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:09:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:09:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:09:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:09:30 localhost podman[82361]: 2025-12-06 09:09:30.560823694 +0000 UTC m=+0.090851745 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=logrotate_crond, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, 
summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 6 04:09:30 localhost podman[82362]: 2025-12-06 09:09:30.605005446 +0000 UTC m=+0.132331764 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:09:30 localhost podman[82364]: 2025-12-06 09:09:30.62194 +0000 UTC m=+0.142333443 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step1, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 04:09:30 localhost podman[82363]: 2025-12-06 09:09:30.672016874 +0000 UTC m=+0.194738099 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
release=1761123044, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:09:30 localhost podman[82362]: 2025-12-06 09:09:30.691101293 +0000 UTC m=+0.218427651 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, release=1761123044, vcs-type=git, architecture=x86_64, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4) Dec 6 04:09:30 localhost podman[82361]: 2025-12-06 09:09:30.693455116 +0000 UTC m=+0.223483157 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, name=rhosp17/openstack-cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, container_name=logrotate_crond, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=) Dec 6 04:09:30 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 04:09:30 localhost podman[82363]: 2025-12-06 09:09:30.724573896 +0000 UTC m=+0.247295141 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:09:30 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:09:30 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:09:30 localhost podman[82364]: 2025-12-06 09:09:30.843158675 +0000 UTC m=+0.363552108 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com) Dec 6 04:09:30 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:09:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:09:32 localhost systemd[1]: tmp-crun.8rdkjB.mount: Deactivated successfully. 
Dec 6 04:09:32 localhost podman[82459]: 2025-12-06 09:09:32.537392366 +0000 UTC m=+0.069219507 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:09:32 localhost podman[82459]: 2025-12-06 09:09:32.943284391 +0000 UTC m=+0.475111572 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com) Dec 6 04:09:32 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:09:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:09:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:09:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:09:36 localhost systemd[1]: tmp-crun.pOPsyf.mount: Deactivated successfully. Dec 6 04:09:36 localhost podman[82483]: 2025-12-06 09:09:36.569725614 +0000 UTC m=+0.093037292 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible) Dec 6 04:09:36 localhost systemd[1]: tmp-crun.RfQk5H.mount: Deactivated successfully. Dec 6 04:09:36 localhost podman[82483]: 2025-12-06 09:09:36.614989511 +0000 UTC m=+0.138301179 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1) Dec 6 04:09:36 localhost podman[82483]: unhealthy Dec 6 04:09:36 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:09:36 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 04:09:36 localhost podman[82482]: 2025-12-06 09:09:36.682914587 +0000 UTC m=+0.209259719 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:09:36 localhost podman[82482]: 2025-12-06 09:09:36.700264432 +0000 UTC m=+0.226609544 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64) Dec 6 04:09:36 localhost podman[82482]: unhealthy Dec 6 04:09:36 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:09:36 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:09:36 localhost podman[82481]: 2025-12-06 09:09:36.61722193 +0000 UTC m=+0.149912767 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, container_name=collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible) Dec 6 04:09:36 localhost podman[82481]: 2025-12-06 09:09:36.749358087 +0000 UTC m=+0.282048984 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12) Dec 6 04:09:36 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 04:09:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:09:40 localhost systemd[1]: tmp-crun.xRtJPr.mount: Deactivated successfully. 
Dec 6 04:09:40 localhost podman[82541]: 2025-12-06 09:09:40.552207504 +0000 UTC m=+0.088439810 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, release=1761123044) Dec 6 04:09:40 localhost podman[82541]: 2025-12-06 09:09:40.561118458 +0000 UTC m=+0.097350774 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:09:40 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:09:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:09:44 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 04:09:44 localhost recover_tripleo_nova_virtqemud[82567]: 51836 Dec 6 04:09:44 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 04:09:44 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 6 04:09:44 localhost podman[82560]: 2025-12-06 09:09:44.540300687 +0000 UTC m=+0.071662013 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, architecture=x86_64, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Dec 6 04:09:44 localhost podman[82560]: 2025-12-06 09:09:44.589170794 +0000 UTC m=+0.120532180 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64) Dec 6 04:09:44 localhost podman[82560]: unhealthy Dec 6 04:09:44 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:09:44 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Failed with result 'exit-code'. Dec 6 04:10:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:10:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:10:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:10:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 04:10:01 localhost systemd[1]: tmp-crun.r8Bxrc.mount: Deactivated successfully. 
Dec 6 04:10:01 localhost podman[82586]: 2025-12-06 09:10:01.60610293 +0000 UTC m=+0.134916115 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc.) Dec 6 04:10:01 localhost podman[82586]: 2025-12-06 09:10:01.634604959 +0000 UTC m=+0.163418134 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute) Dec 6 04:10:01 localhost podman[82584]: 2025-12-06 09:10:01.649579721 +0000 UTC m=+0.182439211 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, 
description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, release=1761123044) Dec 6 04:10:01 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:10:01 localhost podman[82584]: 2025-12-06 09:10:01.657497125 +0000 UTC m=+0.190356555 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4) Dec 6 04:10:01 localhost podman[82585]: 2025-12-06 09:10:01.56983817 +0000 UTC m=+0.100301856 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, container_name=ceilometer_agent_ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 04:10:01 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:10:01 localhost podman[82590]: 2025-12-06 09:10:01.720740107 +0000 UTC m=+0.246192898 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
release=1761123044, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, tcib_managed=true, managed_by=tripleo_ansible, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 04:10:01 localhost podman[82585]: 2025-12-06 09:10:01.750922698 +0000 UTC m=+0.281386334 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 04:10:01 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:10:01 localhost podman[82590]: 2025-12-06 09:10:01.926434524 +0000 UTC m=+0.451887335 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', 
'/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=) Dec 6 04:10:01 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:10:02 localhost systemd[1]: tmp-crun.q85z4T.mount: Deactivated successfully. Dec 6 04:10:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 04:10:03 localhost podman[82688]: 2025-12-06 09:10:03.547998242 +0000 UTC m=+0.083090575 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:36:58Z, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 6 04:10:03 localhost podman[82688]: 2025-12-06 09:10:03.922307643 +0000 UTC m=+0.457400056 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 6 04:10:03 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:10:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:10:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:10:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:10:07 localhost systemd[1]: tmp-crun.MNGekd.mount: Deactivated successfully. Dec 6 04:10:07 localhost podman[82711]: 2025-12-06 09:10:07.565146832 +0000 UTC m=+0.099387998 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:10:07 localhost podman[82713]: 2025-12-06 09:10:07.605756175 +0000 UTC m=+0.135047188 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, architecture=x86_64, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:10:07 localhost podman[82712]: 2025-12-06 09:10:07.650918319 +0000 UTC m=+0.180546302 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, 
build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:10:07 localhost podman[82712]: 2025-12-06 09:10:07.661814745 +0000 UTC m=+0.191442768 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}) Dec 6 04:10:07 localhost podman[82712]: unhealthy Dec 6 04:10:07 localhost podman[82713]: 2025-12-06 
09:10:07.674112955 +0000 UTC m=+0.203403968 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, container_name=ovn_controller) Dec 6 04:10:07 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:10:07 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:10:07 localhost podman[82713]: unhealthy Dec 6 04:10:07 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:10:07 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:10:07 localhost podman[82711]: 2025-12-06 09:10:07.728510213 +0000 UTC m=+0.262751409 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 04:10:07 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:10:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:10:11 localhost systemd[1]: tmp-crun.l6NkXt.mount: Deactivated successfully. Dec 6 04:10:11 localhost podman[82769]: 2025-12-06 09:10:11.5529907 +0000 UTC m=+0.083537578 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, version=17.1.12, build-date=2025-11-18T23:44:13Z, architecture=x86_64, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 6 04:10:11 localhost podman[82769]: 2025-12-06 09:10:11.566158516 +0000 UTC m=+0.096705344 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.) Dec 6 04:10:11 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:10:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 04:10:15 localhost podman[82788]: 2025-12-06 09:10:15.549677979 +0000 UTC m=+0.084692184 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true) Dec 6 04:10:15 localhost podman[82788]: 2025-12-06 09:10:15.571159562 +0000 UTC m=+0.106173807 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:36:58Z) Dec 6 04:10:15 localhost podman[82788]: unhealthy Dec 6 04:10:15 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:10:15 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Failed with result 'exit-code'. 
Dec 6 04:10:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34530 DF PROTO=TCP SPT=54512 DPT=9105 SEQ=2267039951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B59850000000001030307) Dec 6 04:10:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34531 DF PROTO=TCP SPT=54512 DPT=9105 SEQ=2267039951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B5D870000000001030307) Dec 6 04:10:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34532 DF PROTO=TCP SPT=54512 DPT=9105 SEQ=2267039951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B65880000000001030307) Dec 6 04:10:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34533 DF PROTO=TCP SPT=54512 DPT=9105 SEQ=2267039951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B75470000000001030307) Dec 6 04:10:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37968 DF PROTO=TCP SPT=39292 DPT=9882 SEQ=3455410207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B754E0000000001030307) Dec 6 04:10:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64677 DF PROTO=TCP SPT=59162 DPT=9102 SEQ=1834952502 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1D9B78090000000001030307) Dec 6 04:10:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37969 DF PROTO=TCP SPT=39292 DPT=9882 SEQ=3455410207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B79470000000001030307) Dec 6 04:10:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64678 DF PROTO=TCP SPT=59162 DPT=9102 SEQ=1834952502 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B7C070000000001030307) Dec 6 04:10:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37970 DF PROTO=TCP SPT=39292 DPT=9882 SEQ=3455410207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B81480000000001030307) Dec 6 04:10:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64679 DF PROTO=TCP SPT=59162 DPT=9102 SEQ=1834952502 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B84080000000001030307) Dec 6 04:10:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33515 DF PROTO=TCP SPT=36044 DPT=9101 SEQ=2010400817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B90570000000001030307) Dec 6 04:10:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37971 DF PROTO=TCP SPT=39292 DPT=9882 SEQ=3455410207 ACK=0 WINDOW=32640 RES=0x00 SYN 
URGP=0 OPT (020405500402080A1D9B91070000000001030307) Dec 6 04:10:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64680 DF PROTO=TCP SPT=59162 DPT=9102 SEQ=1834952502 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B93C70000000001030307) Dec 6 04:10:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33516 DF PROTO=TCP SPT=36044 DPT=9101 SEQ=2010400817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B94470000000001030307) Dec 6 04:10:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34534 DF PROTO=TCP SPT=54512 DPT=9105 SEQ=2267039951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B95870000000001030307) Dec 6 04:10:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:10:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:10:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:10:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:10:32 localhost podman[82810]: 2025-12-06 09:10:32.600076886 +0000 UTC m=+0.088701609 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, maintainer=OpenStack 
TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:10:32 localhost podman[82812]: 2025-12-06 09:10:32.662640726 +0000 UTC m=+0.145620054 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, distribution-scope=public, container_name=metrics_qdr, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc.) Dec 6 04:10:32 localhost systemd[1]: tmp-crun.CGHXPt.mount: Deactivated successfully. 
Dec 6 04:10:32 localhost podman[82809]: 2025-12-06 09:10:32.716265481 +0000 UTC m=+0.208447813 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 6 04:10:32 localhost podman[82809]: 2025-12-06 09:10:32.750144296 +0000 UTC m=+0.242326588 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Dec 6 04:10:32 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:10:32 localhost podman[82810]: 2025-12-06 09:10:32.77747021 +0000 UTC m=+0.266094913 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.) Dec 6 04:10:32 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 04:10:32 localhost podman[82811]: 2025-12-06 09:10:32.754336196 +0000 UTC m=+0.238581094 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container) Dec 6 04:10:32 localhost podman[82811]: 2025-12-06 09:10:32.839402661 +0000 UTC m=+0.323647539 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4) Dec 6 04:10:32 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:10:32 localhost podman[82812]: 2025-12-06 09:10:32.865342181 +0000 UTC m=+0.348321509 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', 
'/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc.) Dec 6 04:10:32 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 04:10:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42375 DF PROTO=TCP SPT=43926 DPT=9100 SEQ=931924512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B9B9A0000000001030307) Dec 6 04:10:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33517 DF PROTO=TCP SPT=36044 DPT=9101 SEQ=2010400817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B9C480000000001030307) Dec 6 04:10:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42376 DF PROTO=TCP SPT=43926 DPT=9100 SEQ=931924512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9B9F870000000001030307) Dec 6 04:10:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 04:10:34 localhost podman[82908]: 2025-12-06 09:10:34.546827207 +0000 UTC m=+0.083337752 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible) Dec 6 04:10:34 localhost sshd[82929]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:10:34 localhost systemd-logind[760]: New session 23 of user zuul. Dec 6 04:10:34 localhost systemd[1]: Started Session 23 of User zuul. Dec 6 04:10:35 localhost podman[82908]: 2025-12-06 09:10:35.005500581 +0000 UTC m=+0.542011176 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, version=17.1.12, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4) Dec 6 04:10:35 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. 
Dec 6 04:10:35 localhost python3.9[83025]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:10:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42377 DF PROTO=TCP SPT=43926 DPT=9100 SEQ=931924512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9BA7880000000001030307) Dec 6 04:10:36 localhost python3.9[83119]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:10:37 localhost python3.9[83212]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:10:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33518 DF PROTO=TCP SPT=36044 DPT=9101 SEQ=2010400817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9BAC070000000001030307) Dec 6 04:10:37 localhost python3.9[83306]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True 
stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:10:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:10:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:10:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:10:38 localhost podman[83370]: 2025-12-06 09:10:38.560158354 +0000 UTC m=+0.083501158 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., container_name=ovn_controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 04:10:38 localhost podman[83370]: 2025-12-06 09:10:38.601221341 +0000 UTC m=+0.124564145 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, 
release=1761123044, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible) Dec 6 04:10:38 localhost podman[83370]: unhealthy Dec 6 04:10:38 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:10:38 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:10:38 localhost systemd[1]: tmp-crun.G9Fkgc.mount: Deactivated successfully. 
Dec 6 04:10:38 localhost podman[83367]: 2025-12-06 09:10:38.625706416 +0000 UTC m=+0.148558995 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step3, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 04:10:38 localhost podman[83369]: 2025-12-06 09:10:38.670290491 +0000 UTC m=+0.194412829 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, tcib_managed=true) Dec 6 04:10:38 localhost podman[83369]: 2025-12-06 09:10:38.683547541 +0000 UTC m=+0.207669899 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:10:38 localhost podman[83369]: unhealthy Dec 6 04:10:38 localhost podman[83367]: 2025-12-06 09:10:38.693524919 +0000 UTC m=+0.216377518 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 04:10:38 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:10:38 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:10:38 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 04:10:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37972 DF PROTO=TCP SPT=39292 DPT=9882 SEQ=3455410207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9BB1870000000001030307) Dec 6 04:10:38 localhost python3.9[83436]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:10:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64681 DF PROTO=TCP SPT=59162 DPT=9102 SEQ=1834952502 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9BB3870000000001030307) Dec 6 04:10:39 localhost python3.9[83545]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Dec 6 04:10:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=42378 DF PROTO=TCP SPT=43926 DPT=9100 SEQ=931924512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9BB7480000000001030307) Dec 6 04:10:41 localhost python3.9[83635]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:10:41 localhost python3.9[83727]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Dec 6 04:10:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:10:42 localhost systemd[1]: tmp-crun.ctgWOp.mount: Deactivated successfully. Dec 6 04:10:42 localhost podman[83805]: 2025-12-06 09:10:42.573937074 +0000 UTC m=+0.093474476 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com) Dec 6 04:10:42 localhost podman[83805]: 2025-12-06 09:10:42.586549283 +0000 UTC m=+0.106086725 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
tcib_managed=true) Dec 6 04:10:42 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:10:42 localhost python3.9[83825]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 6 04:10:43 localhost python3.9[83884]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 6 04:10:43 localhost systemd[1]: session-23.scope: Deactivated successfully. Dec 6 04:10:43 localhost systemd[1]: session-23.scope: Consumed 4.807s CPU time. Dec 6 04:10:43 localhost systemd-logind[760]: Session 23 logged out. Waiting for processes to exit. Dec 6 04:10:43 localhost systemd-logind[760]: Removed session 23. Dec 6 04:10:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33519 DF PROTO=TCP SPT=36044 DPT=9101 SEQ=2010400817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9BCB880000000001030307) Dec 6 04:10:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:10:46 localhost systemd[1]: tmp-crun.7rVPFY.mount: Deactivated successfully. 
Dec 6 04:10:46 localhost podman[83900]: 2025-12-06 09:10:46.551573999 +0000 UTC m=+0.086769368 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 04:10:46 localhost podman[83900]: 2025-12-06 09:10:46.570123561 +0000 UTC m=+0.105318850 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat 
OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, version=17.1.12) Dec 6 04:10:46 localhost podman[83900]: unhealthy Dec 6 04:10:46 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:10:46 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Failed with result 'exit-code'. Dec 6 04:10:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41760 DF PROTO=TCP SPT=56810 DPT=9105 SEQ=484177827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9BD2C70000000001030307) Dec 6 04:10:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41761 DF PROTO=TCP SPT=56810 DPT=9105 SEQ=484177827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9BDAC70000000001030307) Dec 6 04:10:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21256 DF PROTO=TCP SPT=54202 DPT=9882 SEQ=2009865825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9BEA7E0000000001030307) Dec 6 04:10:53 localhost sshd[83922]: main: 
sshd: ssh-rsa algorithm is disabled Dec 6 04:10:53 localhost systemd-logind[760]: New session 24 of user zuul. Dec 6 04:10:53 localhost systemd[1]: Started Session 24 of User zuul. Dec 6 04:10:54 localhost python3.9[84017]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:10:54 localhost systemd[1]: Reloading. Dec 6 04:10:54 localhost systemd-rc-local-generator[84038]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:10:54 localhost systemd-sysv-generator[84042]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:10:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:10:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64682 DF PROTO=TCP SPT=59162 DPT=9102 SEQ=1834952502 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9BF3870000000001030307) Dec 6 04:10:55 localhost python3.9[84142]: ansible-ansible.builtin.service_facts Invoked Dec 6 04:10:55 localhost network[84159]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 6 04:10:55 localhost network[84160]: 'network-scripts' will be removed from distribution in near future. Dec 6 04:10:55 localhost network[84161]: It is advised to switch to 'NetworkManager' instead for network management. Dec 6 04:10:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Dec 6 04:10:59 localhost python3.9[84358]: ansible-ansible.builtin.service_facts Invoked Dec 6 04:10:59 localhost network[84375]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 6 04:10:59 localhost network[84376]: 'network-scripts' will be removed from distribution in near future. Dec 6 04:10:59 localhost network[84377]: It is advised to switch to 'NetworkManager' instead for network management. Dec 6 04:11:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23224 DF PROTO=TCP SPT=35758 DPT=9101 SEQ=3085161116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C05880000000001030307) Dec 6 04:11:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:11:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23225 DF PROTO=TCP SPT=35758 DPT=9101 SEQ=3085161116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C09870000000001030307) Dec 6 04:11:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:11:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:11:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:11:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:11:03 localhost podman[84500]: 2025-12-06 09:11:03.555787522 +0000 UTC m=+0.084774767 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=) Dec 6 04:11:03 localhost podman[84500]: 2025-12-06 09:11:03.589922175 +0000 UTC m=+0.118909450 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, 
name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, release=1761123044, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public) Dec 6 04:11:03 localhost systemd[1]: tmp-crun.Bjb9Xb.mount: Deactivated successfully. Dec 6 04:11:03 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. 
Dec 6 04:11:03 localhost podman[84499]: 2025-12-06 09:11:03.616375051 +0000 UTC m=+0.146881983 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron) Dec 6 04:11:03 localhost podman[84499]: 2025-12-06 09:11:03.653792526 +0000 UTC m=+0.184299478 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., version=17.1.12, container_name=logrotate_crond, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron) Dec 6 04:11:03 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:11:03 localhost podman[84502]: 2025-12-06 09:11:03.662641999 +0000 UTC m=+0.185937689 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 
17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com) Dec 6 04:11:03 localhost podman[84501]: 2025-12-06 09:11:03.719710839 +0000 UTC m=+0.245068242 container health_status b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step4, tcib_managed=true, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 6 04:11:03 localhost podman[84501]: 2025-12-06 09:11:03.77937122 +0000 UTC m=+0.304728633 container exec_died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, 
name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:11:03 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Deactivated successfully. Dec 6 04:11:03 localhost podman[84502]: 2025-12-06 09:11:03.864219848 +0000 UTC m=+0.387515498 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, release=1761123044) Dec 6 04:11:03 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 04:11:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42380 DF PROTO=TCP SPT=43926 DPT=9100 SEQ=931924512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C17870000000001030307) Dec 6 04:11:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:11:05 localhost podman[84677]: 2025-12-06 09:11:05.350886087 +0000 UTC m=+0.073455177 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git) Dec 6 04:11:05 localhost python3.9[84678]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:11:05 localhost podman[84677]: 2025-12-06 09:11:05.729290824 +0000 UTC m=+0.451859944 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, 
container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Dec 6 04:11:05 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:11:06 localhost systemd[1]: Reloading. Dec 6 04:11:06 localhost systemd-sysv-generator[84730]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:11:06 localhost systemd-rc-local-generator[84723]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:11:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:11:06 localhost systemd[1]: Stopping ceilometer_agent_compute container... Dec 6 04:11:07 localhost systemd[1]: tmp-crun.guPICo.mount: Deactivated successfully. Dec 6 04:11:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23227 DF PROTO=TCP SPT=35758 DPT=9101 SEQ=3085161116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C21470000000001030307) Dec 6 04:11:09 localhost sshd[84754]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:11:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:11:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:11:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:11:09 localhost systemd[1]: tmp-crun.PbYtpf.mount: Deactivated successfully. 
Dec 6 04:11:09 localhost podman[84757]: 2025-12-06 09:11:09.322988311 +0000 UTC m=+0.094902219 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', 
'/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 6 04:11:09 localhost podman[84757]: 2025-12-06 09:11:09.366401251 +0000 UTC m=+0.138315179 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc.) Dec 6 04:11:09 localhost podman[84757]: unhealthy Dec 6 04:11:09 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:11:09 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:11:09 localhost podman[84758]: 2025-12-06 09:11:09.371829969 +0000 UTC m=+0.141344663 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 04:11:09 localhost podman[84758]: 2025-12-06 09:11:09.452155367 +0000 UTC m=+0.221670071 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=) Dec 6 04:11:09 localhost podman[84758]: unhealthy Dec 6 04:11:09 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, 
status=1/FAILURE Dec 6 04:11:09 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:11:09 localhost podman[84756]: 2025-12-06 09:11:09.512333264 +0000 UTC m=+0.287795891 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., 
io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, container_name=collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, version=17.1.12, io.openshift.expose-services=) Dec 6 04:11:09 localhost podman[84756]: 2025-12-06 09:11:09.525223592 +0000 UTC m=+0.300686219 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 04:11:09 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. 
Dec 6 04:11:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64010 DF PROTO=TCP SPT=40764 DPT=9100 SEQ=1551701062 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C2C870000000001030307) Dec 6 04:11:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:11:13 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 04:11:13 localhost podman[84816]: 2025-12-06 09:11:13.061760577 +0000 UTC m=+0.092325129 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., container_name=iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid) Dec 6 04:11:13 localhost podman[84816]: 2025-12-06 09:11:13.091253187 +0000 UTC m=+0.121817739 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, distribution-scope=public, release=1761123044, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:11:13 localhost recover_tripleo_nova_virtqemud[84835]: 51836 Dec 6 04:11:13 localhost systemd[1]: 
tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 04:11:13 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 04:11:13 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:11:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23228 DF PROTO=TCP SPT=35758 DPT=9101 SEQ=3085161116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C41870000000001030307) Dec 6 04:11:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32821 DF PROTO=TCP SPT=47404 DPT=9105 SEQ=2694928483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C43E80000000001030307) Dec 6 04:11:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 04:11:16 localhost podman[84836]: 2025-12-06 09:11:16.803219026 +0000 UTC m=+0.087552393 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vcs-type=git, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 6 04:11:16 localhost podman[84836]: 2025-12-06 09:11:16.854275131 +0000 UTC m=+0.138608458 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute) Dec 6 04:11:16 localhost podman[84836]: unhealthy Dec 6 04:11:16 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:11:16 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Failed with result 'exit-code'. Dec 6 04:11:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32823 DF PROTO=TCP SPT=47404 DPT=9105 SEQ=2694928483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C50070000000001030307) Dec 6 04:11:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61283 DF PROTO=TCP SPT=48318 DPT=9882 SEQ=111931538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C5FAF0000000001030307) Dec 6 04:11:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21261 DF PROTO=TCP SPT=54202 DPT=9882 SEQ=2009865825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C67870000000001030307) Dec 6 04:11:30 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47426 DF PROTO=TCP SPT=43724 DPT=9101 SEQ=3897077866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C7AB70000000001030307) Dec 6 04:11:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47427 DF PROTO=TCP SPT=43724 DPT=9101 SEQ=3897077866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C7EC80000000001030307) Dec 6 04:11:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:11:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:11:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:11:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 04:11:34 localhost systemd[1]: tmp-crun.Xdu4qr.mount: Deactivated successfully. 
Dec 6 04:11:34 localhost podman[84858]: 2025-12-06 09:11:34.06931127 +0000 UTC m=+0.100090280 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Dec 6 04:11:34 localhost podman[84858]: 2025-12-06 09:11:34.105483676 +0000 UTC m=+0.136262696 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Dec 6 04:11:34 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:11:34 localhost podman[84859]: 2025-12-06 09:11:34.122095628 +0000 UTC m=+0.149409921 container health_status 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, distribution-scope=public) Dec 6 04:11:34 localhost podman[84860]: Error: container b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d is not running Dec 6 04:11:34 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Main process exited, code=exited, status=125/n/a Dec 6 04:11:34 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Failed with result 'exit-code'. 
Dec 6 04:11:34 localhost podman[84859]: 2025-12-06 09:11:34.147138801 +0000 UTC m=+0.174453164 container exec_died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 6 04:11:34 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Deactivated successfully. Dec 6 04:11:34 localhost podman[84863]: 2025-12-06 09:11:34.229056238 +0000 UTC m=+0.250758728 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, release=1761123044, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}) Dec 6 04:11:34 localhost podman[84863]: 2025-12-06 09:11:34.415738218 +0000 UTC m=+0.437440678 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
container_name=metrics_qdr, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 04:11:34 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:11:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64012 DF PROTO=TCP SPT=40764 DPT=9100 SEQ=1551701062 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C8D870000000001030307) Dec 6 04:11:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. 
Dec 6 04:11:36 localhost podman[84941]: 2025-12-06 09:11:36.348030946 +0000 UTC m=+0.085890861 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 04:11:36 localhost podman[84941]: 2025-12-06 09:11:36.725904185 +0000 UTC m=+0.463764130 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:36:58Z, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target) Dec 6 04:11:36 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:11:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47429 DF PROTO=TCP SPT=43724 DPT=9101 SEQ=3897077866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9C96870000000001030307) Dec 6 04:11:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. 
Dec 6 04:11:39 localhost systemd[1]: tmp-crun.S3KjXi.mount: Deactivated successfully. Dec 6 04:11:39 localhost podman[84964]: 2025-12-06 09:11:39.548214495 +0000 UTC m=+0.083849989 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', 
'/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Dec 6 04:11:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:11:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:11:39 localhost podman[84964]: 2025-12-06 09:11:39.570141811 +0000 UTC m=+0.105777315 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:11:39 localhost podman[84964]: unhealthy Dec 6 04:11:39 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:11:39 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:11:39 localhost systemd[1]: tmp-crun.XlJjan.mount: Deactivated successfully. 
Dec 6 04:11:39 localhost podman[84983]: 2025-12-06 09:11:39.670046463 +0000 UTC m=+0.098554932 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, architecture=x86_64, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, container_name=collectd) Dec 6 04:11:39 localhost podman[84986]: 2025-12-06 09:11:39.711358958 +0000 UTC m=+0.135852563 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 04:11:39 localhost podman[84986]: 2025-12-06 09:11:39.723060719 +0000 UTC m=+0.147554364 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 04:11:39 localhost podman[84986]: unhealthy Dec 6 04:11:39 localhost podman[84983]: 2025-12-06 09:11:39.73281652 +0000 UTC m=+0.161324979 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack 
TripleO Team, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd) Dec 6 04:11:39 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:11:39 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:11:39 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 04:11:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52701 DF PROTO=TCP SPT=54582 DPT=9100 SEQ=716485153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9CA1C70000000001030307) Dec 6 04:11:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. 
Dec 6 04:11:43 localhost podman[85027]: 2025-12-06 09:11:43.298932199 +0000 UTC m=+0.084645244 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container) Dec 6 04:11:43 localhost podman[85027]: 2025-12-06 09:11:43.339314444 +0000 UTC m=+0.125027509 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.buildah.version=1.41.4, container_name=iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 6 04:11:43 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. Dec 6 04:11:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47430 DF PROTO=TCP SPT=43724 DPT=9101 SEQ=3897077866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9CB7870000000001030307) Dec 6 04:11:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. 
Dec 6 04:11:47 localhost podman[85046]: 2025-12-06 09:11:47.047092294 +0000 UTC m=+0.082451616 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, url=https://www.redhat.com, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, container_name=nova_compute) Dec 6 04:11:47 localhost podman[85046]: 2025-12-06 09:11:47.068147983 +0000 UTC m=+0.103507235 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team) Dec 6 04:11:47 localhost podman[85046]: unhealthy Dec 6 04:11:47 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:11:47 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Failed with result 'exit-code'. Dec 6 04:11:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13568 DF PROTO=TCP SPT=41692 DPT=9105 SEQ=3332251194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9CBD070000000001030307) Dec 6 04:11:49 localhost podman[84740]: time="2025-12-06T09:11:49Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL" Dec 6 04:11:49 localhost systemd[1]: libpod-b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.scope: Deactivated successfully. Dec 6 04:11:49 localhost systemd[1]: libpod-b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.scope: Consumed 6.388s CPU time. 
Dec 6 04:11:49 localhost podman[84740]: 2025-12-06 09:11:49.079174641 +0000 UTC m=+42.087090517 container died b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, release=1761123044, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc.) Dec 6 04:11:49 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.timer: Deactivated successfully. Dec 6 04:11:49 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d. Dec 6 04:11:49 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Failed to open /run/systemd/transient/b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: No such file or directory Dec 6 04:11:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d-userdata-shm.mount: Deactivated successfully. Dec 6 04:11:49 localhost systemd[1]: var-lib-containers-storage-overlay-2961fd61607b985660b4106c2f39c6dd4b09a3526a1728e9022e8b160d048172-merged.mount: Deactivated successfully. 
Dec 6 04:11:49 localhost podman[84740]: 2025-12-06 09:11:49.140148363 +0000 UTC m=+42.148064219 container cleanup b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 6 04:11:49 localhost podman[84740]: ceilometer_agent_compute Dec 6 04:11:49 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.timer: Failed to open /run/systemd/transient/b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.timer: No such file or directory Dec 6 04:11:49 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Failed to open /run/systemd/transient/b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: No such file or directory Dec 6 04:11:49 localhost podman[85068]: 2025-12-06 09:11:49.179065033 +0000 UTC m=+0.090463613 container cleanup b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, architecture=x86_64, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:11:49 localhost systemd[1]: libpod-conmon-b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.scope: Deactivated successfully. Dec 6 04:11:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13569 DF PROTO=TCP SPT=41692 DPT=9105 SEQ=3332251194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9CC5080000000001030307) Dec 6 04:11:49 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.timer: Failed to open /run/systemd/transient/b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.timer: No such file or directory Dec 6 04:11:49 localhost systemd[1]: b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: Failed to open /run/systemd/transient/b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d.service: No such file or directory Dec 6 04:11:49 localhost podman[85083]: 2025-12-06 09:11:49.284574088 +0000 UTC m=+0.074219940 container cleanup b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=) Dec 6 04:11:49 localhost podman[85083]: ceilometer_agent_compute Dec 6 04:11:49 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully. 
Dec 6 04:11:49 localhost systemd[1]: Stopped ceilometer_agent_compute container. Dec 6 04:11:50 localhost python3.9[85185]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:11:51 localhost systemd[1]: Reloading. Dec 6 04:11:51 localhost systemd-rc-local-generator[85210]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:11:51 localhost systemd-sysv-generator[85215]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:11:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:11:51 localhost systemd[1]: Stopping ceilometer_agent_ipmi container... 
Dec 6 04:11:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f3:a2:ab MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=46734 SEQ=1946256577 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Dec 6 04:11:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f3:a2:ab MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=46734 SEQ=1946256577 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Dec 6 04:12:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9517 DF PROTO=TCP SPT=39018 DPT=9101 SEQ=138563747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9CEFE80000000001030307) Dec 6 04:12:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9518 DF PROTO=TCP SPT=39018 DPT=9101 SEQ=138563747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9CF4070000000001030307) Dec 6 04:12:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23230 DF PROTO=TCP SPT=35758 DPT=9101 SEQ=3085161116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9CFF870000000001030307) Dec 6 04:12:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:12:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:12:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. 
Dec 6 04:12:04 localhost podman[85242]: 2025-12-06 09:12:04.566014739 +0000 UTC m=+0.090633687 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, version=17.1.12, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Dec 6 04:12:04 localhost podman[85241]: Error: container 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d is not running Dec 6 04:12:04 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Main process exited, code=exited, status=125/n/a Dec 6 04:12:04 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Failed with result 'exit-code'. 
Dec 6 04:12:04 localhost podman[85240]: 2025-12-06 09:12:04.663613861 +0000 UTC m=+0.194574004 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
build-date=2025-11-18T22:49:32Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com) Dec 6 04:12:04 localhost podman[85240]: 2025-12-06 09:12:04.679224163 +0000 UTC m=+0.210184246 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, name=rhosp17/openstack-cron) Dec 6 04:12:04 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. 
Dec 6 04:12:04 localhost podman[85242]: 2025-12-06 09:12:04.782246601 +0000 UTC m=+0.306865559 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=metrics_qdr, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 6 04:12:04 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:12:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:12:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64013 DF PROTO=TCP SPT=40764 DPT=9100 SEQ=1551701062 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9D0B870000000001030307) Dec 6 04:12:07 localhost systemd[1]: tmp-crun.TVWL8t.mount: Deactivated successfully. 
Dec 6 04:12:07 localhost podman[85303]: 2025-12-06 09:12:07.299423505 +0000 UTC m=+0.083311732 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4) Dec 6 04:12:07 localhost podman[85303]: 2025-12-06 09:12:07.670493363 +0000 UTC m=+0.454381570 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, distribution-scope=public) Dec 6 04:12:07 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. Dec 6 04:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. 
Dec 6 04:12:10 localhost podman[85327]: 2025-12-06 09:12:10.077599102 +0000 UTC m=+0.094330231 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 6 04:12:10 localhost systemd[1]: tmp-crun.qKN3RK.mount: Deactivated successfully. Dec 6 04:12:10 localhost podman[85326]: 2025-12-06 09:12:10.120465795 +0000 UTC m=+0.144282424 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, container_name=ovn_metadata_agent, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4) Dec 6 04:12:10 localhost podman[85327]: 2025-12-06 09:12:10.173356737 +0000 UTC m=+0.190087826 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vcs-type=git, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', 
'/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, maintainer=OpenStack TripleO Team) Dec 6 04:12:10 localhost podman[85327]: unhealthy Dec 6 04:12:10 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:12:10 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:12:10 localhost podman[85325]: 2025-12-06 09:12:10.187092621 +0000 UTC m=+0.215268683 container health_status 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z) Dec 6 04:12:10 localhost podman[85325]: 2025-12-06 09:12:10.200319038 +0000 UTC m=+0.228495150 container exec_died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:12:10 localhost podman[85326]: 2025-12-06 09:12:10.214685872 +0000 UTC m=+0.238502491 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4) Dec 6 04:12:10 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Deactivated successfully. Dec 6 04:12:10 localhost podman[85326]: unhealthy Dec 6 04:12:10 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:12:10 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:12:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25483 DF PROTO=TCP SPT=34554 DPT=9100 SEQ=3926837129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9D17070000000001030307) Dec 6 04:12:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:12:13 localhost podman[85382]: 2025-12-06 09:12:13.543454117 +0000 UTC m=+0.078382220 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1) Dec 6 04:12:13 localhost podman[85382]: 2025-12-06 09:12:13.557120358 +0000 UTC m=+0.092048491 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, architecture=x86_64, config_id=tripleo_step3, container_name=iscsid, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vcs-type=git) Dec 6 04:12:13 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. 
Dec 6 04:12:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9521 DF PROTO=TCP SPT=39018 DPT=9101 SEQ=138563747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9D2B870000000001030307) Dec 6 04:12:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52151 DF PROTO=TCP SPT=42290 DPT=9105 SEQ=3723506483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9D32470000000001030307) Dec 6 04:12:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:12:17 localhost podman[85401]: 2025-12-06 09:12:17.301782575 +0000 UTC m=+0.082589229 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, 
build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044) Dec 6 04:12:17 localhost podman[85401]: 2025-12-06 09:12:17.322101072 +0000 UTC m=+0.102907706 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, release=1761123044, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:12:17 localhost podman[85401]: unhealthy Dec 6 04:12:17 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:12:17 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Failed with result 'exit-code'. 
Dec 6 04:12:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52152 DF PROTO=TCP SPT=42290 DPT=9105 SEQ=3723506483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9D3A470000000001030307) Dec 6 04:12:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52153 DF PROTO=TCP SPT=42290 DPT=9105 SEQ=3723506483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9D4A070000000001030307) Dec 6 04:12:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21306 DF PROTO=TCP SPT=47552 DPT=9882 SEQ=728032000 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9D51870000000001030307) Dec 6 04:12:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55460 DF PROTO=TCP SPT=47552 DPT=9102 SEQ=4027190037 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9D5D870000000001030307) Dec 6 04:12:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50506 DF PROTO=TCP SPT=60512 DPT=9101 SEQ=4031597247 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9D69070000000001030307) Dec 6 04:12:33 localhost podman[85225]: time="2025-12-06T09:12:33Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL" Dec 6 04:12:33 localhost systemd[1]: libpod-4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.scope: Deactivated 
successfully. Dec 6 04:12:33 localhost systemd[1]: libpod-4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.scope: Consumed 6.274s CPU time. Dec 6 04:12:33 localhost podman[85225]: 2025-12-06 09:12:33.515278572 +0000 UTC m=+42.095086582 container died 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, tcib_managed=true, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Dec 6 04:12:33 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.timer: Deactivated successfully. Dec 6 04:12:33 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d. Dec 6 04:12:33 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Failed to open /run/systemd/transient/4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: No such file or directory Dec 6 04:12:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d-userdata-shm.mount: Deactivated successfully. 
Dec 6 04:12:33 localhost podman[85225]: 2025-12-06 09:12:33.565535892 +0000 UTC m=+42.145343852 container cleanup 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z) Dec 6 04:12:33 localhost podman[85225]: ceilometer_agent_ipmi Dec 6 04:12:33 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.timer: Failed to open /run/systemd/transient/4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.timer: No such file or directory Dec 6 04:12:33 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Failed to open /run/systemd/transient/4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: No such file or directory Dec 6 04:12:33 localhost podman[85423]: 2025-12-06 09:12:33.628159214 +0000 UTC m=+0.105847326 container cleanup 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 6 04:12:33 localhost systemd[1]: libpod-conmon-4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.scope: Deactivated successfully. 
Dec 6 04:12:33 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.timer: Failed to open /run/systemd/transient/4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.timer: No such file or directory Dec 6 04:12:33 localhost systemd[1]: 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: Failed to open /run/systemd/transient/4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d.service: No such file or directory Dec 6 04:12:33 localhost podman[85437]: 2025-12-06 09:12:33.735722603 +0000 UTC m=+0.073521850 container cleanup 4a716413d206bb49737d54cff67f2a7c4382133331c5307f65ba5d0a44df8a5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64) Dec 6 04:12:33 localhost podman[85437]: ceilometer_agent_ipmi Dec 6 04:12:33 localhost systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully. Dec 6 04:12:33 localhost systemd[1]: Stopped ceilometer_agent_ipmi container. 
Dec 6 04:12:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f3:a2:ab MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=46744 SEQ=55634375 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Dec 6 04:12:34 localhost python3.9[85542]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:12:34 localhost systemd[1]: var-lib-containers-storage-overlay-979b7277537d5ac546debf05b8fcf4666622be00eb21163fa54e2b404edc7fc7-merged.mount: Deactivated successfully. Dec 6 04:12:34 localhost systemd[1]: Reloading. Dec 6 04:12:34 localhost systemd-rc-local-generator[85562]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:12:34 localhost systemd-sysv-generator[85565]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:12:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:12:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:12:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 04:12:34 localhost systemd[1]: Stopping collectd container... Dec 6 04:12:34 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 04:12:34 localhost recover_tripleo_nova_virtqemud[85596]: 51836 Dec 6 04:12:34 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Dec 6 04:12:34 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 6 04:12:34 localhost systemd[1]: tmp-crun.5moTby.mount: Deactivated successfully. Dec 6 04:12:34 localhost podman[85583]: 2025-12-06 09:12:34.939919598 +0000 UTC m=+0.064417849 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}) Dec 6 04:12:34 localhost podman[85582]: 2025-12-06 09:12:34.962713981 +0000 UTC m=+0.085104378 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T22:49:32Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Dec 6 04:12:35 localhost podman[85582]: 2025-12-06 09:12:35.053748329 +0000 UTC m=+0.176138716 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, 
release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=logrotate_crond, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc.) Dec 6 04:12:35 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 04:12:35 localhost podman[85583]: 2025-12-06 09:12:35.107284741 +0000 UTC m=+0.231782992 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, container_name=metrics_qdr, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, release=1761123044) Dec 6 04:12:35 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. Dec 6 04:12:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50508 DF PROTO=TCP SPT=60512 DPT=9101 SEQ=4031597247 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9D80C70000000001030307) Dec 6 04:12:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:12:38 localhost systemd[1]: tmp-crun.Uubuv4.mount: Deactivated successfully. 
Dec 6 04:12:38 localhost podman[85641]: 2025-12-06 09:12:38.059309082 +0000 UTC m=+0.091109252 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Dec 6 04:12:38 localhost podman[85641]: 2025-12-06 09:12:38.424392926 +0000 UTC m=+0.456193066 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, build-date=2025-11-19T00:36:58Z) Dec 6 04:12:38 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. 
Dec 6 04:12:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2050 DF PROTO=TCP SPT=57680 DPT=9100 SEQ=646320167 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9D8C070000000001030307) Dec 6 04:12:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:12:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:12:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:12:40 localhost systemd[1]: tmp-crun.Pez6Ei.mount: Deactivated successfully. Dec 6 04:12:40 localhost podman[85665]: 2025-12-06 09:12:40.315119352 +0000 UTC m=+0.101666917 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, 
release=1761123044, version=17.1.12, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, vcs-type=git) Dec 6 04:12:40 localhost podman[85665]: 2025-12-06 09:12:40.326835354 +0000 UTC m=+0.113382909 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team) Dec 6 04:12:40 localhost podman[85665]: unhealthy Dec 6 04:12:40 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:12:40 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 
'exit-code'. Dec 6 04:12:40 localhost podman[85679]: Error: container 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea is not running Dec 6 04:12:40 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Main process exited, code=exited, status=125/n/a Dec 6 04:12:40 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Failed with result 'exit-code'. Dec 6 04:12:40 localhost podman[85682]: 2025-12-06 09:12:40.462653354 +0000 UTC m=+0.139273418 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, architecture=x86_64, version=17.1.12, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:12:40 localhost podman[85682]: 2025-12-06 09:12:40.503268818 +0000 UTC m=+0.179888822 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
release=1761123044, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:12:40 localhost podman[85682]: unhealthy Dec 6 04:12:40 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:12:40 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:12:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:12:44 localhost systemd[1]: tmp-crun.HnoHKb.mount: Deactivated successfully. 
Dec 6 04:12:44 localhost podman[85717]: 2025-12-06 09:12:44.019334951 +0000 UTC m=+0.056987980 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Dec 6 04:12:44 localhost podman[85717]: 2025-12-06 09:12:44.056203518 +0000 UTC m=+0.093856547 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 6 04:12:44 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. 
Dec 6 04:12:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50509 DF PROTO=TCP SPT=60512 DPT=9101 SEQ=4031597247 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9DA1870000000001030307) Dec 6 04:12:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f3:a2:ab MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=46744 SEQ=55634375 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Dec 6 04:12:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:12:47 localhost systemd[1]: tmp-crun.T5sbdK.mount: Deactivated successfully. Dec 6 04:12:47 localhost podman[85737]: 2025-12-06 09:12:47.546732025 +0000 UTC m=+0.079980109 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 04:12:47 localhost podman[85737]: 2025-12-06 09:12:47.567342371 +0000 UTC m=+0.100590455 container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step5) Dec 6 04:12:47 localhost podman[85737]: unhealthy Dec 6 04:12:47 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:12:47 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Failed with result 'exit-code'. 
Dec 6 04:12:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4907 DF PROTO=TCP SPT=47702 DPT=9105 SEQ=3958864783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9DAF870000000001030307) Dec 6 04:12:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20499 DF PROTO=TCP SPT=36786 DPT=9882 SEQ=698562606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9DBF3E0000000001030307) Dec 6 04:12:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63340 DF PROTO=TCP SPT=51568 DPT=9102 SEQ=459381356 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9DC9870000000001030307) Dec 6 04:13:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27472 DF PROTO=TCP SPT=38688 DPT=9101 SEQ=1907453163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9DDA470000000001030307) Dec 6 04:13:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27473 DF PROTO=TCP SPT=38688 DPT=9101 SEQ=1907453163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9DDE470000000001030307) Dec 6 04:13:01 localhost sshd[85760]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:13:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2052 DF PROTO=TCP SPT=57680 DPT=9100 SEQ=646320167 
ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9DEB880000000001030307) Dec 6 04:13:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da. Dec 6 04:13:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337. Dec 6 04:13:05 localhost podman[85762]: 2025-12-06 09:13:05.307580048 +0000 UTC m=+0.092028301 container health_status 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, release=1761123044) Dec 6 04:13:05 localhost podman[85762]: 2025-12-06 09:13:05.341524415 +0000 UTC m=+0.125972608 container exec_died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, distribution-scope=public, tcib_managed=true, architecture=x86_64, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 6 04:13:05 localhost podman[85763]: 2025-12-06 09:13:05.34525208 +0000 UTC m=+0.126317378 container health_status f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, container_name=metrics_qdr, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044) Dec 6 04:13:05 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Deactivated successfully. Dec 6 04:13:05 localhost podman[85763]: 2025-12-06 09:13:05.524018066 +0000 UTC m=+0.305083344 container exec_died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=metrics_qdr, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 6 04:13:05 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Deactivated successfully. 
Dec 6 04:13:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25486 DF PROTO=TCP SPT=34554 DPT=9100 SEQ=3926837129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9DF5870000000001030307) Dec 6 04:13:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:13:08 localhost podman[85811]: 2025-12-06 09:13:08.541645011 +0000 UTC m=+0.074883282 container health_status d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Dec 6 04:13:08 localhost podman[85811]: 2025-12-06 09:13:08.892006341 +0000 UTC m=+0.425244542 container exec_died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Dec 6 04:13:08 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Deactivated successfully. 
Dec 6 04:13:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47962 DF PROTO=TCP SPT=37604 DPT=9100 SEQ=3870613639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E01470000000001030307) Dec 6 04:13:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:13:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:13:10 localhost podman[85834]: Error: container 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea is not running Dec 6 04:13:10 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Main process exited, code=exited, status=125/n/a Dec 6 04:13:10 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Failed with result 'exit-code'. Dec 6 04:13:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. 
Dec 6 04:13:10 localhost podman[85856]: 2025-12-06 09:13:10.620520792 +0000 UTC m=+0.069372501 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com) Dec 6 04:13:10 localhost podman[85835]: 2025-12-06 09:13:10.598991608 +0000 UTC m=+0.126358819 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, 
name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, version=17.1.12, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=) Dec 6 04:13:10 localhost podman[85856]: 2025-12-06 09:13:10.657823104 +0000 UTC m=+0.106674753 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:13:10 localhost podman[85856]: unhealthy Dec 6 04:13:10 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:13:10 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:13:10 localhost podman[85835]: 2025-12-06 09:13:10.679045598 +0000 UTC m=+0.206412809 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4) Dec 6 04:13:10 localhost podman[85835]: unhealthy Dec 6 04:13:10 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:13:10 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:13:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:13:14 localhost systemd[1]: tmp-crun.ppFHfR.mount: Deactivated successfully. 
Dec 6 04:13:14 localhost podman[85886]: 2025-12-06 09:13:14.296083707 +0000 UTC m=+0.079670959 container health_status 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, container_name=iscsid, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 6 04:13:14 localhost podman[85886]: 2025-12-06 09:13:14.305291102 +0000 UTC m=+0.088878314 container exec_died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, config_id=tripleo_step3, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:13:14 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Deactivated successfully. 
Dec 6 04:13:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27476 DF PROTO=TCP SPT=38688 DPT=9101 SEQ=1907453163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E15870000000001030307) Dec 6 04:13:17 localhost podman[85584]: time="2025-12-06T09:13:17Z" level=warning msg="StopSignal SIGTERM failed to stop container collectd in 42 seconds, resorting to SIGKILL" Dec 6 04:13:17 localhost podman[85584]: 2025-12-06 09:13:17.048524011 +0000 UTC m=+42.168822533 container stop 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, container_name=collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 6 04:13:17 localhost systemd[1]: libpod-01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.scope: Deactivated successfully. Dec 6 04:13:17 localhost systemd[1]: libpod-01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.scope: Consumed 1.915s CPU time. 
Dec 6 04:13:17 localhost podman[85584]: 2025-12-06 09:13:17.078208536 +0000 UTC m=+42.198507058 container died 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, release=1761123044, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-collectd, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 6 04:13:17 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.timer: Deactivated successfully. Dec 6 04:13:17 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea. Dec 6 04:13:17 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Failed to open /run/systemd/transient/01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: No such file or directory Dec 6 04:13:17 localhost systemd[1]: tmp-crun.SKaxO2.mount: Deactivated successfully. 
Dec 6 04:13:17 localhost podman[85584]: 2025-12-06 09:13:17.136103773 +0000 UTC m=+42.256402305 container cleanup 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, container_name=collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 6 04:13:17 localhost podman[85584]: collectd Dec 6 04:13:17 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.timer: Failed to open /run/systemd/transient/01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.timer: No such file or directory Dec 6 04:13:17 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Failed to open /run/systemd/transient/01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: No such file or directory Dec 6 04:13:17 localhost podman[85906]: 2025-12-06 09:13:17.159374891 +0000 UTC m=+0.097108257 container cleanup 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, 
Inc., build-date=2025-11-18T22:51:28Z, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team) Dec 6 04:13:17 localhost systemd[1]: libpod-conmon-01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.scope: Deactivated successfully. Dec 6 04:13:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12183 DF PROTO=TCP SPT=47890 DPT=9105 SEQ=2590975817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E1CC70000000001030307) Dec 6 04:13:17 localhost podman[85939]: error opening file `/run/crun/01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea/status`: No such file or directory Dec 6 04:13:17 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.timer: Failed to open /run/systemd/transient/01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.timer: No such file or directory Dec 6 04:13:17 localhost systemd[1]: 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: Failed to open /run/systemd/transient/01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea.service: No such file or directory Dec 6 04:13:17 localhost podman[85927]: 2025-12-06 09:13:17.268320342 +0000 UTC m=+0.075794009 container cleanup 01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, container_name=collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '5923e560c9d95c3eb077adacead52760'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, version=17.1.12, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1) Dec 6 04:13:17 localhost podman[85927]: collectd Dec 6 04:13:17 localhost systemd[1]: tripleo_collectd.service: Deactivated successfully. Dec 6 04:13:17 localhost systemd[1]: Stopped collectd container. Dec 6 04:13:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:13:17 localhost podman[86032]: 2025-12-06 09:13:17.816710202 +0000 UTC m=+0.086227491 container health_status 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, distribution-scope=public, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.buildah.version=1.41.4, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 6 04:13:17 localhost podman[86032]: 2025-12-06 09:13:17.838639749 +0000 UTC m=+0.108157008 
container exec_died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:13:17 localhost podman[86032]: unhealthy Dec 6 04:13:17 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:13:17 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Failed with result 'exit-code'. Dec 6 04:13:18 localhost python3.9[86033]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:13:18 localhost systemd[1]: var-lib-containers-storage-overlay-913fb92e7d358376526afb98428ee303b126d5d41e5eaa0788a4d91b167c9322-merged.mount: Deactivated successfully. 
Dec 6 04:13:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01bf87b79239efc00dde9a901efd59a3a24123fc30aeb45370e1ba71c0c51cea-userdata-shm.mount: Deactivated successfully. Dec 6 04:13:18 localhost systemd[1]: Reloading. Dec 6 04:13:18 localhost systemd-rc-local-generator[86082]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:13:18 localhost systemd-sysv-generator[86085]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:13:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:13:18 localhost systemd[1]: Stopping iscsid container... Dec 6 04:13:18 localhost systemd[1]: libpod-3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.scope: Deactivated successfully. Dec 6 04:13:18 localhost systemd[1]: libpod-3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.scope: Consumed 1.040s CPU time. 
Dec 6 04:13:18 localhost podman[86094]: 2025-12-06 09:13:18.481562485 +0000 UTC m=+0.072823128 container died 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.buildah.version=1.41.4, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc.) Dec 6 04:13:18 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.timer: Deactivated successfully. Dec 6 04:13:18 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3. Dec 6 04:13:18 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Failed to open /run/systemd/transient/3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: No such file or directory Dec 6 04:13:18 localhost podman[86094]: 2025-12-06 09:13:18.529060621 +0000 UTC m=+0.120321224 container cleanup 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, config_id=tripleo_step3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:13:18 localhost podman[86094]: iscsid Dec 6 04:13:18 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.timer: Failed to open 
/run/systemd/transient/3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.timer: No such file or directory Dec 6 04:13:18 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Failed to open /run/systemd/transient/3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: No such file or directory Dec 6 04:13:18 localhost podman[86106]: 2025-12-06 09:13:18.554558387 +0000 UTC m=+0.064998256 container cleanup 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, version=17.1.12, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public) Dec 6 04:13:18 localhost systemd[1]: libpod-conmon-3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.scope: Deactivated successfully. 
Dec 6 04:13:18 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.timer: Failed to open /run/systemd/transient/3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.timer: No such file or directory Dec 6 04:13:18 localhost systemd[1]: 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: Failed to open /run/systemd/transient/3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3.service: No such file or directory Dec 6 04:13:18 localhost podman[86121]: 2025-12-06 09:13:18.656603066 +0000 UTC m=+0.067219195 container cleanup 3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc.) Dec 6 04:13:18 localhost podman[86121]: iscsid Dec 6 04:13:18 localhost systemd[1]: tripleo_iscsid.service: Deactivated successfully. Dec 6 04:13:18 localhost systemd[1]: Stopped iscsid container. Dec 6 04:13:19 localhost systemd[1]: var-lib-containers-storage-overlay-7b0fc934b7e8ebe947ae2f5b8850ae682bc514513297f85ec62276c836d92b6b-merged.mount: Deactivated successfully. Dec 6 04:13:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a5fe5f34e4df67d6f922d124db6ddf07c9e8807d74c90339b0ada1fcb84b8f3-userdata-shm.mount: Deactivated successfully. 
Dec 6 04:13:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12184 DF PROTO=TCP SPT=47890 DPT=9105 SEQ=2590975817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E24C70000000001030307) Dec 6 04:13:19 localhost python3.9[86224]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:13:19 localhost systemd[1]: Reloading. Dec 6 04:13:19 localhost systemd-sysv-generator[86256]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:13:19 localhost systemd-rc-local-generator[86251]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:13:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:13:19 localhost systemd[1]: Stopping logrotate_crond container... Dec 6 04:13:19 localhost systemd[1]: libpod-23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.scope: Deactivated successfully. 
Dec 6 04:13:19 localhost podman[86265]: 2025-12-06 09:13:19.822621882 +0000 UTC m=+0.074651104 container died 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 6 04:13:19 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.timer: Deactivated successfully.
Dec 6 04:13:19 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.
Dec 6 04:13:19 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Failed to open /run/systemd/transient/23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: No such file or directory
Dec 6 04:13:19 localhost podman[86265]: 2025-12-06 09:13:19.885648577 +0000 UTC m=+0.137677769 container cleanup 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test':
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 6 04:13:19 localhost podman[86265]: logrotate_crond
Dec 6 04:13:19 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.timer: Failed to open /run/systemd/transient/23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.timer: No such file or directory
Dec 6 04:13:19 localhost systemd[1]:
23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Failed to open /run/systemd/transient/23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: No such file or directory
Dec 6 04:13:19 localhost podman[86279]: 2025-12-06 09:13:19.96125611 +0000 UTC m=+0.127033601 container cleanup 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.12, name=rhosp17/openstack-cron, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro',
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com)
Dec 6 04:13:19 localhost systemd[1]: libpod-conmon-23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.scope: Deactivated successfully.
Dec 6 04:13:20 localhost systemd[1]: tmp-crun.EO7Qgx.mount: Deactivated successfully.
Dec 6 04:13:20 localhost systemd[1]: var-lib-containers-storage-overlay-6f828ad68b1c8f42bc27f6cd0d2c93f980f96fb70dac87338f926a291c2167ed-merged.mount: Deactivated successfully.
Dec 6 04:13:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da-userdata-shm.mount: Deactivated successfully.
Dec 6 04:13:20 localhost podman[86304]: error opening file `/run/crun/23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da/status`: No such file or directory
Dec 6 04:13:20 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.timer: Failed to open /run/systemd/transient/23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.timer: No such file or directory
Dec 6 04:13:20 localhost systemd[1]: 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: Failed to open /run/systemd/transient/23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da.service: No such file or directory
Dec 6 04:13:20 localhost podman[86294]: 2025-12-06 09:13:20.072684507 +0000 UTC m=+0.085166418 container cleanup 23930dbe9ed27afba8bbe8df284ef5b876771b65aebc3ab9316d1d03addbc0da (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro',
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=logrotate_crond, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true)
Dec 6 04:13:20 localhost podman[86294]: logrotate_crond
Dec 6 04:13:20 localhost systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully.
Dec 6 04:13:20 localhost systemd[1]: Stopped logrotate_crond container.
Dec 6 04:13:20 localhost python3.9[86397]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 04:13:20 localhost systemd[1]: Reloading.
Dec 6 04:13:20 localhost systemd-rc-local-generator[86422]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:13:20 localhost systemd-sysv-generator[86426]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file.
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:13:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:13:21 localhost systemd[1]: Stopping metrics_qdr container...
Dec 6 04:13:21 localhost systemd[1]: tmp-crun.oiD91W.mount: Deactivated successfully.
Dec 6 04:13:21 localhost kernel: qdrouterd[46047]: segfault at 0 ip 00007f9c2d3967cb sp 00007ffc259bdf50 error 4 in libc.so.6[7f9c2d333000+175000]
Dec 6 04:13:21 localhost kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9
Dec 6 04:13:21 localhost systemd[1]: Created slice Slice /system/systemd-coredump.
Dec 6 04:13:21 localhost systemd[1]: Started Process Core Dump (PID 86452/UID 0).
Dec 6 04:13:21 localhost systemd-coredump[86453]: Resource limits disable core dumping for process 46047 (qdrouterd).
Dec 6 04:13:21 localhost systemd-coredump[86453]: Process 46047 (qdrouterd) of user 42465 dumped core.
Dec 6 04:13:21 localhost systemd[1]: systemd-coredump@0-86452-0.service: Deactivated successfully.
Dec 6 04:13:21 localhost systemd-journald[38691]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 82.3 (274 of 333 items), suggesting rotation.
Dec 6 04:13:21 localhost systemd-journald[38691]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 6 04:13:21 localhost rsyslogd[759]: imjournal: journal files changed, reloading...
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 6 04:13:21 localhost systemd[1]: libpod-f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.scope: Deactivated successfully.
Dec 6 04:13:21 localhost systemd[1]: libpod-f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.scope: Consumed 25.986s CPU time.
Dec 6 04:13:21 localhost podman[86438]: 2025-12-06 09:13:21.435745294 +0000 UTC m=+0.222104684 container died f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro',
'/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=metrics_qdr)
Dec 6 04:13:21 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.timer: Deactivated successfully.
Dec 6 04:13:21 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.
Dec 6 04:13:21 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Failed to open /run/systemd/transient/f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: No such file or directory
Dec 6 04:13:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337-userdata-shm.mount: Deactivated successfully.
Dec 6 04:13:21 localhost podman[86438]: 2025-12-06 09:13:21.488415789 +0000 UTC m=+0.274775119 container cleanup f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1)
Dec 6 04:13:21 localhost podman[86438]: metrics_qdr
Dec 6 04:13:21 localhost rsyslogd[759]: imjournal: journal files changed, reloading...
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 6 04:13:21 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.timer: Failed to open /run/systemd/transient/f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.timer: No such file or directory
Dec 6 04:13:21 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Failed to open /run/systemd/transient/f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: No such file or directory
Dec 6 04:13:21 localhost podman[86457]: 2025-12-06 09:13:21.517914248 +0000 UTC m=+0.076408388 container cleanup f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z',
'/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible)
Dec 6 04:13:21 localhost systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Dec 6 04:13:21 localhost systemd[1]: libpod-conmon-f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.scope: Deactivated successfully.
Dec 6 04:13:21 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.timer: Failed to open /run/systemd/transient/f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.timer: No such file or directory
Dec 6 04:13:21 localhost systemd[1]: f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: Failed to open /run/systemd/transient/f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337.service: No such file or directory
Dec 6 04:13:21 localhost podman[86475]: 2025-12-06 09:13:21.612777215 +0000 UTC m=+0.067776111 container cleanup f485a453ecc7dd98d81a93c96a28f0c839eb96952e3a0e6ba296b7c3e525d337 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1d175b5c6581de7cf9d966b234ba0e8a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro',
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z', '/etc/pki/tls/certs/metrics_qdr.crt:/var/lib/kolla/config_files/src-tls/etc/pki/tls/certs/metrics_qdr.crt:ro', '/etc/pki/tls/private/metrics_qdr.key:/var/lib/kolla/config_files/src-tls/etc/pki/tls/private/metrics_qdr.key:ro']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Dec 6 04:13:21 localhost podman[86475]: metrics_qdr
Dec 6 04:13:21 localhost systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Dec 6 04:13:21 localhost systemd[1]: Stopped metrics_qdr container.
Dec 6 04:13:22 localhost systemd[1]: var-lib-containers-storage-overlay-e783b457a0dbabf469167b8c7c7fc00f0087efa2180c519aab4a9fcb73c3a343-merged.mount: Deactivated successfully.
Dec 6 04:13:22 localhost python3.9[86579]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 04:13:23 localhost python3.9[86672]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 04:13:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40696 DF PROTO=TCP SPT=60814 DPT=9882 SEQ=100992451 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E346E0000000001030307)
Dec 6 04:13:23 localhost python3.9[86765]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 04:13:24 localhost python3.9[86858]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 04:13:24 localhost systemd[1]: Reloading.
Dec 6 04:13:24 localhost systemd-rc-local-generator[86883]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:13:24 localhost systemd-sysv-generator[86889]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:13:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:13:24 localhost systemd[1]: Stopping nova_compute container...
Dec 6 04:13:25 localhost systemd[1]: libpod-3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.scope: Deactivated successfully.
Dec 6 04:13:25 localhost systemd[1]: libpod-3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.scope: Consumed 21.855s CPU time.
Dec 6 04:13:25 localhost systemd[1]: session-c12.scope: Deactivated successfully.
Dec 6 04:13:25 localhost systemd[1]: session-c11.scope: Deactivated successfully.
Dec 6 04:13:25 localhost podman[86899]: 2025-12-06 09:13:25.220353563 +0000 UTC m=+0.494467507 container died 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, config_id=tripleo_step5, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro',
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute)
Dec 6 04:13:25 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.timer: Deactivated successfully.
Dec 6 04:13:25 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6. Dec 6 04:13:25 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Failed to open /run/systemd/transient/3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: No such file or directory Dec 6 04:13:25 localhost systemd[1]: tmp-crun.t9gCuM.mount: Deactivated successfully. Dec 6 04:13:25 localhost systemd[1]: var-lib-containers-storage-overlay-ac17782b0a22d6b404309e1e48271f7473fb3586a06782b8c916f0bf1d9c0c8d-merged.mount: Deactivated successfully. Dec 6 04:13:25 localhost podman[86899]: 2025-12-06 09:13:25.28767088 +0000 UTC m=+0.561784714 container cleanup 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:13:25 localhost podman[86899]: nova_compute Dec 6 04:13:25 
localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.timer: Failed to open /run/systemd/transient/3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.timer: No such file or directory Dec 6 04:13:25 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Failed to open /run/systemd/transient/3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: No such file or directory Dec 6 04:13:25 localhost podman[86913]: 2025-12-06 09:13:25.354594065 +0000 UTC m=+0.118051743 container cleanup 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute) Dec 6 04:13:25 localhost systemd[1]: libpod-conmon-3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.scope: Deactivated successfully. 
Dec 6 04:13:25 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.timer: Failed to open /run/systemd/transient/3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.timer: No such file or directory Dec 6 04:13:25 localhost systemd[1]: 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: Failed to open /run/systemd/transient/3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6.service: No such file or directory Dec 6 04:13:25 localhost podman[86928]: 2025-12-06 09:13:25.458362907 +0000 UTC m=+0.067861475 container cleanup 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com) Dec 6 04:13:25 localhost podman[86928]: nova_compute Dec 6 04:13:25 localhost systemd[1]: tripleo_nova_compute.service: Deactivated successfully. Dec 6 04:13:25 localhost systemd[1]: Stopped nova_compute container. 
Dec 6 04:13:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43775 DF PROTO=TCP SPT=49680 DPT=9102 SEQ=2280541628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E3D870000000001030307) Dec 6 04:13:26 localhost python3.9[87030]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:13:27 localhost systemd[1]: Reloading. Dec 6 04:13:27 localhost systemd-sysv-generator[87062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:13:27 localhost systemd-rc-local-generator[87059]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:13:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:13:27 localhost systemd[1]: Stopping nova_migration_target container... Dec 6 04:13:27 localhost systemd[1]: libpod-d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.scope: Deactivated successfully. Dec 6 04:13:27 localhost systemd[1]: libpod-d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.scope: Consumed 32.286s CPU time. 
Dec 6 04:13:27 localhost podman[87071]: 2025-12-06 09:13:27.753664705 +0000 UTC m=+0.099199621 container died d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, 
com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, distribution-scope=public, io.openshift.expose-services=) Dec 6 04:13:27 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.timer: Deactivated successfully. Dec 6 04:13:27 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20. Dec 6 04:13:27 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Failed to open /run/systemd/transient/d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: No such file or directory Dec 6 04:13:27 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20-userdata-shm.mount: Deactivated successfully. Dec 6 04:13:27 localhost systemd[1]: var-lib-containers-storage-overlay-d969762fb0332c5c36abf270d6236af27b60ab1864e2e74a696d11379e3dcdcb-merged.mount: Deactivated successfully. 
Dec 6 04:13:27 localhost podman[87071]: 2025-12-06 09:13:27.814448991 +0000 UTC m=+0.159983907 container cleanup d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=nova_migration_target, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=) Dec 6 04:13:27 localhost podman[87071]: nova_migration_target Dec 6 04:13:27 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.timer: Failed to open /run/systemd/transient/d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.timer: No such file or directory Dec 6 04:13:27 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Failed to open /run/systemd/transient/d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: No such file or directory Dec 6 04:13:27 localhost podman[87086]: 2025-12-06 09:13:27.881693526 +0000 UTC m=+0.116587399 container cleanup d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, release=1761123044, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 6 04:13:27 localhost systemd[1]: 
libpod-conmon-d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.scope: Deactivated successfully. Dec 6 04:13:27 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.timer: Failed to open /run/systemd/transient/d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.timer: No such file or directory Dec 6 04:13:27 localhost systemd[1]: d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: Failed to open /run/systemd/transient/d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20.service: No such file or directory Dec 6 04:13:27 localhost podman[87099]: 2025-12-06 09:13:27.985606112 +0000 UTC m=+0.068877457 container cleanup d055b0c4cc37216944edcc1843998037126bb6f5a4ff14a90f47b609ba0f2d20 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, distribution-scope=public, container_name=nova_migration_target, url=https://www.redhat.com, io.openshift.expose-services=) Dec 6 04:13:27 localhost podman[87099]: nova_migration_target Dec 6 04:13:27 localhost systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully. Dec 6 04:13:27 localhost systemd[1]: Stopped nova_migration_target container. 
Dec 6 04:13:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63341 DF PROTO=TCP SPT=51568 DPT=9102 SEQ=459381356 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E47880000000001030307) Dec 6 04:13:28 localhost python3.9[87204]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:13:28 localhost systemd[1]: Reloading. Dec 6 04:13:28 localhost systemd-sysv-generator[87234]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:13:28 localhost systemd-rc-local-generator[87229]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:13:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:13:29 localhost systemd[1]: Stopping nova_virtlogd_wrapper container... Dec 6 04:13:29 localhost systemd[1]: libpod-8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3.scope: Deactivated successfully. 
Dec 6 04:13:29 localhost podman[87245]: 2025-12-06 09:13:29.243830732 +0000 UTC m=+0.070932279 container died 8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 6 04:13:29 localhost podman[87245]: 2025-12-06 09:13:29.277504141 +0000 UTC m=+0.104605698 container cleanup 8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, build-date=2025-11-19T00:35:22Z, container_name=nova_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Dec 6 04:13:29 localhost podman[87245]: nova_virtlogd_wrapper Dec 6 04:13:29 localhost podman[87258]: 2025-12-06 09:13:29.322026125 +0000 UTC m=+0.068225526 container cleanup 8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtlogd_wrapper, release=1761123044, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, url=https://www.redhat.com) Dec 6 04:13:30 localhost systemd[1]: var-lib-containers-storage-overlay-9161727dcb4f67fce7b939133d0105fd4670b62d026b503696c1aa11636dba26-merged.mount: Deactivated successfully. Dec 6 04:13:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3-userdata-shm.mount: Deactivated successfully. Dec 6 04:13:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16929 DF PROTO=TCP SPT=35512 DPT=9101 SEQ=925659039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E53870000000001030307) Dec 6 04:13:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50511 DF PROTO=TCP SPT=60512 DPT=9101 SEQ=4031597247 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E5F880000000001030307) Dec 6 04:13:35 localhost systemd[1]: Stopping User Manager for UID 0... Dec 6 04:13:35 localhost systemd[68170]: Activating special unit Exit the Session... Dec 6 04:13:35 localhost systemd[68170]: Removed slice User Background Tasks Slice. 
Dec 6 04:13:35 localhost systemd[68170]: Stopped target Main User Target. Dec 6 04:13:35 localhost systemd[68170]: Stopped target Basic System. Dec 6 04:13:35 localhost systemd[68170]: Stopped target Paths. Dec 6 04:13:35 localhost systemd[68170]: Stopped target Sockets. Dec 6 04:13:35 localhost systemd[68170]: Stopped target Timers. Dec 6 04:13:35 localhost systemd[68170]: Stopped Daily Cleanup of User's Temporary Directories. Dec 6 04:13:35 localhost systemd[68170]: Closed D-Bus User Message Bus Socket. Dec 6 04:13:35 localhost systemd[68170]: Stopped Create User's Volatile Files and Directories. Dec 6 04:13:35 localhost systemd[68170]: Removed slice User Application Slice. Dec 6 04:13:35 localhost systemd[68170]: Reached target Shutdown. Dec 6 04:13:35 localhost systemd[68170]: Finished Exit the Session. Dec 6 04:13:35 localhost systemd[68170]: Reached target Exit the Session. Dec 6 04:13:35 localhost systemd[1]: user@0.service: Deactivated successfully. Dec 6 04:13:35 localhost systemd[1]: Stopped User Manager for UID 0. Dec 6 04:13:35 localhost systemd[1]: user@0.service: Consumed 4.062s CPU time, no IO. Dec 6 04:13:35 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Dec 6 04:13:35 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Dec 6 04:13:35 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Dec 6 04:13:35 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Dec 6 04:13:35 localhost systemd[1]: Removed slice User Slice of UID 0. Dec 6 04:13:35 localhost systemd[1]: user-0.slice: Consumed 5.851s CPU time. 
Dec 6 04:13:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16931 DF PROTO=TCP SPT=35512 DPT=9101 SEQ=925659039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E6B470000000001030307) Dec 6 04:13:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=109 DF PROTO=TCP SPT=57462 DPT=9100 SEQ=1298533895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E76870000000001030307) Dec 6 04:13:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:13:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:13:40 localhost podman[87278]: 2025-12-06 09:13:40.804587666 +0000 UTC m=+0.078661698 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 6 04:13:40 localhost podman[87278]: 2025-12-06 09:13:40.849727189 +0000 UTC m=+0.123801201 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}) Dec 6 04:13:40 localhost podman[87278]: unhealthy Dec 6 04:13:40 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:13:40 localhost systemd[1]: 
2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:13:40 localhost podman[87277]: 2025-12-06 09:13:40.851920187 +0000 UTC m=+0.128766814 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1) Dec 6 04:13:40 localhost podman[87277]: 2025-12-06 09:13:40.931659816 +0000 UTC m=+0.208506443 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044) Dec 6 04:13:40 localhost podman[87277]: unhealthy Dec 6 04:13:40 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:13:40 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:13:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16932 DF PROTO=TCP SPT=35512 DPT=9101 SEQ=925659039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E8B880000000001030307) Dec 6 04:13:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28214 DF PROTO=TCP SPT=55588 DPT=9105 SEQ=2171689561 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E91C70000000001030307) Dec 6 04:13:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28215 DF PROTO=TCP SPT=55588 DPT=9105 SEQ=2171689561 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9E99C70000000001030307) Dec 6 04:13:53 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28216 DF PROTO=TCP SPT=55588 DPT=9105 SEQ=2171689561 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9EA9870000000001030307) Dec 6 04:13:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40701 DF PROTO=TCP SPT=60814 DPT=9882 SEQ=100992451 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9EB1870000000001030307) Dec 6 04:14:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17103 DF PROTO=TCP SPT=60970 DPT=9101 SEQ=1415226201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9EC4A70000000001030307) Dec 6 04:14:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17104 DF PROTO=TCP SPT=60970 DPT=9101 SEQ=1415226201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9EC8C70000000001030307) Dec 6 04:14:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=111 DF PROTO=TCP SPT=57462 DPT=9100 SEQ=1298533895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9ED7870000000001030307) Dec 6 04:14:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17106 DF PROTO=TCP SPT=60970 DPT=9101 SEQ=1415226201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9EE0870000000001030307) Dec 6 04:14:10 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32357 DF PROTO=TCP SPT=34322 DPT=9100 SEQ=1679559576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9EEBC80000000001030307) Dec 6 04:14:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:14:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:14:11 localhost podman[87319]: 2025-12-06 09:14:11.062029043 +0000 UTC m=+0.088147891 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:14:11 localhost podman[87320]: 2025-12-06 09:14:11.100670106 +0000 UTC m=+0.123181072 container health_status 
2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com) Dec 6 04:14:11 localhost podman[87319]: 2025-12-06 09:14:11.108343452 +0000 UTC m=+0.134462250 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent) Dec 6 04:14:11 localhost podman[87319]: unhealthy Dec 6 04:14:11 localhost podman[87320]: 2025-12-06 09:14:11.116588126 +0000 UTC m=+0.139099112 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, container_name=ovn_controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.buildah.version=1.41.4) Dec 6 04:14:11 localhost systemd[1]: 
2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:14:11 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:14:11 localhost podman[87320]: unhealthy Dec 6 04:14:11 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:14:11 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:14:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17107 DF PROTO=TCP SPT=60970 DPT=9101 SEQ=1415226201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F01870000000001030307) Dec 6 04:14:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14649 DF PROTO=TCP SPT=60126 DPT=9105 SEQ=739403521 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F07070000000001030307) Dec 6 04:14:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14650 DF PROTO=TCP SPT=60126 DPT=9105 SEQ=739403521 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F0F070000000001030307) Dec 6 04:14:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14651 DF PROTO=TCP SPT=60126 DPT=9105 SEQ=739403521 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F1EC70000000001030307) Dec 6 04:14:25 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58810 DF PROTO=TCP SPT=54806 DPT=9102 SEQ=1730519424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F27870000000001030307) Dec 6 04:14:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32626 DF PROTO=TCP SPT=38294 DPT=9101 SEQ=4237242618 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F39D80000000001030307) Dec 6 04:14:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32627 DF PROTO=TCP SPT=38294 DPT=9101 SEQ=4237242618 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F3DC70000000001030307) Dec 6 04:14:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16934 DF PROTO=TCP SPT=35512 DPT=9101 SEQ=925659039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F49880000000001030307) Dec 6 04:14:34 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 6 04:14:34 localhost recover_tripleo_nova_virtqemud[87360]: 51836 Dec 6 04:14:34 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 6 04:14:34 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 6 04:14:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32629 DF PROTO=TCP SPT=38294 DPT=9101 SEQ=4237242618 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F55870000000001030307) Dec 6 04:14:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44644 DF PROTO=TCP SPT=49430 DPT=9100 SEQ=2551187359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F60C80000000001030307) Dec 6 04:14:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:14:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:14:41 localhost systemd[1]: tmp-crun.gI5QPq.mount: Deactivated successfully. 
Dec 6 04:14:41 localhost podman[87361]: 2025-12-06 09:14:41.550590779 +0000 UTC m=+0.083264330 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, version=17.1.12, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 6 04:14:41 localhost podman[87361]: 2025-12-06 09:14:41.568701249 +0000 UTC m=+0.101374800 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z) Dec 6 04:14:41 localhost podman[87361]: unhealthy Dec 6 04:14:41 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:14:41 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:14:41 localhost systemd[1]: tmp-crun.q05n3E.mount: Deactivated successfully. Dec 6 04:14:41 localhost podman[87362]: 2025-12-06 09:14:41.653871636 +0000 UTC m=+0.183572225 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Dec 6 04:14:41 localhost podman[87362]: 2025-12-06 09:14:41.670139498 +0000 UTC m=+0.199840067 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, config_id=tripleo_step4, 
distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 6 04:14:41 localhost podman[87362]: unhealthy Dec 6 04:14:41 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:14:41 localhost systemd[1]: 
2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:14:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32630 DF PROTO=TCP SPT=38294 DPT=9101 SEQ=4237242618 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F75880000000001030307) Dec 6 04:14:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21434 DF PROTO=TCP SPT=60168 DPT=9105 SEQ=3134827760 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F7C470000000001030307) Dec 6 04:14:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21435 DF PROTO=TCP SPT=60168 DPT=9105 SEQ=3134827760 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F84470000000001030307) Dec 6 04:14:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61891 DF PROTO=TCP SPT=60886 DPT=9882 SEQ=3516326409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F93FE0000000001030307) Dec 6 04:14:53 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing. Dec 6 04:14:53 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 51078 (conmon) with signal SIGKILL. Dec 6 04:14:53 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL Dec 6 04:14:53 localhost systemd[1]: libpod-conmon-8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3.scope: Deactivated successfully. 
Dec 6 04:14:53 localhost podman[87412]: error opening file `/run/crun/8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3/status`: No such file or directory Dec 6 04:14:53 localhost podman[87400]: 2025-12-06 09:14:53.535193599 +0000 UTC m=+0.070310810 container cleanup 8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, container_name=nova_virtlogd_wrapper, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-type=git) Dec 6 04:14:53 localhost podman[87400]: nova_virtlogd_wrapper Dec 6 04:14:53 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'. Dec 6 04:14:53 localhost systemd[1]: Stopped nova_virtlogd_wrapper container. 
Dec 6 04:14:54 localhost python3.9[87505]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:14:54 localhost systemd[1]: Reloading. Dec 6 04:14:54 localhost systemd-sysv-generator[87533]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:14:54 localhost systemd-rc-local-generator[87530]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:14:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:14:54 localhost systemd[1]: Stopping nova_virtnodedevd container... Dec 6 04:14:54 localhost systemd[1]: libpod-6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7.scope: Deactivated successfully. Dec 6 04:14:54 localhost systemd[1]: libpod-6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7.scope: Consumed 1.470s CPU time. 
Dec 6 04:14:54 localhost podman[87546]: 2025-12-06 09:14:54.742959384 +0000 UTC m=+0.078555035 container died 6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtnodedevd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt) Dec 6 04:14:54 localhost podman[87546]: 2025-12-06 09:14:54.785058222 +0000 UTC m=+0.120653833 container cleanup 6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtnodedevd, 
version=17.1.12, tcib_managed=true, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 6 04:14:54 localhost podman[87546]: nova_virtnodedevd Dec 6 04:14:54 localhost podman[87561]: 2025-12-06 09:14:54.809678882 +0000 UTC m=+0.055016177 container cleanup 6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.openshift.expose-services=, container_name=nova_virtnodedevd, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:14:54 localhost systemd[1]: libpod-conmon-6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7.scope: Deactivated successfully. Dec 6 04:14:54 localhost podman[87590]: error opening file `/run/crun/6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7/status`: No such file or directory Dec 6 04:14:54 localhost podman[87577]: 2025-12-06 09:14:54.909645377 +0000 UTC m=+0.073908912 container cleanup 6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_virtnodedevd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:14:54 localhost podman[87577]: nova_virtnodedevd Dec 6 04:14:54 localhost systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully. Dec 6 04:14:54 localhost systemd[1]: Stopped nova_virtnodedevd container. Dec 6 04:14:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40140 DF PROTO=TCP SPT=56282 DPT=9882 SEQ=3105249060 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9F9B880000000001030307) Dec 6 04:14:55 localhost python3.9[87683]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:14:55 localhost systemd[1]: Reloading. Dec 6 04:14:55 localhost systemd-sysv-generator[87713]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:14:55 localhost systemd-rc-local-generator[87706]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:14:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:14:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7-userdata-shm.mount: Deactivated successfully. Dec 6 04:14:55 localhost systemd[1]: var-lib-containers-storage-overlay-cc845f2ee7a7ff971f466641845e17e836d3ed05d5e871254c9077d17f4dbfb0-merged.mount: Deactivated successfully. Dec 6 04:14:56 localhost systemd[1]: Stopping nova_virtproxyd container... Dec 6 04:14:56 localhost systemd[1]: libpod-15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c.scope: Deactivated successfully. Dec 6 04:14:56 localhost podman[87723]: 2025-12-06 09:14:56.124260832 +0000 UTC m=+0.081256788 container died 15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, url=https://www.redhat.com, distribution-scope=public, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, container_name=nova_virtproxyd, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1) Dec 6 04:14:56 localhost podman[87723]: 2025-12-06 09:14:56.165565827 +0000 UTC m=+0.122561763 container cleanup 15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtproxyd, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 
'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.12, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible) Dec 6 04:14:56 localhost podman[87723]: nova_virtproxyd Dec 6 04:14:56 localhost podman[87737]: 2025-12-06 09:14:56.204660523 +0000 UTC m=+0.069977670 container cleanup 15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, container_name=nova_virtproxyd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Dec 6 04:14:56 localhost systemd[1]: libpod-conmon-15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c.scope: Deactivated successfully. 
Dec 6 04:14:56 localhost podman[87766]: error opening file `/run/crun/15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c/status`: No such file or directory Dec 6 04:14:56 localhost podman[87755]: 2025-12-06 09:14:56.308334292 +0000 UTC m=+0.069538377 container cleanup 15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtproxyd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 6 04:14:56 localhost podman[87755]: nova_virtproxyd Dec 6 04:14:56 localhost systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully. Dec 6 04:14:56 localhost systemd[1]: Stopped nova_virtproxyd container. 
Dec 6 04:14:57 localhost systemd[1]: var-lib-containers-storage-overlay-fda45022fd096c249e88499c11ec7e672819ceb78de5334d85a7da4061f3fd35-merged.mount: Deactivated successfully. Dec 6 04:14:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-15fcd6d768c7a0133c53d58a3313883ac231e88089e7d1f512e2d2163c52986c-userdata-shm.mount: Deactivated successfully. Dec 6 04:14:57 localhost python3.9[87861]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:14:57 localhost systemd[1]: Reloading. Dec 6 04:14:57 localhost systemd-sysv-generator[87892]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:14:57 localhost systemd-rc-local-generator[87885]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:14:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:14:57 localhost systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully. Dec 6 04:14:57 localhost systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m. Dec 6 04:14:57 localhost systemd[1]: Stopping nova_virtqemud container... Dec 6 04:14:57 localhost systemd[1]: tmp-crun.0Dw5K3.mount: Deactivated successfully. Dec 6 04:14:57 localhost systemd[1]: libpod-aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b.scope: Deactivated successfully. Dec 6 04:14:57 localhost systemd[1]: libpod-aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b.scope: Consumed 2.839s CPU time. 
Dec 6 04:14:57 localhost podman[87902]: 2025-12-06 09:14:57.478692532 +0000 UTC m=+0.078550395 container died aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=nova_virtqemud, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, config_id=tripleo_step3) Dec 6 04:14:57 localhost podman[87902]: 2025-12-06 09:14:57.50748161 +0000 UTC m=+0.107339503 container cleanup aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, config_data={'cgroupns': 
'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, 
managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, build-date=2025-11-19T00:35:22Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtqemud, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team) Dec 6 04:14:57 localhost podman[87902]: nova_virtqemud Dec 6 04:14:57 localhost podman[87917]: 2025-12-06 09:14:57.557236565 +0000 UTC m=+0.069277139 container cleanup aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_virtqemud, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, release=1761123044, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:14:58 localhost systemd[1]: tmp-crun.7Vi7Hp.mount: Deactivated successfully. Dec 6 04:14:58 localhost systemd[1]: var-lib-containers-storage-overlay-d8fe62f5071a990292beb53ff72998daf48ca62b4a6fc97fe2b3e5d151c0e41e-merged.mount: Deactivated successfully. Dec 6 04:14:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b-userdata-shm.mount: Deactivated successfully. 
Dec 6 04:14:58 localhost sshd[87933]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:15:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37315 DF PROTO=TCP SPT=54904 DPT=9101 SEQ=547390761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9FAF070000000001030307) Dec 6 04:15:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37316 DF PROTO=TCP SPT=54904 DPT=9101 SEQ=547390761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9FB3070000000001030307) Dec 6 04:15:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17109 DF PROTO=TCP SPT=60970 DPT=9101 SEQ=1415226201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9FBF870000000001030307) Dec 6 04:15:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37318 DF PROTO=TCP SPT=54904 DPT=9101 SEQ=547390761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9FCAC70000000001030307) Dec 6 04:15:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31349 DF PROTO=TCP SPT=42280 DPT=9100 SEQ=2544864788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9FD6070000000001030307) Dec 6 04:15:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. 
Dec 6 04:15:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:15:12 localhost systemd[1]: tmp-crun.EfZBXT.mount: Deactivated successfully. Dec 6 04:15:12 localhost podman[87936]: 2025-12-06 09:15:12.048696952 +0000 UTC m=+0.079324838 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:15:12 localhost podman[87936]: 2025-12-06 09:15:12.089320125 +0000 UTC m=+0.119948011 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vcs-type=git, com.redhat.component=openstack-ovn-controller-container) Dec 6 04:15:12 localhost podman[87936]: unhealthy Dec 6 04:15:12 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:15:12 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. 
Dec 6 04:15:12 localhost podman[87935]: 2025-12-06 09:15:12.097139536 +0000 UTC m=+0.129688682 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, container_name=ovn_metadata_agent, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:15:12 localhost podman[87935]: 2025-12-06 09:15:12.183370227 +0000 UTC m=+0.215919383 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
version=17.1.12, build-date=2025-11-19T00:14:25Z, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044) Dec 6 04:15:12 localhost podman[87935]: unhealthy Dec 6 04:15:12 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:15:12 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. Dec 6 04:15:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37319 DF PROTO=TCP SPT=54904 DPT=9101 SEQ=547390761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9FEB870000000001030307) Dec 6 04:15:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5641 DF PROTO=TCP SPT=35248 DPT=9105 SEQ=4222445103 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9FF1870000000001030307) Dec 6 04:15:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5642 DF PROTO=TCP SPT=35248 DPT=9105 SEQ=4222445103 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D9FF9870000000001030307) Dec 6 04:15:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56606 DF PROTO=TCP SPT=56572 DPT=9882 SEQ=246840892 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA0092F0000000001030307) Dec 6 04:15:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35965 DF PROTO=TCP SPT=47936 DPT=9102 SEQ=2694198216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA013870000000001030307) Dec 6 04:15:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52043 DF PROTO=TCP SPT=52804 DPT=9101 SEQ=3670279186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA024370000000001030307) Dec 6 04:15:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52044 DF PROTO=TCP SPT=52804 DPT=9101 SEQ=3670279186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA028470000000001030307) Dec 6 04:15:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31351 DF PROTO=TCP SPT=42280 DPT=9100 SEQ=2544864788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA035870000000001030307) Dec 6 04:15:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44647 DF PROTO=TCP SPT=49430 DPT=9100 SEQ=2551187359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA03F870000000001030307) Dec 6 04:15:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da 
MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59851 DF PROTO=TCP SPT=56124 DPT=9100 SEQ=3954743630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA04B470000000001030307) Dec 6 04:15:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:15:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:15:42 localhost podman[87972]: 2025-12-06 09:15:42.290099365 +0000 UTC m=+0.075674846 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', 
'/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_id=tripleo_step4) Dec 6 04:15:42 localhost podman[87973]: 2025-12-06 09:15:42.347270459 +0000 UTC m=+0.127792604 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, tcib_managed=true) Dec 6 04:15:42 localhost podman[87973]: 2025-12-06 09:15:42.361039173 +0000 UTC m=+0.141561318 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z) Dec 6 04:15:42 localhost podman[87973]: unhealthy Dec 6 04:15:42 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:15:42 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:15:42 localhost podman[87972]: 2025-12-06 09:15:42.376114289 +0000 UTC m=+0.161689770 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:15:42 localhost podman[87972]: unhealthy Dec 6 04:15:42 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:15:42 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:15:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52047 DF PROTO=TCP SPT=52804 DPT=9101 SEQ=3670279186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA05F870000000001030307) Dec 6 04:15:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8852 DF PROTO=TCP SPT=56998 DPT=9105 SEQ=506522403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA066870000000001030307) Dec 6 04:15:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8853 DF PROTO=TCP SPT=56998 DPT=9105 SEQ=506522403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA06E870000000001030307) Dec 6 04:15:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8854 DF PROTO=TCP SPT=56998 DPT=9105 SEQ=506522403 ACK=0 WINDOW=32640 RES=0x00 
SYN URGP=0 OPT (020405500402080A1DA07E470000000001030307) Dec 6 04:15:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36921 DF PROTO=TCP SPT=41894 DPT=9102 SEQ=841108218 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA087870000000001030307) Dec 6 04:15:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35966 DF PROTO=TCP SPT=47936 DPT=9102 SEQ=2694198216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA091870000000001030307) Dec 6 04:16:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20 DF PROTO=TCP SPT=59722 DPT=9101 SEQ=2931671756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA09D870000000001030307) Dec 6 04:16:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37321 DF PROTO=TCP SPT=54904 DPT=9101 SEQ=547390761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA0A9870000000001030307) Dec 6 04:16:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22 DF PROTO=TCP SPT=59722 DPT=9101 SEQ=2931671756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA0B5470000000001030307) Dec 6 04:16:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15405 DF PROTO=TCP SPT=41200 DPT=9100 SEQ=3308270244 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A1DA0C0870000000001030307) Dec 6 04:16:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:16:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:16:12 localhost systemd[1]: tmp-crun.QMSvYP.mount: Deactivated successfully. Dec 6 04:16:12 localhost podman[88013]: 2025-12-06 09:16:12.566710185 +0000 UTC m=+0.092573417 container health_status 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Dec 6 04:16:12 localhost podman[88014]: 2025-12-06 09:16:12.617282896 +0000 UTC m=+0.138546567 container health_status 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
health_status=unhealthy, url=https://www.redhat.com, container_name=ovn_controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, 
vcs-type=git, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Dec 6 04:16:12 localhost podman[88013]: 2025-12-06 09:16:12.63913973 +0000 UTC m=+0.165002992 container exec_died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team) Dec 6 04:16:12 localhost podman[88013]: unhealthy Dec 6 04:16:12 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:16:12 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed with result 'exit-code'. 
Dec 6 04:16:12 localhost podman[88014]: 2025-12-06 09:16:12.662319815 +0000 UTC m=+0.183583506 container exec_died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, release=1761123044, 
url=https://www.redhat.com, container_name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, version=17.1.12) Dec 6 04:16:12 localhost podman[88014]: unhealthy Dec 6 04:16:12 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:16:12 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed with result 'exit-code'. Dec 6 04:16:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23 DF PROTO=TCP SPT=59722 DPT=9101 SEQ=2931671756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA0D5870000000001030307) Dec 6 04:16:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53970 DF PROTO=TCP SPT=41382 DPT=9105 SEQ=2126319707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA0D7C60000000001030307) Dec 6 04:16:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53972 DF PROTO=TCP SPT=41382 DPT=9105 SEQ=2126319707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA0E3C70000000001030307) Dec 6 04:16:21 localhost systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing. Dec 6 04:16:21 localhost systemd[1]: tripleo_nova_virtqemud.service: Killing process 51832 (conmon) with signal SIGKILL. 
Dec 6 04:16:21 localhost systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL Dec 6 04:16:21 localhost systemd[1]: libpod-conmon-aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b.scope: Deactivated successfully. Dec 6 04:16:21 localhost podman[88064]: error opening file `/run/crun/aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b/status`: No such file or directory Dec 6 04:16:21 localhost podman[88052]: 2025-12-06 09:16:21.784905421 +0000 UTC m=+0.067921496 container cleanup aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, container_name=nova_virtqemud, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step3) Dec 6 04:16:21 localhost 
podman[88052]: nova_virtqemud Dec 6 04:16:21 localhost systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'. Dec 6 04:16:21 localhost systemd[1]: Stopped nova_virtqemud container. Dec 6 04:16:22 localhost python3.9[88157]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:16:22 localhost systemd[1]: Reloading. Dec 6 04:16:22 localhost systemd-rc-local-generator[88182]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:16:22 localhost systemd-sysv-generator[88188]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:16:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:16:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53973 DF PROTO=TCP SPT=41382 DPT=9105 SEQ=2126319707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA0F3880000000001030307) Dec 6 04:16:23 localhost python3.9[88287]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:16:23 localhost systemd[1]: Reloading. Dec 6 04:16:23 localhost systemd-rc-local-generator[88315]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:16:23 localhost systemd-sysv-generator[88318]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:16:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:16:23 localhost systemd[1]: Stopping nova_virtsecretd container... Dec 6 04:16:24 localhost systemd[1]: libpod-91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9.scope: Deactivated successfully. Dec 6 04:16:24 localhost podman[88327]: 2025-12-06 09:16:24.050467923 +0000 UTC m=+0.088242613 container died 91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step3, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtsecretd, url=https://www.redhat.com) Dec 6 04:16:24 localhost podman[88327]: 2025-12-06 09:16:24.086786403 +0000 UTC m=+0.124561093 container cleanup 91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_id=tripleo_step3, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtsecretd, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:16:24 localhost podman[88327]: nova_virtsecretd Dec 6 04:16:24 localhost podman[88340]: 2025-12-06 09:16:24.130096409 +0000 UTC m=+0.066278855 container cleanup 91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtsecretd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, io.openshift.expose-services=, release=1761123044, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 6 04:16:24 localhost systemd[1]: libpod-conmon-91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9.scope: Deactivated successfully. 
Dec 6 04:16:24 localhost podman[88367]: error opening file `/run/crun/91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9/status`: No such file or directory Dec 6 04:16:24 localhost podman[88355]: 2025-12-06 09:16:24.231287812 +0000 UTC m=+0.071284811 container cleanup 91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:35:22Z, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, container_name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3) Dec 6 04:16:24 localhost podman[88355]: nova_virtsecretd Dec 6 04:16:24 localhost systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully. Dec 6 04:16:24 localhost systemd[1]: Stopped nova_virtsecretd container. 
Dec 6 04:16:24 localhost python3.9[88460]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:16:25 localhost systemd[1]: var-lib-containers-storage-overlay-997e2c3550cc2daefd91a60e0708d42688d15d64ba54f50063ff752fcfae8f2e-merged.mount: Deactivated successfully. Dec 6 04:16:25 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-91e9e235742d9890416eb86d2414e0c475955c1537e4c3b32259a8fe13a4a8d9-userdata-shm.mount: Deactivated successfully. Dec 6 04:16:25 localhost systemd[1]: Reloading. Dec 6 04:16:25 localhost systemd-sysv-generator[88490]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:16:25 localhost systemd-rc-local-generator[88485]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:16:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:16:25 localhost systemd[1]: Stopping nova_virtstoraged container... Dec 6 04:16:25 localhost systemd[1]: libpod-ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7.scope: Deactivated successfully. 
Dec 6 04:16:25 localhost podman[88501]: 2025-12-06 09:16:25.427571012 +0000 UTC m=+0.081856757 container died ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vcs-type=git, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, container_name=nova_virtstoraged, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 6 04:16:25 localhost podman[88501]: 2025-12-06 09:16:25.471771565 +0000 UTC m=+0.126057280 container cleanup ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtstoraged, config_id=tripleo_step3, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}) Dec 6 04:16:25 localhost podman[88501]: nova_virtstoraged Dec 6 04:16:25 localhost podman[88514]: 2025-12-06 09:16:25.505912879 +0000 UTC m=+0.067576467 container cleanup ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtstoraged, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12) Dec 6 04:16:25 localhost systemd[1]: libpod-conmon-ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7.scope: Deactivated successfully. Dec 6 04:16:25 localhost podman[88540]: error opening file `/run/crun/ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7/status`: No such file or directory Dec 6 04:16:25 localhost podman[88529]: 2025-12-06 09:16:25.619444352 +0000 UTC m=+0.071066874 container cleanup ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtstoraged, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, build-date=2025-11-19T00:35:22Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'f81b1d391c9b63868054d7733e636be7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/etc/pki/CA/cacert.pem:/etc/pki/CA/cacert.pem:ro', '/etc/pki/libvirt:/etc/pki/libvirt:ro', '/etc/pki/qemu:/etc/pki/qemu:ro', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, version=17.1.12, distribution-scope=public) Dec 6 04:16:25 localhost podman[88529]: nova_virtstoraged Dec 6 04:16:25 localhost systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully. Dec 6 04:16:25 localhost systemd[1]: Stopped nova_virtstoraged container. Dec 6 04:16:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1965 DF PROTO=TCP SPT=51840 DPT=9102 SEQ=170741534 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA0FD870000000001030307) Dec 6 04:16:26 localhost systemd[1]: var-lib-containers-storage-overlay-5a9b5e8002093793c8fb3c19ee661b8475f5d6d1fe6e543df8be7ee8dc3553fb-merged.mount: Deactivated successfully. Dec 6 04:16:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac07734ab905edb0a5266ce836cb5a7a21f7738f90c0de8a1f302a161af8d9f7-userdata-shm.mount: Deactivated successfully. Dec 6 04:16:26 localhost python3.9[88634]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:16:26 localhost systemd[1]: Reloading. Dec 6 04:16:26 localhost systemd-rc-local-generator[88659]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:16:26 localhost systemd-sysv-generator[88662]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:16:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:16:26 localhost systemd[1]: Stopping ovn_controller container... Dec 6 04:16:26 localhost systemd[1]: libpod-2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.scope: Deactivated successfully. Dec 6 04:16:26 localhost systemd[1]: libpod-2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.scope: Consumed 2.477s CPU time. Dec 6 04:16:26 localhost podman[88674]: 2025-12-06 09:16:26.845402468 +0000 UTC m=+0.069428614 container died 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, name=rhosp17/openstack-ovn-controller) Dec 6 04:16:26 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.timer: Deactivated successfully. Dec 6 04:16:26 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120. Dec 6 04:16:26 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed to open /run/systemd/transient/2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: No such file or directory Dec 6 04:16:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120-userdata-shm.mount: Deactivated successfully. 
Dec 6 04:16:26 localhost podman[88674]: 2025-12-06 09:16:26.890180299 +0000 UTC m=+0.114206385 container cleanup 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller) Dec 6 04:16:26 localhost podman[88674]: ovn_controller Dec 6 04:16:26 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.timer: Failed to open /run/systemd/transient/2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.timer: No such file or directory Dec 6 04:16:26 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed to open /run/systemd/transient/2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: No such file or directory Dec 6 04:16:26 localhost podman[88686]: 2025-12-06 09:16:26.924530219 +0000 UTC m=+0.070291830 container cleanup 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1) Dec 6 04:16:26 localhost systemd[1]: libpod-conmon-2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.scope: Deactivated successfully. 
Dec 6 04:16:27 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.timer: Failed to open /run/systemd/transient/2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.timer: No such file or directory Dec 6 04:16:27 localhost systemd[1]: 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: Failed to open /run/systemd/transient/2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120.service: No such file or directory Dec 6 04:16:27 localhost podman[88703]: 2025-12-06 09:16:27.025745421 +0000 UTC m=+0.071027782 container cleanup 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=ovn_controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Dec 6 04:16:27 localhost podman[88703]: ovn_controller Dec 6 04:16:27 localhost systemd[1]: tripleo_ovn_controller.service: Deactivated successfully. Dec 6 04:16:27 localhost systemd[1]: Stopped ovn_controller container. Dec 6 04:16:27 localhost systemd[1]: var-lib-containers-storage-overlay-a4e58f94b8958f513b2dd393cee5d54d098de4336f81d16f69b3743ebfd6afda-merged.mount: Deactivated successfully. Dec 6 04:16:27 localhost python3.9[88805]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:16:28 localhost systemd[1]: Reloading. Dec 6 04:16:28 localhost systemd-rc-local-generator[88834]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:16:28 localhost systemd-sysv-generator[88837]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 04:16:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:16:29 localhost systemd[1]: Stopping ovn_metadata_agent container... Dec 6 04:16:29 localhost systemd[1]: libpod-2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.scope: Deactivated successfully. Dec 6 04:16:29 localhost systemd[1]: libpod-2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.scope: Consumed 12.457s CPU time. Dec 6 04:16:29 localhost podman[88846]: 2025-12-06 09:16:29.938723408 +0000 UTC m=+0.784109464 container died 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:16:29 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.timer: Deactivated successfully. 
Dec 6 04:16:29 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368. Dec 6 04:16:29 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed to open /run/systemd/transient/2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: No such file or directory Dec 6 04:16:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368-userdata-shm.mount: Deactivated successfully. Dec 6 04:16:29 localhost systemd[1]: var-lib-containers-storage-overlay-b249cf7e6e009bf19a7258344bcf98894c9eab8ad3921e68b5bede5938188687-merged.mount: Deactivated successfully. Dec 6 04:16:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29053 DF PROTO=TCP SPT=44904 DPT=9101 SEQ=108748421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA10E980000000001030307) Dec 6 04:16:30 localhost podman[88846]: 2025-12-06 09:16:30.283291069 +0000 UTC m=+1.128677075 container cleanup 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4) Dec 6 04:16:30 localhost podman[88846]: ovn_metadata_agent Dec 6 04:16:30 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.timer: Failed to open /run/systemd/transient/2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.timer: No such file or directory Dec 6 04:16:30 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed to open /run/systemd/transient/2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: No such file or directory Dec 6 04:16:30 localhost podman[88859]: 2025-12-06 09:16:30.309820107 +0000 UTC m=+0.361002298 container cleanup 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:16:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29054 DF PROTO=TCP SPT=44904 DPT=9101 SEQ=108748421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA112870000000001030307) Dec 6 04:16:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15407 DF PROTO=TCP SPT=41200 DPT=9100 SEQ=3308270244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA121870000000001030307) Dec 6 04:16:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29056 DF PROTO=TCP SPT=44904 DPT=9101 SEQ=108748421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA12A470000000001030307) Dec 6 04:16:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65329 DF PROTO=TCP SPT=44794 DPT=9100 SEQ=481179307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA135870000000001030307) Dec 6 04:16:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29057 DF PROTO=TCP SPT=44904 DPT=9101 SEQ=108748421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA14B870000000001030307) Dec 6 04:16:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25529 DF PROTO=TCP SPT=54904 
DPT=9105 SEQ=3086984034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA151070000000001030307) Dec 6 04:16:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25530 DF PROTO=TCP SPT=54904 DPT=9105 SEQ=3086984034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA159070000000001030307) Dec 6 04:16:51 localhost sshd[88877]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:16:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48800 DF PROTO=TCP SPT=41454 DPT=9882 SEQ=4065894378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA168BE0000000001030307) Dec 6 04:16:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14171 DF PROTO=TCP SPT=56002 DPT=9102 SEQ=1285277003 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA171870000000001030307) Dec 6 04:17:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57161 DF PROTO=TCP SPT=34402 DPT=9101 SEQ=4177150148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA183C70000000001030307) Dec 6 04:17:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57162 DF PROTO=TCP SPT=34402 DPT=9101 SEQ=4177150148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA187C80000000001030307) Dec 6 04:17:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25 DF PROTO=TCP SPT=59722 DPT=9101 SEQ=2931671756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA193880000000001030307) Dec 6 04:17:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15408 DF PROTO=TCP SPT=41200 DPT=9100 SEQ=3308270244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA19F870000000001030307) Dec 6 04:17:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13907 DF PROTO=TCP SPT=49802 DPT=9100 SEQ=2461637300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA1AAC70000000001030307) Dec 6 04:17:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57165 DF PROTO=TCP SPT=34402 DPT=9101 SEQ=4177150148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA1BF880000000001030307) Dec 6 04:17:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64600 DF PROTO=TCP SPT=43764 DPT=9105 SEQ=1870075376 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA1C6480000000001030307) Dec 6 04:17:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64601 DF PROTO=TCP SPT=43764 DPT=9105 SEQ=1870075376 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA1CE480000000001030307) Dec 6 04:17:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47241 DF PROTO=TCP SPT=45864 DPT=9882 SEQ=4035826835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA1DDEE0000000001030307) Dec 6 04:17:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48805 DF PROTO=TCP SPT=41454 DPT=9882 SEQ=4065894378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA1E5870000000001030307) Dec 6 04:17:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31838 DF PROTO=TCP SPT=42096 DPT=9101 SEQ=1389676663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA1F8F70000000001030307) Dec 6 04:17:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31839 DF PROTO=TCP SPT=42096 DPT=9101 SEQ=1389676663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA1FD070000000001030307) Dec 6 04:17:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29059 DF PROTO=TCP SPT=44904 DPT=9101 SEQ=108748421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA209870000000001030307) Dec 6 04:17:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31841 DF PROTO=TCP SPT=42096 DPT=9101 SEQ=1389676663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA214C70000000001030307) Dec 6 04:17:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da 
MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29692 DF PROTO=TCP SPT=38102 DPT=9100 SEQ=997637314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA220070000000001030307) Dec 6 04:17:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31842 DF PROTO=TCP SPT=42096 DPT=9101 SEQ=1389676663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA235870000000001030307) Dec 6 04:17:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52954 DF PROTO=TCP SPT=59522 DPT=9105 SEQ=1754857025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA23B470000000001030307) Dec 6 04:17:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52955 DF PROTO=TCP SPT=59522 DPT=9105 SEQ=1754857025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA243470000000001030307) Dec 6 04:17:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52956 DF PROTO=TCP SPT=59522 DPT=9105 SEQ=1754857025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA253070000000001030307) Dec 6 04:17:54 localhost systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing. Dec 6 04:17:54 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 58781 (conmon) with signal SIGKILL. 
Dec 6 04:17:54 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL Dec 6 04:17:54 localhost systemd[1]: libpod-conmon-2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.scope: Deactivated successfully. Dec 6 04:17:54 localhost podman[88888]: error opening file `/run/crun/2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368/status`: No such file or directory Dec 6 04:17:54 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.timer: Failed to open /run/systemd/transient/2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.timer: No such file or directory Dec 6 04:17:54 localhost systemd[1]: 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: Failed to open /run/systemd/transient/2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368.service: No such file or directory Dec 6 04:17:54 localhost podman[88879]: 2025-12-06 09:17:54.547262176 +0000 UTC m=+0.077133660 container cleanup 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:17:54 localhost podman[88879]: ovn_metadata_agent Dec 6 04:17:54 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'. Dec 6 04:17:54 localhost systemd[1]: Stopped ovn_metadata_agent container. Dec 6 04:17:55 localhost python3.9[88983]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:17:55 localhost systemd[1]: Reloading. Dec 6 04:17:55 localhost systemd-sysv-generator[89017]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:17:55 localhost systemd-rc-local-generator[89012]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:17:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:17:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46047 DF PROTO=TCP SPT=54362 DPT=9102 SEQ=2732249616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA25D870000000001030307)
Dec 6 04:17:57 localhost python3.9[89114]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:17:58 localhost python3.9[89206]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:17:58 localhost python3.9[89298]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:17:59 localhost python3.9[89390]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:17:59 localhost python3.9[89482]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35993 DF PROTO=TCP SPT=40212 DPT=9101 SEQ=4127260835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA26E270000000001030307)
Dec 6 04:18:00 localhost python3.9[89574]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:00 localhost python3.9[89666]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35994 DF PROTO=TCP SPT=40212 DPT=9101 SEQ=4127260835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA272470000000001030307)
Dec 6 04:18:01 localhost python3.9[89758]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:01 localhost python3.9[89850]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:02 localhost python3.9[89942]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:03 localhost python3.9[90034]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:03 localhost python3.9[90126]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:04 localhost python3.9[90218]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29694 DF PROTO=TCP SPT=38102 DPT=9100 SEQ=997637314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA27F870000000001030307)
Dec 6 04:18:04 localhost python3.9[90310]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:05 localhost python3.9[90402]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:05 localhost python3.9[90494]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:06 localhost python3.9[90586]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:07 localhost python3.9[90678]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13910 DF PROTO=TCP SPT=49802 DPT=9100 SEQ=2461637300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA289880000000001030307)
Dec 6 04:18:07 localhost python3.9[90770]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:08 localhost python3.9[90862]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:08 localhost python3.9[90954]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:10 localhost python3.9[91046]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32831 DF PROTO=TCP SPT=36062 DPT=9100 SEQ=1533550107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA295470000000001030307)
Dec 6 04:18:10 localhost python3.9[91138]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:11 localhost python3.9[91230]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:11 localhost python3.9[91322]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:12 localhost python3.9[91414]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:12 localhost python3.9[91506]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:13 localhost python3.9[91598]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:13 localhost python3.9[91690]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:14 localhost python3.9[91782]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:15 localhost python3.9[91874]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35997 DF PROTO=TCP SPT=40212 DPT=9101 SEQ=4127260835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA2A9880000000001030307)
Dec 6 04:18:15 localhost python3.9[91966]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:16 localhost python3.9[92058]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:16 localhost python3.9[92151]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58327 DF PROTO=TCP SPT=33336 DPT=9105 SEQ=811471407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA2B0880000000001030307)
Dec 6 04:18:17 localhost python3.9[92243]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:18 localhost python3.9[92335]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:18 localhost python3.9[92427]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58328 DF PROTO=TCP SPT=33336 DPT=9105 SEQ=811471407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA2B8870000000001030307)
Dec 6 04:18:19 localhost python3.9[92519]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:19 localhost python3.9[92611]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:20 localhost python3.9[92703]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:21 localhost python3.9[92795]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:21 localhost python3.9[92887]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:18:22 localhost python3.9[92979]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:22 localhost systemd[1]: Reloading.
Dec 6 04:18:22 localhost systemd-sysv-generator[93007]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:18:22 localhost systemd-rc-local-generator[93003]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:18:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:18:22 localhost systemd[1]: Stopping Certificate monitoring and PKI enrollment...
Dec 6 04:18:22 localhost certmonger[37540]: 2025-12-06 09:18:22 [37540] Wrote to /var/lib/certmonger/requests/20251206080741
Dec 6 04:18:22 localhost certmonger[37540]: 2025-12-06 09:18:22 [37540] Wrote to /var/lib/certmonger/requests/20251206080809
Dec 6 04:18:22 localhost certmonger[37540]: 2025-12-06 09:18:22 [37540] Wrote to /var/lib/certmonger/requests/20251206080811
Dec 6 04:18:22 localhost certmonger[37540]: 2025-12-06 09:18:22 [37540] Wrote to /var/lib/certmonger/requests/20251206080812
Dec 6 04:18:22 localhost certmonger[37540]: 2025-12-06 09:18:22 [37540] Wrote to /var/lib/certmonger/requests/20251206080814
Dec 6 04:18:22 localhost certmonger[37540]: 2025-12-06 09:18:22 [37540] Wrote to /var/lib/certmonger/requests/20251206080943
Dec 6 04:18:22 localhost certmonger[37540]: 2025-12-06 09:18:22 [37540] Wrote to /var/lib/certmonger/requests/20251206080952
Dec 6 04:18:22 localhost systemd[1]: certmonger.service: Deactivated successfully.
Dec 6 04:18:22 localhost systemd[1]: Stopped Certificate monitoring and PKI enrollment.
Dec 6 04:18:22 localhost systemd[1]: certmonger.service: Consumed 3.492s CPU time, read 24.0K from disk, written 226.0K to disk.
Dec 6 04:18:22 localhost systemd[1]: Reloading.
Dec 6 04:18:22 localhost systemd-rc-local-generator[93041]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:18:22 localhost systemd-sysv-generator[93044]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:18:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:18:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58329 DF PROTO=TCP SPT=33336 DPT=9105 SEQ=811471407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA2C8480000000001030307)
Dec 6 04:18:23 localhost python3.9[93142]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 6 04:18:24 localhost python3.9[93234]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 6 04:18:24 localhost systemd[1]: Reloading.
Dec 6 04:18:24 localhost systemd-rc-local-generator[93261]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:18:24 localhost systemd-sysv-generator[93264]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:18:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:18:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12578 DF PROTO=TCP SPT=41854 DPT=9102 SEQ=3037776953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA2D1870000000001030307)
Dec 6 04:18:25 localhost python3.9[93361]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:26 localhost python3.9[93454]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:27 localhost python3.9[93547]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:27 localhost python3.9[93640]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46048 DF PROTO=TCP SPT=54362 DPT=9102 SEQ=2732249616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA2DB880000000001030307)
Dec 6 04:18:28 localhost python3.9[93733]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:28 localhost python3.9[93826]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:29 localhost python3.9[93919]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:30 localhost python3.9[94012]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:30 localhost python3.9[94105]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16112 DF PROTO=TCP SPT=56698 DPT=9101 SEQ=4057653698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA2E7470000000001030307)
Dec 6 04:18:31 localhost python3.9[94198]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:32 localhost python3.9[94291]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:33 localhost python3.9[94384]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:34 localhost python3.9[94477]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31844 DF PROTO=TCP SPT=42096 DPT=9101 SEQ=1389676663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA2F3880000000001030307)
Dec 6 04:18:34 localhost python3.9[94570]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:35 localhost python3.9[94663]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:35 localhost python3.9[94756]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:36 localhost python3.9[94849]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:37 localhost python3.9[94942]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16114 DF PROTO=TCP SPT=56698 DPT=9101 SEQ=4057653698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA2FF080000000001030307)
Dec 6 04:18:37 localhost python3.9[95035]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:38 localhost python3.9[95128]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:38 localhost python3.9[95221]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:18:39 localhost systemd[1]: session-24.scope: Deactivated successfully.
Dec 6 04:18:39 localhost systemd[1]: session-24.scope: Consumed 49.206s CPU time.
Dec 6 04:18:39 localhost systemd-logind[760]: Session 24 logged out. Waiting for processes to exit.
Dec 6 04:18:39 localhost systemd-logind[760]: Removed session 24.
Dec 6 04:18:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58603 DF PROTO=TCP SPT=39746 DPT=9100 SEQ=1282717544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA30A470000000001030307)
Dec 6 04:18:41 localhost sshd[95238]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:18:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16115 DF PROTO=TCP SPT=56698 DPT=9101 SEQ=4057653698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA31F870000000001030307)
Dec 6 04:18:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53036 DF PROTO=TCP SPT=53662 DPT=9105 SEQ=2692769804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA325C70000000001030307)
Dec 6 04:18:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53037 DF PROTO=TCP SPT=53662 DPT=9105 SEQ=2692769804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA32EF40000000001030307)
Dec 6 04:18:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63978 DF PROTO=TCP SPT=41894 DPT=9882 SEQ=1849137127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA33D830000000001030307)
Dec 6 04:18:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32126 DF PROTO=TCP SPT=59738 DPT=9102 SEQ=3271296359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA347880000000001030307)
Dec 6 04:19:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10858 DF PROTO=TCP SPT=59182 DPT=9101 SEQ=1084263589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA358870000000001030307)
Dec 6 04:19:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10859 DF PROTO=TCP SPT=59182 DPT=9101 SEQ=1084263589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA35C880000000001030307)
Dec 6 04:19:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58605 DF PROTO=TCP SPT=39746 DPT=9100 SEQ=1282717544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA36B870000000001030307)
Dec 6 04:19:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10861 DF PROTO=TCP SPT=59182 DPT=9101 SEQ=1084263589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA374470000000001030307)
Dec 6 04:19:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31958 DF PROTO=TCP SPT=47254 DPT=9100 SEQ=604437006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA37F870000000001030307)
Dec 6 04:19:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10862 DF PROTO=TCP SPT=59182 DPT=9101 SEQ=1084263589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA395870000000001030307)
Dec 6 04:19:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38511 DF PROTO=TCP SPT=51316 DPT=9105 SEQ=3597278374 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA39B070000000001030307)
Dec 6 04:19:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38512 DF PROTO=TCP SPT=51316 DPT=9105 SEQ=3597278374 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA3A3070000000001030307)
Dec 6 04:19:21 localhost sshd[95240]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:19:22 localhost systemd-logind[760]: New session 25 of user zuul.
Dec 6 04:19:22 localhost systemd[1]: Started Session 25 of User zuul.
Dec 6 04:19:22 localhost python3.9[95333]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 6 04:19:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38568 DF PROTO=TCP SPT=43360 DPT=9882 SEQ=3219689434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA3B2AE0000000001030307)
Dec 6 04:19:23 localhost python3.9[95437]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:19:24 localhost python3.9[95529]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:19:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60499 DF PROTO=TCP SPT=36852 DPT=9102 SEQ=3036200249 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA3BB870000000001030307)
Dec 6 04:19:25 localhost python3.9[95622]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:19:26 localhost python3.9[95714]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:19:27 localhost python3.9[95806]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:19:27 localhost python3.9[95879]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765012766.604266-177-33169726535702/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:19:28 localhost python3.9[95971]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:19:29 localhost python3.9[96067]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:19:29 localhost python3.9[96159]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:19:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43890 DF PROTO=TCP SPT=49514 DPT=9101 SEQ=3678308428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA3CDB70000000001030307)
Dec 6 04:19:30 localhost python3.9[96249]: ansible-ansible.builtin.service_facts Invoked
Dec 6 04:19:30 localhost network[96266]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 6 04:19:30 localhost network[96267]: 'network-scripts' will be removed from distribution in near future.
Dec 6 04:19:30 localhost network[96268]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 6 04:19:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43891 DF PROTO=TCP SPT=49514 DPT=9101 SEQ=3678308428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA3D1C70000000001030307)
Dec 6 04:19:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:19:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16117 DF PROTO=TCP SPT=56698 DPT=9101 SEQ=4057653698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA3DD870000000001030307)
Dec 6 04:19:36 localhost python3.9[96466]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:19:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43893 DF PROTO=TCP SPT=49514 DPT=9101 SEQ=3678308428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA3E9880000000001030307)
Dec 6 04:19:37 localhost python3.9[96556]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:19:38 localhost python3.9[96652]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream#012set -euxo pipefail#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main#012# This is required for FIPS enabled until trunk.rdoproject.org#012# is not being served from a centos7 host, tracked by#012# https://issues.redhat.com/browse/RHOSZUUL-1517#012dnf -y install crypto-policies#012update-crypto-policies --set FIPS:NO-ENFORCE-EMS#012./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream#012#012# Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible#012# with rhel 9.2 openssh#012dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save#012# FIXME: perform dnf upgrade for other packages in EDPM ansible#012# here we only ensuring that decontainerized libvirt can start#012dnf -y upgrade openstack-selinux#012rm -f /run/virtlogd.pid#012#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:19:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19665 DF PROTO=TCP SPT=50666 DPT=9100 SEQ=3334552316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA3F4C80000000001030307)
Dec 6 04:19:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43894 DF PROTO=TCP SPT=49514 DPT=9101 SEQ=3678308428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA409880000000001030307)
Dec 6 04:19:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34534 DF PROTO=TCP SPT=35960 DPT=9105 SEQ=85209555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA410070000000001030307)
Dec 6 04:19:47 localhost systemd[1]: Stopping OpenSSH server daemon...
Dec 6 04:19:47 localhost systemd[1]: sshd.service: Deactivated successfully.
Dec 6 04:19:47 localhost systemd[1]: Stopped OpenSSH server daemon.
Dec 6 04:19:48 localhost systemd[1]: Stopped target sshd-keygen.target.
Dec 6 04:19:48 localhost systemd[1]: Stopping sshd-keygen.target...
Dec 6 04:19:48 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 04:19:48 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 04:19:48 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 04:19:48 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 6 04:19:48 localhost systemd[1]: Starting OpenSSH server daemon...
Dec 6 04:19:48 localhost sshd[96695]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:19:48 localhost systemd[1]: Started OpenSSH server daemon.
Dec 6 04:19:48 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 6 04:19:48 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 6 04:19:48 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 6 04:19:48 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 6 04:19:48 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 6 04:19:48 localhost systemd[1]: run-rce567f67cecc415a8ef26a9148cdc064.service: Deactivated successfully.
Dec 6 04:19:48 localhost systemd[1]: run-r2ea39b5b7bce4149ae02f69f1d8dacb6.service: Deactivated successfully.
Dec 6 04:19:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34535 DF PROTO=TCP SPT=35960 DPT=9105 SEQ=85209555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA418080000000001030307)
Dec 6 04:19:49 localhost systemd[1]: Stopping OpenSSH server daemon...
Dec 6 04:19:49 localhost systemd[1]: sshd.service: Deactivated successfully.
Dec 6 04:19:49 localhost systemd[1]: Stopped OpenSSH server daemon.
Dec 6 04:19:49 localhost systemd[1]: Stopped target sshd-keygen.target.
Dec 6 04:19:49 localhost systemd[1]: Stopping sshd-keygen.target...
Dec 6 04:19:49 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 04:19:49 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 04:19:49 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 6 04:19:49 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 6 04:19:49 localhost systemd[1]: Starting OpenSSH server daemon...
Dec 6 04:19:49 localhost sshd[96875]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:19:49 localhost systemd[1]: Started OpenSSH server daemon.
Dec 6 04:19:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34536 DF PROTO=TCP SPT=35960 DPT=9105 SEQ=85209555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA427C70000000001030307)
Dec 6 04:19:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38573 DF PROTO=TCP SPT=43360 DPT=9882 SEQ=3219689434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA42F870000000001030307)
Dec 6 04:20:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15159 DF PROTO=TCP SPT=58824 DPT=9101 SEQ=3737459720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA442E70000000001030307)
Dec 6 04:20:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15160 DF PROTO=TCP SPT=58824 DPT=9101 SEQ=3737459720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA447070000000001030307)
Dec 6 04:20:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10864 DF PROTO=TCP SPT=59182 DPT=9101 SEQ=1084263589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA453870000000001030307)
Dec 6 04:20:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15162 DF PROTO=TCP SPT=58824 DPT=9101 SEQ=3737459720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA45EC80000000001030307)
Dec 6 04:20:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1949 DF PROTO=TCP SPT=55866 DPT=9100 SEQ=3275755856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA46A080000000001030307)
Dec 6 04:20:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15163 DF PROTO=TCP SPT=58824 DPT=9101 SEQ=3737459720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA47F870000000001030307)
Dec 6 04:20:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36059 DF PROTO=TCP SPT=34436 DPT=9105 SEQ=3776453833 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA485470000000001030307)
Dec 6 04:20:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36060 DF PROTO=TCP SPT=34436 DPT=9105 SEQ=3776453833 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA48D470000000001030307)
Dec 6 04:20:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36061 DF PROTO=TCP SPT=34436 DPT=9105 SEQ=3776453833 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA49D070000000001030307)
Dec 6 04:20:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19941 DF PROTO=TCP SPT=40086 DPT=9102 SEQ=3211207545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA4A7880000000001030307)
Dec 6 04:20:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59235 DF PROTO=TCP SPT=55146 DPT=9101 SEQ=4094818026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA4B8190000000001030307)
Dec 6 04:20:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59236 DF PROTO=TCP SPT=55146 DPT=9101 SEQ=4094818026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA4BC070000000001030307)
Dec 6 04:20:31 localhost sshd[97014]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:20:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1951 DF PROTO=TCP SPT=55866 DPT=9100 SEQ=3275755856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA4C9880000000001030307)
Dec 6 04:20:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19668 DF PROTO=TCP SPT=50666 DPT=9100 SEQ=3334552316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA4D3870000000001030307)
Dec 6 04:20:38 localhost sshd[97183]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:20:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55654 DF PROTO=TCP SPT=56600 DPT=9100 SEQ=3153940102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA4DF070000000001030307)
Dec 6 04:20:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59239 DF PROTO=TCP SPT=55146 DPT=9101 SEQ=4094818026 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA4F3870000000001030307)
Dec 6 04:20:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25243 DF PROTO=TCP SPT=37578 DPT=9105 SEQ=2743018129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA4FA880000000001030307)
Dec 6 04:20:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25244 DF PROTO=TCP SPT=37578 DPT=9105 SEQ=2743018129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA502870000000001030307)
Dec 6 04:20:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16665 DF PROTO=TCP SPT=51718 DPT=9882 SEQ=3571639321 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA5123E0000000001030307)
Dec 6 04:20:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49155 DF PROTO=TCP SPT=33164 DPT=9102 SEQ=4014021329 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA51B870000000001030307)
Dec 6 04:20:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19942 DF PROTO=TCP SPT=40086 DPT=9102 SEQ=3211207545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA525870000000001030307)
Dec 6 04:21:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35554 DF PROTO=TCP SPT=40054 DPT=9101 SEQ=3302099587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA531470000000001030307)
Dec 6 04:21:01 localhost kernel: SELinux: Converting 2781 SID table entries...
Dec 6 04:21:01 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 6 04:21:01 localhost kernel: SELinux: policy capability open_perms=1
Dec 6 04:21:01 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 6 04:21:01 localhost kernel: SELinux: policy capability always_check_network=0
Dec 6 04:21:01 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 6 04:21:01 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 6 04:21:01 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 6 04:21:03 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=18 res=1
Dec 6 04:21:03 localhost python3.9[97467]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:21:04 localhost python3.9[97559]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:21:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15165 DF PROTO=TCP SPT=58824 DPT=9101 SEQ=3737459720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA53D870000000001030307)
Dec 6 04:21:04 localhost python3.9[97632]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765012863.7511458-426-92987884417527/.source.fact _original_basename=.b_hg9wxr follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:21:05 localhost python3.9[97722]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:21:06 localhost python3.9[97820]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 6 04:21:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35556 DF PROTO=TCP SPT=40054 DPT=9101 SEQ=3302099587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA549070000000001030307)
Dec 6 04:21:07 localhost python3.9[97874]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 6 04:21:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36190 DF PROTO=TCP SPT=43920 DPT=9100 SEQ=473513241 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA554470000000001030307)
Dec 6 04:21:11 localhost systemd[1]: Reloading.
Dec 6 04:21:11 localhost systemd-rc-local-generator[97909]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:21:11 localhost systemd-sysv-generator[97913]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:21:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:21:11 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 6 04:21:13 localhost python3.9[98013]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:21:15 localhost python3.9[98252]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 6 04:21:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35557 DF PROTO=TCP SPT=40054 DPT=9101 SEQ=3302099587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA569870000000001030307)
Dec 6 04:21:15 localhost python3.9[98344]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 6 04:21:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55164 DF PROTO=TCP SPT=35066 DPT=9105 SEQ=2503913542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA56FC70000000001030307)
Dec 6 04:21:18 localhost python3.9[98437]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:21:19 localhost python3.9[98529]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 6 04:21:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55165 DF PROTO=TCP SPT=35066 DPT=9105 SEQ=2503913542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA577C70000000001030307)
Dec 6 04:21:20 localhost python3.9[98621]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:21:21 localhost python3.9[98713]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:21:22 localhost python3.9[98786]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765012881.040779-750-110964369094420/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e9e428e84c1276c30c877c60b9a85275252eb420 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:21:23 localhost python3.9[98878]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:21:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53783 DF PROTO=TCP SPT=43910 DPT=9882 SEQ=1262949703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA5876E0000000001030307)
Dec 6 04:21:24 localhost python3.9[98970]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:21:24 localhost python3.9[99063]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:21:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40403 DF PROTO=TCP SPT=47238 DPT=9102 SEQ=363487154 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA591880000000001030307)
Dec 6 04:21:25 localhost python3.9[99155]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 6 04:21:26 localhost python3.9[99248]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 6 04:21:28 localhost python3.9[99341]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 6 04:21:28 localhost sssd_nss[37237]: Enumeration requested but not enabled
Dec 6 04:21:29 localhost python3.9[99439]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 6 04:21:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48409 DF PROTO=TCP SPT=41750 DPT=9101 SEQ=3588352184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA5A2770000000001030307)
Dec 6 04:21:30 localhost python3.9[99531]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 6 04:21:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48410 DF PROTO=TCP SPT=41750 DPT=9101 SEQ=3588352184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA5A6870000000001030307)
Dec 6 04:21:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36192 DF PROTO=TCP SPT=43920 DPT=9100 SEQ=473513241 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA5B5870000000001030307)
Dec 6 04:21:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48412 DF PROTO=TCP SPT=41750 DPT=9101 SEQ=3588352184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA5BE470000000001030307)
Dec 6 04:21:39 localhost python3.9[99625]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:21:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17019 DF PROTO=TCP SPT=50420 DPT=9100 SEQ=1324866777 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA5C9880000000001030307)
Dec 6 04:21:40 localhost python3.9[99717]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:21:41 localhost python3.9[99790]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765012900.2894278-1065-30253036244115/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:21:42 localhost python3.9[99882]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 04:21:42 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 6 04:21:42 localhost systemd[1]: Stopped Load Kernel Modules.
Dec 6 04:21:42 localhost systemd[1]: Stopping Load Kernel Modules...
Dec 6 04:21:42 localhost systemd[1]: Starting Load Kernel Modules...
Dec 6 04:21:42 localhost systemd-modules-load[99886]: Module 'msr' is built in
Dec 6 04:21:42 localhost systemd[1]: Finished Load Kernel Modules.
Dec 6 04:21:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48413 DF PROTO=TCP SPT=41750 DPT=9101 SEQ=3588352184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA5DF880000000001030307)
Dec 6 04:21:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43056 DF PROTO=TCP SPT=40250 DPT=9105 SEQ=776658189 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA5E4C70000000001030307)
Dec 6 04:21:48 localhost python3.9[99978]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:21:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43057 DF PROTO=TCP SPT=40250 DPT=9105 SEQ=776658189 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA5ECC70000000001030307)
Dec 6 04:21:50 localhost python3.9[100051]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765012906.1297019-1134-208758938242527/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:21:51 localhost python3.9[100143]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 6 04:21:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43058 DF PROTO=TCP SPT=40250 DPT=9105 SEQ=776658189 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA5FC870000000001030307)
Dec 6 04:21:55 localhost python3.9[100235]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:21:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41781 DF PROTO=TCP SPT=46310 DPT=9102 SEQ=2333918591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA605870000000001030307)
Dec 6 04:21:56 localhost python3.9[100327]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 6 04:21:56 localhost python3.9[100417]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:21:57 localhost python3.9[100509]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 04:21:58 localhost systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 6 04:21:59 localhost systemd[1]: tuned.service: Deactivated successfully.
Dec 6 04:21:59 localhost systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 6 04:21:59 localhost systemd[1]: tuned.service: Consumed 2.516s CPU time, no IO.
Dec 6 04:21:59 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 6 04:22:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12505 DF PROTO=TCP SPT=55204 DPT=9101 SEQ=67395135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA617A70000000001030307)
Dec 6 04:22:00 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Dec 6 04:22:00 localhost python3.9[100613]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 6 04:22:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12506 DF PROTO=TCP SPT=55204 DPT=9101 SEQ=67395135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA61BC70000000001030307)
Dec 6 04:22:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35559 DF PROTO=TCP SPT=40054 DPT=9101 SEQ=3302099587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA627880000000001030307)
Dec 6 04:22:04 localhost python3.9[100705]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 04:22:04 localhost systemd[1]: Reloading.
Dec 6 04:22:05 localhost systemd-rc-local-generator[100731]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:22:05 localhost systemd-sysv-generator[100737]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:22:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:22:05 localhost python3.9[100836]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 04:22:05 localhost systemd[1]: Reloading.
Dec 6 04:22:06 localhost systemd-rc-local-generator[100863]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:22:06 localhost systemd-sysv-generator[100868]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:22:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:22:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36193 DF PROTO=TCP SPT=43920 DPT=9100 SEQ=473513241 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA633870000000001030307)
Dec 6 04:22:08 localhost python3.9[100965]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:22:08 localhost python3.9[101058]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:22:08 localhost kernel: Adding 1048572k swap on /swap. Priority:-2 extents:1 across:1048572k FS
Dec 6 04:22:09 localhost python3.9[101151]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:22:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48000 DF PROTO=TCP SPT=58488 DPT=9100 SEQ=3030197993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA63EC70000000001030307)
Dec 6 04:22:11 localhost python3.9[101250]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:22:11 localhost python3.9[101343]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 04:22:11 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 6 04:22:11 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 6 04:22:11 localhost systemd[1]: Stopping Apply Kernel Variables...
Dec 6 04:22:11 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 6 04:22:11 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 6 04:22:11 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 6 04:22:12 localhost systemd-logind[760]: Session 25 logged out. Waiting for processes to exit.
Dec 6 04:22:12 localhost systemd[1]: session-25.scope: Deactivated successfully.
Dec 6 04:22:12 localhost systemd[1]: session-25.scope: Consumed 1min 56.913s CPU time.
Dec 6 04:22:12 localhost systemd-logind[760]: Removed session 25.
Dec 6 04:22:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12509 DF PROTO=TCP SPT=55204 DPT=9101 SEQ=67395135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA653880000000001030307)
Dec 6 04:22:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56476 DF PROTO=TCP SPT=52520 DPT=9105 SEQ=8264222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA65A070000000001030307)
Dec 6 04:22:17 localhost sshd[101363]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:22:17 localhost systemd-logind[760]: New session 26 of user zuul.
Dec 6 04:22:17 localhost systemd[1]: Started Session 26 of User zuul.
Dec 6 04:22:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56477 DF PROTO=TCP SPT=52520 DPT=9105 SEQ=8264222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA662070000000001030307)
Dec 6 04:22:19 localhost python3.9[101456]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:22:21 localhost python3.9[101550]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:22:22 localhost python3.9[101646]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:22:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56478 DF PROTO=TCP SPT=52520 DPT=9105 SEQ=8264222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA671C70000000001030307)
Dec 6 04:22:23 localhost python3.9[101737]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:22:24 localhost python3.9[101833]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 6 04:22:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21896 DF PROTO=TCP SPT=60706 DPT=9882 SEQ=267830220 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA679870000000001030307)
Dec 6 04:22:25 localhost python3.9[101887]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 6 04:22:29 localhost python3.9[101981]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 6 04:22:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28334 DF PROTO=TCP SPT=41174 DPT=9101 SEQ=33804705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA68CD70000000001030307)
Dec 6 04:22:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28335 DF PROTO=TCP SPT=41174 DPT=9101 SEQ=33804705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA690C70000000001030307)
Dec 6 04:22:31 localhost python3.9[102136]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:22:32 localhost python3.9[102228]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:22:33 localhost python3.9[102333]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:22:33 localhost python3.9[102381]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:22:34 localhost python3.9[102473]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:22:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48415 DF PROTO=TCP SPT=41750 DPT=9101 SEQ=3588352184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA69D870000000001030307)
Dec 6 04:22:35 localhost python3.9[102546]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765012953.8748734-323-98462032785543/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:22:35 localhost python3.9[102638]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:22:36 localhost python3.9[102730]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:22:37 localhost python3.9[102822]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:22:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28337 DF PROTO=TCP SPT=41174 DPT=9101 SEQ=33804705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA6A8880000000001030307)
Dec 6 04:22:37 localhost python3.9[102914]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:22:38 localhost python3.9[103004]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:22:39 localhost sshd[103053]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:22:39 localhost python3.9[103100]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 6 04:22:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7720 DF PROTO=TCP SPT=56200 DPT=9100 SEQ=1619934372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA6B4080000000001030307)
Dec 6 04:22:43 localhost python3.9[103194]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 6 04:22:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28338 DF PROTO=TCP SPT=41174 DPT=9101 SEQ=33804705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA6C9870000000001030307)
Dec 6 04:22:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1862 DF PROTO=TCP SPT=48316 DPT=9105 SEQ=1794358594 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA6CF480000000001030307)
Dec 6 04:22:48 localhost python3.9[103288]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 6 04:22:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1863 DF PROTO=TCP SPT=48316 DPT=9105 SEQ=1794358594 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA6D7470000000001030307)
Dec 6 04:22:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28526 DF PROTO=TCP SPT=54920 DPT=9882 SEQ=2839076572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA6E6FF0000000001030307)
Dec 6 04:22:53 localhost python3.9[103388]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 6 04:22:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61299 DF PROTO=TCP SPT=40846 DPT=9102 SEQ=2213153895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA6F1870000000001030307)
Dec 6 04:22:57 localhost python3.9[103482]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 6 04:23:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58047 DF PROTO=TCP SPT=57914 DPT=9101 SEQ=3689188460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA702080000000001030307)
Dec 6 04:23:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58048 DF PROTO=TCP SPT=57914 DPT=9101 SEQ=3689188460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA706070000000001030307)
Dec 6 04:23:01 localhost python3.9[103576]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 6 04:23:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12511 DF PROTO=TCP SPT=55204 DPT=9101 SEQ=67395135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA711870000000001030307)
Dec 6 04:23:05 localhost python3.9[103670]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 6 04:23:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48003 DF PROTO=TCP SPT=58488 DPT=9100 SEQ=3030197993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA71D870000000001030307)
Dec 6 04:23:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1427 DF PROTO=TCP SPT=43898 DPT=9100 SEQ=836215499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA729070000000001030307)
Dec 6 04:23:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58051 DF PROTO=TCP SPT=57914 DPT=9101 SEQ=3689188460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA73D870000000001030307)
Dec 6 04:23:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43605 DF PROTO=TCP SPT=56808 DPT=9105 SEQ=1198430657 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA744870000000001030307)
Dec 
6 04:23:17 localhost python3.9[103833]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:23:18 localhost python3.9[103938]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:23:18 localhost python3.9[104011]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1765012997.8040838-722-14780913773633/.source.json _original_basename=.xafhoieu follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:23:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43606 DF PROTO=TCP SPT=56808 DPT=9105 SEQ=1198430657 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA74C870000000001030307) Dec 6 04:23:19 localhost python3.9[104103]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 
'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Dec 6 04:23:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26586 DF PROTO=TCP SPT=47542 DPT=9882 SEQ=456613475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA75C2E0000000001030307) Dec 6 04:23:25 localhost podman[104115]: 2025-12-06 09:23:19.893976229 +0000 UTC m=+0.044752263 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Dec 6 04:23:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39364 DF PROTO=TCP SPT=37600 DPT=9102 SEQ=825250425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA765870000000001030307) Dec 6 04:23:27 localhost python3.9[104314]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None 
username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Dec 6 04:23:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61300 DF PROTO=TCP SPT=40846 DPT=9102 SEQ=2213153895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA76F870000000001030307) Dec 6 04:23:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23052 DF PROTO=TCP SPT=50660 DPT=9101 SEQ=1887167256 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA77B470000000001030307) Dec 6 04:23:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28340 DF PROTO=TCP SPT=41174 DPT=9101 SEQ=33804705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA787880000000001030307) Dec 6 04:23:35 localhost podman[104327]: 2025-12-06 09:23:27.726914714 +0000 UTC m=+0.044416863 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 6 04:23:36 localhost python3.9[104527]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None 
pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Dec 6 04:23:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23054 DF PROTO=TCP SPT=50660 DPT=9101 SEQ=1887167256 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA793070000000001030307) Dec 6 04:23:38 localhost podman[104539]: 2025-12-06 09:23:36.778334855 +0000 UTC m=+0.046802286 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Dec 6 04:23:39 localhost python3.9[104702]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Dec 6 04:23:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39957 DF PROTO=TCP SPT=58814 DPT=9100 SEQ=4226098164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA79E470000000001030307) Dec 6 04:23:40 localhost podman[104716]: 2025-12-06 09:23:39.469812075 +0000 UTC m=+0.027472259 image 
pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 04:23:41 localhost python3.9[104880]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Dec 6 04:23:44 localhost podman[104893]: 2025-12-06 09:23:41.748962526 +0000 UTC m=+0.043133072 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified Dec 6 04:23:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23055 DF PROTO=TCP SPT=50660 DPT=9101 SEQ=1887167256 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA7B3870000000001030307) Dec 6 04:23:45 localhost python3.9[105071]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 
'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Dec 6 04:23:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63189 DF PROTO=TCP SPT=41670 DPT=9105 SEQ=3283826520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA7B9870000000001030307) Dec 6 04:23:47 localhost podman[105083]: 2025-12-06 09:23:45.828124076 +0000 UTC m=+0.047208509 image pull quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c Dec 6 04:23:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63190 DF PROTO=TCP SPT=41670 DPT=9105 SEQ=3283826520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA7C1870000000001030307) Dec 6 04:23:49 localhost systemd[1]: session-26.scope: Deactivated successfully. Dec 6 04:23:49 localhost systemd[1]: session-26.scope: Consumed 1min 28.593s CPU time. Dec 6 04:23:49 localhost systemd-logind[760]: Session 26 logged out. Waiting for processes to exit. Dec 6 04:23:49 localhost systemd-logind[760]: Removed session 26. 
Dec 6 04:23:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63191 DF PROTO=TCP SPT=41670 DPT=9105 SEQ=3283826520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA7D1470000000001030307) Dec 6 04:23:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4285 DF PROTO=TCP SPT=58134 DPT=9102 SEQ=3771617304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA7DB870000000001030307) Dec 6 04:23:56 localhost sshd[105192]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:23:56 localhost systemd-logind[760]: New session 27 of user zuul. Dec 6 04:23:56 localhost systemd[1]: Started Session 27 of User zuul. Dec 6 04:23:58 localhost python3.9[105285]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 6 04:23:59 localhost python3.9[105484]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None Dec 6 04:24:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39125 DF PROTO=TCP SPT=46546 DPT=9101 SEQ=3154189872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA7EC680000000001030307) Dec 6 04:24:00 localhost python3.9[105577]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 6 04:24:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39126 DF PROTO=TCP SPT=46546 DPT=9101 
SEQ=3154189872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA7F0880000000001030307) Dec 6 04:24:01 localhost python3.9[105631]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Dec 6 04:24:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39959 DF PROTO=TCP SPT=58814 DPT=9100 SEQ=4226098164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA7FF880000000001030307) Dec 6 04:24:06 localhost python3.9[105980]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 6 04:24:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39128 DF PROTO=TCP SPT=46546 DPT=9101 SEQ=3154189872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA808470000000001030307) Dec 6 04:24:10 
localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50161 DF PROTO=TCP SPT=39726 DPT=9100 SEQ=405128462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA813880000000001030307) Dec 6 04:24:11 localhost python3.9[106131]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Dec 6 04:24:13 localhost python3.9[106224]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 6 04:24:14 localhost python3.9[106316]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None Dec 6 04:24:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39129 DF PROTO=TCP SPT=46546 DPT=9101 SEQ=3154189872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA829880000000001030307) Dec 6 04:24:16 localhost kernel: SELinux: Converting 2785 SID table entries... 
Dec 6 04:24:16 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 6 04:24:16 localhost kernel: SELinux: policy capability open_perms=1 Dec 6 04:24:16 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 6 04:24:16 localhost kernel: SELinux: policy capability always_check_network=0 Dec 6 04:24:16 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 6 04:24:16 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 6 04:24:16 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 6 04:24:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24443 DF PROTO=TCP SPT=57892 DPT=9105 SEQ=3878174081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA82EC70000000001030307) Dec 6 04:24:17 localhost python3.9[106740]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 6 04:24:18 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=19 res=1 Dec 6 04:24:18 localhost python3.9[106838]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None 
download_dir=None list=None nobest=None releasever=None Dec 6 04:24:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24444 DF PROTO=TCP SPT=57892 DPT=9105 SEQ=3878174081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA836C70000000001030307) Dec 6 04:24:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24445 DF PROTO=TCP SPT=57892 DPT=9105 SEQ=3878174081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA846870000000001030307) Dec 6 04:24:23 localhost python3.9[106932]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:24:25 localhost python3.9[107177]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Dec 6 04:24:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8105 DF PROTO=TCP SPT=60802 DPT=9102 SEQ=1954253523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA84F870000000001030307) Dec 6 
04:24:26 localhost python3.9[107267]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:24:27 localhost python3.9[107361]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 6 04:24:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49351 DF PROTO=TCP SPT=49924 DPT=9101 SEQ=547794554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA861970000000001030307) Dec 6 04:24:30 localhost python3.9[107455]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 6 04:24:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49352 DF PROTO=TCP 
SPT=49924 DPT=9101 SEQ=547794554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA865870000000001030307) Dec 6 04:24:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23057 DF PROTO=TCP SPT=50660 DPT=9101 SEQ=1887167256 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA871880000000001030307) Dec 6 04:24:34 localhost python3.9[107549]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Dec 6 04:24:34 localhost systemd[1]: Reloading. Dec 6 04:24:34 localhost sshd[107555]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:24:34 localhost systemd-rc-local-generator[107578]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:24:34 localhost systemd-sysv-generator[107583]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:24:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:24:35 localhost python3.9[107683]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:24:36 localhost python3.9[107775]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:24:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49354 DF PROTO=TCP SPT=49924 DPT=9101 SEQ=547794554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA87D470000000001030307)
Dec 6 04:24:37 localhost python3.9[107869]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:24:38 localhost python3.9[107961]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:24:39 localhost python3.9[108053]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:24:39 localhost python3.9[108126]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013078.653776-563-65642013214954/.source _original_basename=.g_4f7vvd follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:24:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18922 DF PROTO=TCP SPT=38966 DPT=9100 SEQ=3168201046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA888C80000000001030307)
Dec 6 04:24:41 localhost python3.9[108218]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:24:41 localhost python3.9[108310]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 6 04:24:42 localhost python3.9[108402]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:24:43 localhost python3.9[108494]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:24:44 localhost python3.9[108567]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013083.160867-689-210204498398235/.source.yaml _original_basename=.98hvcnij follow=False checksum=462d7af485ea6b2a48391ea9b1e5d3d6aa88a07d force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:24:45 localhost python3.9[108659]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 6 04:24:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49355 DF PROTO=TCP SPT=49924 DPT=9101 SEQ=547794554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA89D880000000001030307)
Dec 6 04:24:46 localhost ansible-async_wrapper.py[108764]: Invoked with j156268852525 300 /home/zuul/.ansible/tmp/ansible-tmp-1765013085.8713858-761-42183657467230/AnsiballZ_edpm_os_net_config.py _
Dec 6 04:24:46 localhost ansible-async_wrapper.py[108767]: Starting module and watcher
Dec 6 04:24:46 localhost ansible-async_wrapper.py[108767]: Start watching 108768 (300)
Dec 6 04:24:46 localhost ansible-async_wrapper.py[108768]: Start module (108768)
Dec 6 04:24:46 localhost ansible-async_wrapper.py[108764]: Return async_wrapper task started.
Dec 6 04:24:46 localhost python3.9[108769]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False
Dec 6 04:24:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56264 DF PROTO=TCP SPT=60078 DPT=9105 SEQ=3846442780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA8A4070000000001030307)
Dec 6 04:24:47 localhost ansible-async_wrapper.py[108768]: Module complete (108768)
Dec 6 04:24:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56265 DF PROTO=TCP SPT=60078 DPT=9105 SEQ=3846442780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA8AC070000000001030307)
Dec 6 04:24:51 localhost python3.9[108861]: ansible-ansible.legacy.async_status Invoked with jid=j156268852525.108764 mode=status _async_dir=/root/.ansible_async
Dec 6 04:24:51 localhost python3.9[108920]: ansible-ansible.legacy.async_status Invoked with jid=j156268852525.108764 mode=cleanup _async_dir=/root/.ansible_async
Dec 6 04:24:51 localhost ansible-async_wrapper.py[108767]: Done in kid B.
Dec 6 04:24:52 localhost python3.9[109012]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:24:53 localhost python3.9[109085]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013091.7442873-827-273424394602599/.source.returncode _original_basename=.u2vb_eeh follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:24:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12896 DF PROTO=TCP SPT=35556 DPT=9882 SEQ=2080570044 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA8BBBF0000000001030307)
Dec 6 04:24:54 localhost python3.9[109177]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:24:54 localhost python3.9[109250]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013093.6093526-875-200279873551010/.source.cfg _original_basename=.rwt7d060 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:24:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15006 DF PROTO=TCP SPT=35690 DPT=9882 SEQ=2033637050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA8C3870000000001030307)
Dec 6 04:24:55 localhost python3.9[109342]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 04:24:55 localhost systemd[1]: Reloading Network Manager...
Dec 6 04:24:55 localhost NetworkManager[5965]: [1765013095.6113] audit: op="reload" arg="0" pid=109346 uid=0 result="success"
Dec 6 04:24:55 localhost NetworkManager[5965]: [1765013095.6118] config: signal: SIGHUP (no changes from disk)
Dec 6 04:24:55 localhost systemd[1]: Reloaded Network Manager.
Dec 6 04:24:56 localhost systemd[1]: session-27.scope: Deactivated successfully.
Dec 6 04:24:56 localhost systemd[1]: session-27.scope: Consumed 35.127s CPU time.
Dec 6 04:24:56 localhost systemd-logind[760]: Session 27 logged out. Waiting for processes to exit.
Dec 6 04:24:56 localhost systemd-logind[760]: Removed session 27.
Dec 6 04:25:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15010 DF PROTO=TCP SPT=56936 DPT=9101 SEQ=3808905510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA8D6C70000000001030307)
Dec 6 04:25:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15011 DF PROTO=TCP SPT=56936 DPT=9101 SEQ=3808905510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA8DAC70000000001030307)
Dec 6 04:25:01 localhost sshd[109361]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:25:01 localhost systemd-logind[760]: New session 28 of user zuul.
Dec 6 04:25:01 localhost systemd[1]: Started Session 28 of User zuul.
Dec 6 04:25:02 localhost python3.9[109454]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:25:03 localhost python3.9[109548]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 6 04:25:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39131 DF PROTO=TCP SPT=46546 DPT=9101 SEQ=3154189872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA8E7870000000001030307)
Dec 6 04:25:05 localhost python3.9[109701]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:25:05 localhost systemd-logind[760]: Session 28 logged out. Waiting for processes to exit.
Dec 6 04:25:05 localhost systemd[1]: session-28.scope: Deactivated successfully.
Dec 6 04:25:05 localhost systemd[1]: session-28.scope: Consumed 2.303s CPU time.
Dec 6 04:25:05 localhost systemd-logind[760]: Removed session 28.
Dec 6 04:25:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15013 DF PROTO=TCP SPT=56936 DPT=9101 SEQ=3808905510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA8F2870000000001030307)
Dec 6 04:25:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30438 DF PROTO=TCP SPT=42826 DPT=9100 SEQ=695863783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA8FDC80000000001030307)
Dec 6 04:25:12 localhost sshd[109717]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:25:13 localhost systemd-logind[760]: New session 29 of user zuul.
Dec 6 04:25:13 localhost systemd[1]: Started Session 29 of User zuul.
Dec 6 04:25:14 localhost python3.9[109810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:25:15 localhost python3.9[109904]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:25:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15014 DF PROTO=TCP SPT=56936 DPT=9101 SEQ=3808905510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA913870000000001030307)
Dec 6 04:25:16 localhost python3.9[110000]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 6 04:25:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27626 DF PROTO=TCP SPT=40828 DPT=9105 SEQ=438402349 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA919480000000001030307)
Dec 6 04:25:17 localhost python3.9[110054]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 6 04:25:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27627 DF PROTO=TCP SPT=40828 DPT=9105 SEQ=438402349 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA921470000000001030307)
Dec 6 04:25:21 localhost python3.9[110148]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 6 04:25:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12284 DF PROTO=TCP SPT=57626 DPT=9882 SEQ=659913468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA930EE0000000001030307)
Dec 6 04:25:23 localhost python3.9[110303]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:25:24 localhost python3.9[110395]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:25:25 localhost python3.9[110499]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:25:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37179 DF PROTO=TCP SPT=54462 DPT=9102 SEQ=1904400315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA93B870000000001030307)
Dec 6 04:25:26 localhost python3.9[110547]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:25:26 localhost python3.9[110639]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:25:27 localhost python3.9[110687]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:25:28 localhost python3.9[110779]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:25:28 localhost python3.9[110871]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:25:29 localhost python3.9[110963]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:25:29 localhost python3.9[111055]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:25:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51790 DF PROTO=TCP SPT=34776 DPT=9101 SEQ=186681722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA94BF70000000001030307)
Dec 6 04:25:30 localhost python3.9[111147]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 6 04:25:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51791 DF PROTO=TCP SPT=34776 DPT=9101 SEQ=186681722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA950070000000001030307)
Dec 6 04:25:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49357 DF PROTO=TCP SPT=49924 DPT=9101 SEQ=547794554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA95B870000000001030307)
Dec 6 04:25:34 localhost python3.9[111241]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:25:35 localhost python3.9[111335]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:25:36 localhost python3.9[111427]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:25:37 localhost python3.9[111519]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:25:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18925 DF PROTO=TCP SPT=38966 DPT=9100 SEQ=3168201046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA967870000000001030307)
Dec 6 04:25:38 localhost python3.9[111612]: ansible-service_facts Invoked
Dec 6 04:25:38 localhost network[111629]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 6 04:25:38 localhost network[111630]: 'network-scripts' will be removed from distribution in near future.
Dec 6 04:25:38 localhost network[111631]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 6 04:25:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16509 DF PROTO=TCP SPT=46978 DPT=9100 SEQ=2628963053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA973070000000001030307)
Dec 6 04:25:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:25:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51794 DF PROTO=TCP SPT=34776 DPT=9101 SEQ=186681722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA987870000000001030307)
Dec 6 04:25:46 localhost python3.9[111953]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 6 04:25:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30691 DF PROTO=TCP SPT=57602 DPT=9105 SEQ=3137570316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA98E470000000001030307)
Dec 6 04:25:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30692 DF PROTO=TCP SPT=57602 DPT=9105 SEQ=3137570316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA996470000000001030307)
Dec 6 04:25:51 localhost python3.9[112047]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 6 04:25:53 localhost python3.9[112139]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:25:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30693 DF PROTO=TCP SPT=57602 DPT=9105 SEQ=3137570316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA9A6070000000001030307)
Dec 6 04:25:53 localhost python3.9[112214]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013152.5547383-656-198672240497073/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:25:55 localhost python3.9[112308]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:25:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12289 DF PROTO=TCP SPT=57626 DPT=9882 SEQ=659913468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA9AD880000000001030307)
Dec 6 04:25:55 localhost python3.9[112383]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013154.7091677-701-102172822552709/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:25:57 localhost python3.9[112477]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:25:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37180 DF PROTO=TCP SPT=54462 DPT=9102 SEQ=1904400315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA9B9870000000001030307)
Dec 6 04:25:59 localhost python3.9[112571]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 6 04:26:00 localhost python3.9[112625]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 04:26:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33030 DF PROTO=TCP SPT=48652 DPT=9101 SEQ=2660784305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA9C5470000000001030307)
Dec 6 04:26:01 localhost python3.9[112719]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 6 04:26:02 localhost python3.9[112773]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 04:26:02 localhost systemd[1]: Stopping NTP client/server...
Dec 6 04:26:02 localhost chronyd[37018]: chronyd exiting
Dec 6 04:26:02 localhost systemd[1]: chronyd.service: Deactivated successfully.
Dec 6 04:26:02 localhost systemd[1]: Stopped NTP client/server.
Dec 6 04:26:02 localhost systemd[1]: Starting NTP client/server...
Dec 6 04:26:02 localhost chronyd[112781]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 6 04:26:02 localhost chronyd[112781]: Frequency -30.536 +/- 0.288 ppm read from /var/lib/chrony/drift
Dec 6 04:26:02 localhost chronyd[112781]: Loaded seccomp filter (level 2)
Dec 6 04:26:02 localhost systemd[1]: Started NTP client/server.
Dec 6 04:26:03 localhost systemd-logind[760]: Session 29 logged out. Waiting for processes to exit.
Dec 6 04:26:03 localhost systemd[1]: session-29.scope: Deactivated successfully.
Dec 6 04:26:03 localhost systemd[1]: session-29.scope: Consumed 28.262s CPU time.
Dec 6 04:26:03 localhost systemd-logind[760]: Removed session 29.
Dec 6 04:26:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15016 DF PROTO=TCP SPT=56936 DPT=9101 SEQ=3808905510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA9D1870000000001030307)
Dec 6 04:26:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33032 DF PROTO=TCP SPT=48652 DPT=9101 SEQ=2660784305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA9DD080000000001030307)
Dec 6 04:26:09 localhost sshd[112797]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:26:09 localhost systemd-logind[760]: New session 30 of user zuul.
Dec 6 04:26:09 localhost systemd[1]: Started Session 30 of User zuul.
Dec 6 04:26:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11463 DF PROTO=TCP SPT=43588 DPT=9100 SEQ=492871193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA9E8480000000001030307)
Dec 6 04:26:10 localhost python3.9[112890]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:26:12 localhost python3.9[112986]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:12 localhost python3.9[113091]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:13 localhost python3.9[113139]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.9_jy2evz recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:14 localhost python3.9[113231]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:15 localhost python3.9[113306]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013173.9451358-143-57566518039444/.source _original_basename=.rogb5h9f follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33033 DF PROTO=TCP SPT=48652 DPT=9101 SEQ=2660784305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA9FD870000000001030307)
Dec 6 04:26:15 localhost python3.9[113398]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:26:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9133 DF PROTO=TCP SPT=35812 DPT=9105 SEQ=3414218185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DA9FF880000000001030307)
Dec 6 04:26:16 localhost python3.9[113490]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:17 localhost python3.9[113563]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013176.1123154-215-173279750290475/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:26:18 localhost python3.9[113655]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:18 localhost python3.9[113728]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013177.212091-215-93963168181378/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:26:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9135 DF PROTO=TCP SPT=35812 DPT=9105 SEQ=3414218185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA0B870000000001030307)
Dec 6 04:26:19 localhost python3.9[113820]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None
modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:26:20 localhost python3.9[113912]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:26:20 localhost python3.9[113985]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013179.7953095-326-188281586132342/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:26:21 localhost python3.9[114077]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:26:21 localhost python3.9[114150]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013180.9548144-371-77917283144577/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:26:23 localhost python3.9[114242]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True 
name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:26:23 localhost systemd[1]: Reloading. Dec 6 04:26:23 localhost systemd-rc-local-generator[114263]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:26:23 localhost systemd-sysv-generator[114266]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:26:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:26:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9136 DF PROTO=TCP SPT=35812 DPT=9105 SEQ=3414218185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA1B470000000001030307) Dec 6 04:26:23 localhost systemd[1]: Reloading. Dec 6 04:26:23 localhost systemd-rc-local-generator[114302]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:26:23 localhost systemd-sysv-generator[114307]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:26:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:26:23 localhost systemd[1]: Starting EDPM Container Shutdown... Dec 6 04:26:23 localhost systemd[1]: Finished EDPM Container Shutdown. 
Dec 6 04:26:24 localhost python3.9[114410]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:24 localhost python3.9[114483]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013183.867372-440-106746209566252/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:25 localhost python3.9[114575]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3971 DF PROTO=TCP SPT=34496 DPT=9102 SEQ=1048410104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA25880000000001030307)
Dec 6 04:26:26 localhost python3.9[114648]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013185.0217507-485-234461364438223/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:27 localhost python3.9[114740]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 04:26:27 localhost systemd[1]: Reloading.
Dec 6 04:26:27 localhost systemd-rc-local-generator[114767]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:26:27 localhost systemd-sysv-generator[114770]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:26:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:26:27 localhost systemd[1]: Starting Create netns directory...
Dec 6 04:26:27 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 6 04:26:27 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 6 04:26:27 localhost systemd[1]: Finished Create netns directory.
Dec 6 04:26:29 localhost python3.9[114872]: ansible-ansible.builtin.service_facts Invoked
Dec 6 04:26:29 localhost network[114889]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 6 04:26:29 localhost network[114890]: 'network-scripts' will be removed from distribution in near future.
Dec 6 04:26:29 localhost network[114891]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 6 04:26:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22178 DF PROTO=TCP SPT=35242 DPT=9101 SEQ=1997555236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA36580000000001030307)
Dec 6 04:26:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:26:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22179 DF PROTO=TCP SPT=35242 DPT=9101 SEQ=1997555236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA3A470000000001030307)
Dec 6 04:26:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11465 DF PROTO=TCP SPT=43588 DPT=9100 SEQ=492871193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA49870000000001030307)
Dec 6 04:26:37 localhost python3.9[115093]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22181 DF PROTO=TCP SPT=35242 DPT=9101 SEQ=1997555236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA52070000000001030307)
Dec 6 04:26:38 localhost sshd[115150]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:26:38 localhost python3.9[115170]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013196.721036-608-114730076715925/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:39 localhost python3.9[115263]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 04:26:39 localhost systemd[1]: Reloading OpenSSH server daemon...
Dec 6 04:26:39 localhost systemd[1]: Reloaded OpenSSH server daemon.
Dec 6 04:26:39 localhost sshd[96875]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:26:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21039 DF PROTO=TCP SPT=55972 DPT=9100 SEQ=1277709320 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA5D470000000001030307)
Dec 6 04:26:40 localhost python3.9[115359]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:40 localhost python3.9[115451]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:41 localhost python3.9[115524]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013200.4192414-701-272870463915854/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:42 localhost python3.9[115616]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 6 04:26:42 localhost systemd[1]: Starting Time & Date Service...
Dec 6 04:26:42 localhost systemd[1]: Started Time & Date Service.
Dec 6 04:26:43 localhost python3.9[115712]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:43 localhost python3.9[115804]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:44 localhost python3.9[115877]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013203.4477537-806-15074206671859/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:45 localhost python3.9[115969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22182 DF PROTO=TCP SPT=35242 DPT=9101 SEQ=1997555236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA71880000000001030307)
Dec 6 04:26:45 localhost python3.9[116042]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013204.6602523-851-4080751160717/.source.yaml _original_basename=.ct_o8zt0 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:46 localhost python3.9[116134]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:47 localhost python3.9[116209]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013206.0328603-896-210111743560005/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37484 DF PROTO=TCP SPT=48190 DPT=9105 SEQ=3995697440 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA78C70000000001030307)
Dec 6 04:26:47 localhost python3.9[116301]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:26:48 localhost python3.9[116394]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:26:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37485 DF PROTO=TCP SPT=48190 DPT=9105 SEQ=3995697440 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA80C70000000001030307)
Dec 6 04:26:49 localhost python3[116487]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 6 04:26:51 localhost python3.9[116579]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:52 localhost python3.9[116652]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013210.062452-1013-215339153733471/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:52 localhost python3.9[116744]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:53 localhost python3.9[116817]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013212.2560964-1058-81776580468900/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27936 DF PROTO=TCP SPT=60220 DPT=9882 SEQ=3265270128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA907E0000000001030307)
Dec 6 04:26:54 localhost python3.9[116909]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:54 localhost python3.9[116982]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013213.5272815-1103-18867467728972/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:55 localhost python3.9[117074]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30426 DF PROTO=TCP SPT=53216 DPT=9102 SEQ=685637698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAA99870000000001030307)
Dec 6 04:26:55 localhost python3.9[117147]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013214.7119331-1148-85728747127876/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:56 localhost python3.9[117239]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:26:56 localhost python3.9[117312]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013215.8531842-1193-251397605331454/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:57 localhost python3.9[117404]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:58 localhost python3.9[117496]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:26:58 localhost python3.9[117591]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:26:58 localhost systemd-journald[38691]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Dec 6 04:26:58 localhost systemd-journald[38691]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 6 04:26:58 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 6 04:26:58 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 6 04:26:59 localhost python3.9[117685]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:27:00 localhost python3.9[117777]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:27:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13966 DF PROTO=TCP SPT=51858 DPT=9101 SEQ=4059185293 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAAAB870000000001030307)
Dec 6 04:27:00 localhost python3.9[117869]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 6 04:27:01 localhost python3.9[117962]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 6 04:27:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22183 DF PROTO=TCP SPT=35242 DPT=9101 SEQ=1997555236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAAB1880000000001030307)
Dec 6 04:27:02 localhost systemd[1]: session-30.scope: Deactivated successfully.
Dec 6 04:27:02 localhost systemd[1]: session-30.scope: Consumed 27.938s CPU time.
Dec 6 04:27:02 localhost systemd-logind[760]: Session 30 logged out. Waiting for processes to exit.
Dec 6 04:27:02 localhost systemd-logind[760]: Removed session 30.
Dec 6 04:27:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33035 DF PROTO=TCP SPT=48652 DPT=9101 SEQ=2660784305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAABB870000000001030307)
Dec 6 04:27:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11466 DF PROTO=TCP SPT=43588 DPT=9100 SEQ=492871193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAAC7880000000001030307)
Dec 6 04:27:08 localhost sshd[117978]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:27:08 localhost systemd-logind[760]: New session 31 of user zuul.
Dec 6 04:27:08 localhost systemd[1]: Started Session 31 of User zuul.
Dec 6 04:27:09 localhost python3.9[118073]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 6 04:27:10 localhost python3.9[118165]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:27:12 localhost python3.9[118259]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec 6 04:27:12 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 6 04:27:13 localhost python3.9[118353]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.znvcoyru follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:27:13 localhost python3.9[118428]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.znvcoyru mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013232.764366-189-274206774142258/.source.znvcoyru _original_basename=.erqrt18t follow=False checksum=0182211fb6a89f854ea16b14eb4c8a6fc324efe7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:27:16 localhost python3.9[118520]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:27:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33861 DF PROTO=TCP SPT=59082 DPT=9105 SEQ=3624415597 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAAE9E50000000001030307)
Dec 6 04:27:18 localhost python3.9[118612]: ansible-ansible.builtin.blockinfile Invoked with block=np0005548801.ooo.test,192.168.122.108,np0005548801* ssh-rsa
AAAAB3NzaC1yc2EAAAADAQABAAABgQDd+aRSgSo36R7PqjzbUFG6PyAV9yxnChvJCvJybGQ0fSqYn6bKlFCiIHXzHKD8SdMAkgHq/v0qRoKF+rPm6k1lIEhMqRCIbIKr7X1ExClL3LRXDU6nZNh1OPKSeyyXdQ4+dbtPwOE9fRbFMmlhp22/WNvpqnFBAHy7ytb5ie8BEEeVsrh0JHPQfx/59QNWSiJTOIi9yb9XyVG8f8C6AfXKSfGVbltlGVWboeIefKKq1fUdRTuQ8CVeyF76G9zniI+2HG6xoark7XcV5VjS3PyloP9UXYrrU57aBpOoM4AmFgMuEnk1x+B1BLODxV6ZAh4/NpO3XTjqnLFCYZxljuPGAB3TPO3/E93qFOYGadTSDlHpaP/eYtknlUJuDK4iGwRTz36NcmRLROhSUSrGb/QI61dDtTwHHk8RyLKqgMZhWVN7CXYYb38/HnMHnWgMiHXSc/xaVuYgAhrBn5cO5losUc3ZZhmgvF1hN4idTmgQ57+wm2VQzLKjyrQmOAOZ+iU=#012np0005548801.ooo.test,192.168.122.108,np0005548801* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIM7jH7Kv/s3kIo2lIAkCKjmbIRo4pcArz4VurGnBBsTR#012np0005548801.ooo.test,192.168.122.108,np0005548801* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEm61zN6svXhcKvex39rkb2XxkXWl0CjSv4irSeGSbdjA/AGsrFxvArozK/1S4UxdqlVcO4wr3xPoVY72/BFvS4=#012np0005548798.ooo.test,192.168.122.106,np0005548798* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUJonDoETSicDbuGTq9YYOlM9+JMNMlppTltMCPPFSDxsaxfHMWFXbVc3BguCERMxx4AM5SbDvo6UsBdnLmfZN7cTtEzeMReJ2/+qBvX57cAqBKD4L/03GSJnTS10UNl3oJJAXN8T2uYrA6PZiUIBctay1auslD+tEj8tx8v4GMF0ZGtRug2hdcFvvV3n3uLoejSA0/wcBeDkDL3ZMfkdKMtC6jZOYuv9O+4tgWcNqb6+NnKvrL4qmDjTff8s+PhsoMwLqAhLIMwZ0/Xe6k5VMIAormSUzoZ+oiV6+Z9G56Ju9sDty+beWeTwRlxg7ZVCfN5OQTd9vK55AfNHGE336g0cvzjVgrPy3JICFQZcuHJxzJMdfVFzaP5nT1aAia0JqFs+rwoXlEH+P7d/n+LotsslEhTUG6gDudBH7qTDhlwU1jKVPrmVYwb+qJ0VipSspEAfOQKO6fGheajO8C+I8lQtvXxUtY308i6Yvwhu+p2S8q6qjeIUaKyDi6JdnK5s=#012np0005548798.ooo.test,192.168.122.106,np0005548798* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHDRwlKt0hdP5MTDsxzRQesRd4x5FalDOf2BZ54EmR+E#012np0005548798.ooo.test,192.168.122.106,np0005548798* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ2ADTAmWP1ddsdgTbD59tjw4/UtXdlH60DItJCUsmQJqdXW3sy2Nraymru53i3G3oIx4Du4vemth5M2m4d/7qI=#012np0005548796.ooo.test,192.168.122.104,np0005548796* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDP/HZnhHGeahVyDtfOIsLMqJDkZJEo1/EuutMSa4wznu9eEe/zWgWZLNcawK29WJO7GczQI0pDvsi/VrFfOXwuhnY3nwqoWaOnABNpYRc2NsAZ234FpJWCUOI05Jds6Z4v3086SYzyhXQIfayAiVujKg9WBsUNql9QfBX7XUtdG8LUz0u2z2cHpyK1Md4YA69eTdiUb8zqGDM0vpah67KtFr6AT0ZLGCkED32lXMIlLCFdYQs0Rx+76zigjilN0qiJTBu+7uiDEpX0Oux9lXXKbIBA4NHxT2tj9NPSemTYb0yftsEnJvpHL6k9T/Ss7b3pM8khdKuXz/mUGTy8jhBrmNBMmoFYQjPdIAMr8sxNRXFklju3b1s9OXjmPkPII4kTj0vVbGnEsbfuu6K0K34ytAgwK3w9PH1ByrUyMMmRiFK1NtmJoh33TMtNZ5FJfe2bXq8VyRgA91P1DwrO42ycJFYXD1+YLEtNebeCWtxdVNqQ8FdUvccAAUKU848DbmM=#012np0005548796.ooo.test,192.168.122.104,np0005548796* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCJPnxnLxkhFbQRD0Qg/D7gsNO/CV2VV+8Ib+t3sxlY#012np0005548796.ooo.test,192.168.122.104,np0005548796* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMylSzguu4DIEqGzAIVekUtj7Qgu5lY2DbTECWkfFOfFCYSIav4JG5alPCzQZbWyZw0vOChGABFsviJn+G0uZtI=#012np0005548799.ooo.test,192.168.122.107,np0005548799* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCwYtRu2haMMHw4cIuwtRJ6v37XYMvc+5E3altSnM1h1Q+53FIKrKDa9BIODMtjhbtMyN+HN3eWqL60ChMP51Da9JhBJeM9sVVIPXGh97hLpLgZso6OOY5J8o2EHhuzRlJH446hE5M91YUyeDxYmDK7MKIB0stq2wQmtk/ZvwKOtPB+bY465wsL/PWdMdF5Gi6mfVI7/mCQg6Z2eSHSv69B2m4bVTj+BfnWPkLCLLKqRk+gWcOx8JWguNpe6suMytF5nxsUDwNvIhf12owZtdf/Sz3NwCtlMAu1am9ovlHa3kVfcI23+BG0yIKUSCGRrcAZiiuWp2+9lZMy4alxCfIlMzmSlXGvU6l+qXXiw529U13b2jVnawPFfL7ckykHtomrp5aHN5oG57HvXNvG9OT51YgpNoFPymmV25vdPvbiK4M8GID0wN098W4I1wu3gjdYyM0DkpADEsHBwkGSGT2opv7a2WgIbbrFfSrYe1Eld2FNBDuVo458/BosW/JGhCM=#012np0005548799.ooo.test,192.168.122.107,np0005548799* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIE5tEm7I4zPaGmz9Z+Sf+HA50c9qAUxE1ntfrsg9BwPk#012np0005548799.ooo.test,192.168.122.107,np0005548799* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCXq8DQhCyGLJHPF116AvSKMr1NdvnSDwKn6fgQmHY+Jpb8YywYfMC35usaht5YxiZ2zuRsX3Y9I4pX9C5kBW5c=#012np0005548795.ooo.test,192.168.122.103,np0005548795* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQD3KU9mb6p28DebxEf8mP7XCcYgi43vzXLS1Xubej5l9h+8Rxn/ktXBwRzx/PBl3Jx6i4DJdKsaVn95cgc5pCWvoFdb4KYzCYevvoKyb09GDHVu2Lg7emEiU2vGo5+l6iIq21eQCqydMat04GLM4HxW9iQNb6Wsyosx4ETls3AJT5Xyor+Mb2RO98BphGiFrYdKSkDNqr5O4WxFJZKCLOb7jGtCyU7+ufIWlf6Ek+voyFieFqaT+IQatNZR2Ca6amVR2z+HsXgI3jihiv8hN2JloMneS/xVFbQhPDg0Gz463t4cyOJ1STI/QN6swtr75kFwQxQLatvjFGzMKoRWbfwM3qSzjWfDsHcftGqzMWAoZykr/2DtUm4P3B5tE8tSvNduQeq/QmxlPSaPlYTfFDN08wUl0NL8wp7NfcpndUpbGNOR/U6r+K5Df3OrKuI5rfaBBEd1YpLS4W/ichiMBUBbuR6YEgsMN/CwIDPlMEl2/VJJEK5CRK4vrQEGx7ac1S0=#012np0005548795.ooo.test,192.168.122.103,np0005548795* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJLY8LO1XH7237lWlevoPfkQi8/5mU/VWNaAeSZxG5Zp#012np0005548795.ooo.test,192.168.122.103,np0005548795* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGo2KOOFxubzvYy4e+z1TJf1C+oFKPA/MQ6aU4o0u+aYDGt9hG0UmKioaEGYFa6Wy6dwUmHvJKtgNfVn2R9yB50=#012np0005548797.ooo.test,192.168.122.105,np0005548797* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDBZTxICKoVZWZG3oNtrLw3Aw+m5nWTJUqSEpXuFquaYOpqPDtsfIQY7Z9Apj4A6n8BJn0joYg2zQhNqs/+O7MNR1mf/ywXJzvZ0BoDWFISNWvWOom/sJKWs6UyxSo9rlAeWH4bAy12WAgF+xmvBaRj74ZTk6AmIw8loYWgRUC8K7r5uVWZ+FWPMqGAeKTjGhFUuWhU4zwB4pLwferi73BmQ32IDnSEOcwMUWbkXoN99JVByb0GXPZlk+wRMc1CrTMzS2rWsDFNqmKAhsL0eKVqNz7sXRA5djfpsSob4SqC96gQpX5lIhfc3CYcFc6HA7SLrGaky/wmmP949K02dqviQeUOqpM4pllYBCJKLZky/vWiGaUqg6aBZ+lSfWxBXz+5HeymsvnJs+UUaYYNF7WoLTAzxKoegITIKgYmip37nNxWApeDVYOQEdGRIlF4Ge7q4ZteT1rk2lWeqUpNMXpeKijqhmAefCfsf4Hpc3t6dPKFvSuHrKv/MzYO1+Zn4Ic=#012np0005548797.ooo.test,192.168.122.105,np0005548797* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILw6ImRHofTo9Mn4khI0nFLxLMba5wfHKciGvscK+bD3#012np0005548797.ooo.test,192.168.122.105,np0005548797* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBC0uB/y2ldk42DWN0l6E1wZ3I5tPsJouLdy4uRYKgZ2l86iUu8L8GdIn5FUKkkbxx5i4quEoTJt8PGC46Cd02mM=#012 create=True mode=0644 path=/tmp/ansible.znvcoyru state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN 
marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:27:20 localhost python3.9[118704]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.znvcoyru' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:27:21 localhost python3.9[118798]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.znvcoyru state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:27:22 localhost systemd[1]: session-31.scope: Deactivated successfully. Dec 6 04:27:22 localhost systemd[1]: session-31.scope: Consumed 4.215s CPU time. Dec 6 04:27:22 localhost systemd-logind[760]: Session 31 logged out. Waiting for processes to exit. Dec 6 04:27:22 localhost systemd-logind[760]: Removed session 31. 
Dec 6 04:27:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45746 DF PROTO=TCP SPT=51012 DPT=9882 SEQ=1484019285 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB05AF0000000001030307)
Dec 6 04:27:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52439 DF PROTO=TCP SPT=53618 DPT=9102 SEQ=48390082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB086A0000000001030307)
Dec 6 04:27:28 localhost sshd[118813]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:27:28 localhost systemd-logind[760]: New session 32 of user zuul.
Dec 6 04:27:28 localhost systemd[1]: Started Session 32 of User zuul.
Dec 6 04:27:29 localhost python3.9[118906]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:27:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22034 DF PROTO=TCP SPT=47926 DPT=9101 SEQ=3289398072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB20B70000000001030307)
Dec 6 04:27:31 localhost python3.9[119002]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 6 04:27:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15140 DF PROTO=TCP SPT=58824 DPT=9100 SEQ=259924725 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB2BFB0000000001030307)
Dec 6 04:27:33 localhost python3.9[119096]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 04:27:35 localhost python3.9[119189]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:27:35 localhost python3.9[119282]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:27:36 localhost python3.9[119376]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:27:37 localhost python3.9[119471]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:27:37 localhost systemd[1]: session-32.scope: Deactivated successfully.
Dec 6 04:27:37 localhost systemd[1]: session-32.scope: Consumed 3.994s CPU time.
Dec 6 04:27:37 localhost systemd-logind[760]: Session 32 logged out. Waiting for processes to exit.
Dec 6 04:27:37 localhost systemd-logind[760]: Removed session 32.
Dec 6 04:27:43 localhost sshd[119487]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:27:43 localhost systemd-logind[760]: New session 33 of user zuul.
Dec 6 04:27:43 localhost systemd[1]: Started Session 33 of User zuul.
Dec 6 04:27:44 localhost python3.9[119580]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:27:45 localhost python3.9[119676]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 6 04:27:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20206 DF PROTO=TCP SPT=37548 DPT=9105 SEQ=3728894940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB5F150000000001030307)
Dec 6 04:27:46 localhost python3.9[119730]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 6 04:27:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20207 DF PROTO=TCP SPT=37548 DPT=9105 SEQ=3728894940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB63070000000001030307)
Dec 6 04:27:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20208 DF PROTO=TCP SPT=37548 DPT=9105 SEQ=3728894940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB6B070000000001030307)
Dec 6 04:27:50 localhost python3.9[119822]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:27:52 localhost python3.9[119915]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:27:52 localhost python3.9[120007]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:27:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20209 DF PROTO=TCP SPT=37548 DPT=9105 SEQ=3728894940 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB7AC80000000001030307)
Dec 6 04:27:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52712 DF PROTO=TCP SPT=48364 DPT=9882 SEQ=4024331905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB7ADE0000000001030307)
Dec 6 04:27:53 localhost python3.9[120099]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated#012Core libraries or services have been updated since boot-up:#012 * systemd#012#012Reboot is required to fully utilize these updates.#012More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:27:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25381 DF PROTO=TCP SPT=37880 DPT=9102 SEQ=3095273692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB7D990000000001030307)
Dec 6 04:27:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52713 DF PROTO=TCP SPT=48364 DPT=9882 SEQ=4024331905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB7EC70000000001030307)
Dec 6 04:27:54 localhost python3.9[120189]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 6 04:27:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25382 DF PROTO=TCP SPT=37880 DPT=9102 SEQ=3095273692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB81870000000001030307)
Dec 6 04:27:55 localhost python3.9[120279]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:27:55 localhost python3.9[120371]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:27:56 localhost systemd[1]: session-33.scope: Deactivated successfully.
Dec 6 04:27:56 localhost systemd[1]: session-33.scope: Consumed 8.796s CPU time.
Dec 6 04:27:56 localhost systemd-logind[760]: Session 33 logged out. Waiting for processes to exit.
Dec 6 04:27:56 localhost systemd-logind[760]: Removed session 33.
Dec 6 04:27:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52714 DF PROTO=TCP SPT=48364 DPT=9882 SEQ=4024331905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB86C80000000001030307)
Dec 6 04:27:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25383 DF PROTO=TCP SPT=37880 DPT=9102 SEQ=3095273692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB89880000000001030307)
Dec 6 04:28:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51927 DF PROTO=TCP SPT=60214 DPT=9101 SEQ=4198955684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB95E90000000001030307)
Dec 6 04:28:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52715 DF PROTO=TCP SPT=48364 DPT=9882 SEQ=4024331905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB96870000000001030307)
Dec 6 04:28:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51928 DF PROTO=TCP SPT=60214 DPT=9101 SEQ=4198955684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAB9A070000000001030307)
Dec 6 04:28:01 localhost sshd[120386]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:28:01 localhost systemd-logind[760]: New session 34 of user zuul.
Dec 6 04:28:01 localhost systemd[1]: Started Session 34 of User zuul.
Dec 6 04:28:02 localhost python3.9[120479]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:28:04 localhost python3.9[120575]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:28:05 localhost python3.9[120667]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:28:05 localhost python3.9[120759]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40620 DF PROTO=TCP SPT=42668 DPT=9100 SEQ=1481247212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DABAD470000000001030307)
Dec 6 04:28:06 localhost python3.9[120832]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013285.2878785-161-228609674249337/.source.crt _original_basename=np0005548798.ooo.test-tls.crt follow=False checksum=1ab76074a88ae8df4c343d43f8b8898a42f6e463 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:06 localhost python3.9[120924]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51930 DF PROTO=TCP SPT=60214 DPT=9101 SEQ=4198955684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DABB1C70000000001030307)
Dec 6 04:28:08 localhost python3.9[120997]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013286.529109-161-17867241803943/.source.crt _original_basename=np0005548798.ooo.test-ca.crt follow=False checksum=9f55ae15a25544860c141998074fdeb6ae0b6b42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:08 localhost python3.9[121089]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:10 localhost python3.9[121162]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013288.5261211-161-80485989267621/.source.key _original_basename=np0005548798.ooo.test-tls.key follow=False checksum=cbf9d1ac99437f79f8f44e01e3bbc92afbf9c867 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40621 DF PROTO=TCP SPT=42668 DPT=9100 SEQ=1481247212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DABBD070000000001030307)
Dec 6 04:28:10 localhost python3.9[121254]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:28:11 localhost python3.9[121346]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:28:11 localhost chronyd[112781]: Selected source 167.160.187.12 (pool.ntp.org)
Dec 6 04:28:11 localhost python3.9[121438]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:12 localhost python3.9[121511]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013291.5191274-344-92144838183977/.source.crt _original_basename=np0005548798.ooo.test-tls.crt follow=False checksum=5d12896bc37451261d3de6143520b49250401331 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:13 localhost python3.9[121603]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:13 localhost python3.9[121676]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013292.5954845-344-99044767427091/.source.crt _original_basename=np0005548798.ooo.test-ca.crt follow=False checksum=9f55ae15a25544860c141998074fdeb6ae0b6b42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:14 localhost python3.9[121768]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:14 localhost python3.9[121841]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013293.6726878-344-173641248346840/.source.key _original_basename=np0005548798.ooo.test-tls.key follow=False checksum=d9c37efb60cb854d63b47d5fea6db9cc2bd9d631 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:15 localhost python3.9[121933]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:28:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51931 DF PROTO=TCP SPT=60214 DPT=9101 SEQ=4198955684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DABD1870000000001030307)
Dec 6 04:28:16 localhost python3.9[122025]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:28:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28364 DF PROTO=TCP SPT=38340 DPT=9105 SEQ=518469516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DABD4440000000001030307)
Dec 6 04:28:16 localhost python3.9[122117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:18 localhost python3.9[122190]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013296.2140598-520-130016859194043/.source.crt _original_basename=np0005548798.ooo.test-tls.crt follow=False checksum=9628884c25f574475c79dfbc4ee51d9672081687 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:18 localhost python3.9[122282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:19 localhost python3.9[122355]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013298.15763-520-191777011710096/.source.crt _original_basename=np0005548798.ooo.test-ca.crt follow=False checksum=9f55ae15a25544860c141998074fdeb6ae0b6b42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28366 DF PROTO=TCP SPT=38340 DPT=9105 SEQ=518469516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DABE0470000000001030307)
Dec 6 04:28:20 localhost python3.9[122447]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:20 localhost python3.9[122520]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013299.764029-520-71193022617990/.source.key _original_basename=np0005548798.ooo.test-tls.key follow=False checksum=e4e797a8329b0caf07c196bed3ea38e666181d9a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:21 localhost python3.9[122612]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:28:22 localhost python3.9[122704]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:28:22 localhost python3.9[122796]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:23 localhost python3.9[122869]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013302.2316582-699-253723887968169/.source.crt _original_basename=np0005548798.ooo.test-tls.crt follow=False checksum=cba5e32c855f8278b52176c433fe1e056049c198 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28367 DF PROTO=TCP SPT=38340 DPT=9105 SEQ=518469516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DABF0080000000001030307)
Dec 6 04:28:23 localhost python3.9[122961]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:24 localhost python3.9[123034]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013303.3726108-699-48306605364122/.source.crt _original_basename=np0005548798.ooo.test-ca.crt follow=False checksum=9f55ae15a25544860c141998074fdeb6ae0b6b42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:24 localhost python3.9[123126]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52717 DF PROTO=TCP SPT=48364 DPT=9882 SEQ=4024331905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DABF7870000000001030307)
Dec 6 04:28:25 localhost python3.9[123199]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013304.4427924-699-245552899617151/.source.key _original_basename=np0005548798.ooo.test-tls.key follow=False checksum=e6e531b939b22e96cb38205b87d01125bb9c48b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:26 localhost python3.9[123291]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:28:27 localhost python3.9[123383]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:28 localhost python3.9[123456]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013306.8003547-910-52580678742437/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e9e428e84c1276c30c877c60b9a85275252eb420 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:28:29 localhost python3.9[123548]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:28:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28409 DF PROTO=TCP SPT=43410 DPT=9101 SEQ=4231456484 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC0B170000000001030307)
Dec 6 04:28:30 localhost python3.9[123640]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:28:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28410 DF PROTO=TCP SPT=43410 DPT=9101 SEQ=4231456484 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC0F070000000001030307)
Dec 6 04:28:31 localhost python3.9[123713]: ansible-ansible.legacy.copy Invoked with
dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013309.582931-984-163172570265085/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e9e428e84c1276c30c877c60b9a85275252eb420 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:28:31 localhost auditd[726]: Audit daemon rotating log files Dec 6 04:28:31 localhost python3.9[123805]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:28:32 localhost python3.9[123897]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:28:33 localhost python3.9[123970]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013312.125524-1060-182711267480885/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e9e428e84c1276c30c877c60b9a85275252eb420 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:28:33 localhost python3.9[124062]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root 
path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:28:34 localhost python3.9[124154]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:28:34 localhost python3.9[124227]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013313.8159761-1130-233465638365139/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e9e428e84c1276c30c877c60b9a85275252eb420 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:28:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40623 DF PROTO=TCP SPT=42668 DPT=9100 SEQ=1481247212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC1D870000000001030307) Dec 6 04:28:35 localhost python3.9[124319]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None 
attributes=None Dec 6 04:28:35 localhost python3.9[124411]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:28:36 localhost python3.9[124484]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013315.5757177-1201-66891087786686/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e9e428e84c1276c30c877c60b9a85275252eb420 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:28:37 localhost python3.9[124576]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:28:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28412 DF PROTO=TCP SPT=43410 DPT=9101 SEQ=4231456484 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC26C70000000001030307) Dec 6 04:28:37 localhost python3.9[124668]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:28:38 localhost python3.9[124741]: 
ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013317.3224459-1273-264087443469326/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e9e428e84c1276c30c877c60b9a85275252eb420 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:28:39 localhost sshd[124801]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:28:39 localhost python3.9[124835]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:28:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34850 DF PROTO=TCP SPT=55436 DPT=9100 SEQ=283283153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC32070000000001030307) Dec 6 04:28:40 localhost python3.9[124927]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:28:41 localhost python3.9[125000]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013319.8468122-1342-171205668574352/.source.pem _original_basename=tls-ca-bundle.pem follow=False 
checksum=e9e428e84c1276c30c877c60b9a85275252eb420 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:28:42 localhost python3.9[125092]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:28:42 localhost python3.9[125184]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:28:43 localhost python3.9[125257]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013322.373194-1409-219854294319572/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e9e428e84c1276c30c877c60b9a85275252eb420 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:28:43 localhost systemd[1]: session-34.scope: Deactivated successfully. Dec 6 04:28:43 localhost systemd[1]: session-34.scope: Consumed 25.546s CPU time. Dec 6 04:28:43 localhost systemd-logind[760]: Session 34 logged out. Waiting for processes to exit. Dec 6 04:28:43 localhost systemd-logind[760]: Removed session 34. 
Dec 6 04:28:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28413 DF PROTO=TCP SPT=43410 DPT=9101 SEQ=4231456484 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC47880000000001030307)
Dec 6 04:28:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25522 DF PROTO=TCP SPT=45162 DPT=9105 SEQ=1924738787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC49750000000001030307)
Dec 6 04:28:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25524 DF PROTO=TCP SPT=45162 DPT=9105 SEQ=1924738787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC55870000000001030307)
Dec 6 04:28:49 localhost sshd[125272]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:28:50 localhost systemd-logind[760]: New session 35 of user zuul.
Dec 6 04:28:50 localhost systemd[1]: Started Session 35 of User zuul.
Dec 6 04:28:50 localhost python3.9[125365]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:28:52 localhost python3.9[125461]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:28:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42713 DF PROTO=TCP SPT=36746 DPT=9882 SEQ=421927726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC653E0000000001030307)
Dec 6 04:28:53 localhost python3.9[125553]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:28:54 localhost python3.9[125643]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:28:55 localhost python3.9[125735]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 6 04:28:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13438 DF PROTO=TCP SPT=34408 DPT=9102 SEQ=3124304201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC6F870000000001030307)
Dec 6 04:28:56 localhost python3.9[125827]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 6 04:28:57 localhost python3.9[125881]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 6 04:29:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49963 DF PROTO=TCP SPT=54224 DPT=9101 SEQ=2522517281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC80470000000001030307)
Dec 6 04:29:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49964 DF PROTO=TCP SPT=54224 DPT=9101 SEQ=2522517281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC84470000000001030307)
Dec 6 04:29:02 localhost python3.9[125976]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 6 04:29:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34852 DF PROTO=TCP SPT=55436 DPT=9100 SEQ=283283153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC91880000000001030307)
Dec 6 04:29:05 localhost python3[126071]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012 rule:#012 proto: udp#012 dport: 4789#012- rule_name: 119 neutron geneve networks#012 rule:#012 proto: udp#012 dport: 6081#012 state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: OUTPUT#012 jump: NOTRACK#012 action: append#012 state: []#012- rule_name: 121 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: PREROUTING#012 jump: NOTRACK#012 action: append#012 state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 6 04:29:06 localhost python3.9[126163]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:29:06 localhost python3.9[126255]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:29:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40624 DF PROTO=TCP SPT=42668 DPT=9100 SEQ=1481247212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAC9B870000000001030307)
Dec 6 04:29:07 localhost python3.9[126303]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:29:07 localhost python3.9[126395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:29:08 localhost python3.9[126443]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.clv289yt recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:29:09 localhost python3.9[126535]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:29:09 localhost python3.9[126583]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:29:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11036 DF PROTO=TCP SPT=47448 DPT=9100 SEQ=3077810842 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DACA7480000000001030307)
Dec 6 04:29:10 localhost python3.9[126675]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:29:11 localhost python3[126768]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 6 04:29:12 localhost python3.9[126860]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:29:13 localhost python3.9[126935]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013351.5292325-431-207403851125022/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:29:14 localhost python3.9[127027]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:29:15 localhost python3.9[127102]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013353.506543-476-130889100338798/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:29:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49967 DF PROTO=TCP SPT=54224 DPT=9101 SEQ=2522517281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DACBB880000000001030307)
Dec 6 04:29:15 localhost python3.9[127194]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:29:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43109 DF PROTO=TCP SPT=54252 DPT=9105 SEQ=3442172430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DACBEA80000000001030307)
Dec 6 04:29:16 localhost python3.9[127269]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013355.2262201-521-61664299061928/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:29:16 localhost python3.9[127361]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:29:17 localhost python3.9[127436]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013356.4442081-566-145794339913608/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:29:18 localhost python3.9[127528]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:29:18 localhost python3.9[127603]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013357.5957325-611-145267506504735/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:29:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43111 DF PROTO=TCP SPT=54252 DPT=9105 SEQ=3442172430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DACCAC70000000001030307)
Dec 6 04:29:19 localhost python3.9[127695]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:29:20 localhost python3.9[127787]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:29:20 localhost python3.9[127882]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:29:21 localhost python3.9[127974]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:29:22 localhost python3.9[128067]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:29:22 localhost python3.9[128161]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:29:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24964 DF PROTO=TCP SPT=53384 DPT=9882 SEQ=601353872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DACDA6E0000000001030307)
Dec 6 04:29:23 localhost python3.9[128256]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:29:25 localhost python3.9[128346]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:29:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6183 DF PROTO=TCP SPT=56180 DPT=9102 SEQ=984297145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DACE3880000000001030307)
Dec 6 04:29:26 localhost python3.9[128439]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005548798.ooo.test external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:a2:0d:dc:1c" external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:29:26 localhost ovs-vsctl[128440]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005548798.ooo.test external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:a2:0d:dc:1c external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 6 04:29:27 localhost python3.9[128532]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:29:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13439 DF PROTO=TCP SPT=34408 DPT=9102 SEQ=3124304201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DACED870000000001030307)
Dec 6 04:29:28 localhost python3.9[128625]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:29:29 localhost python3.9[128719]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:29:30 localhost python3.9[128811]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:29:30 localhost python3.9[128859]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:29:31 localhost python3.9[128951]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:29:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41513 DF PROTO=TCP SPT=52664 DPT=9101 SEQ=2057404460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DACF9870000000001030307) Dec 6 04:29:31 localhost python3.9[128999]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:29:32 localhost python3.9[129091]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:29:32 localhost python3.9[129183]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:29:33 localhost python3.9[129231]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None 
setype=None attributes=None Dec 6 04:29:34 localhost python3.9[129323]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:29:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28415 DF PROTO=TCP SPT=43410 DPT=9101 SEQ=4231456484 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD05870000000001030307) Dec 6 04:29:34 localhost python3.9[129371]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:29:35 localhost python3.9[129463]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:29:35 localhost systemd[1]: Reloading. Dec 6 04:29:35 localhost systemd-sysv-generator[129494]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:29:35 localhost systemd-rc-local-generator[129487]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 04:29:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:29:37 localhost python3.9[129593]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:29:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41515 DF PROTO=TCP SPT=52664 DPT=9101 SEQ=2057404460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD11470000000001030307) Dec 6 04:29:37 localhost python3.9[129641]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:29:38 localhost python3.9[129733]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:29:38 localhost python3.9[129781]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:29:39 localhost python3.9[129873]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:29:39 localhost systemd[1]: Reloading. Dec 6 04:29:39 localhost systemd-rc-local-generator[129900]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:29:39 localhost systemd-sysv-generator[129903]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:29:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:29:39 localhost systemd[1]: Starting Create netns directory... Dec 6 04:29:40 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 6 04:29:40 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 6 04:29:40 localhost systemd[1]: Finished Create netns directory. 
Dec 6 04:29:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64006 DF PROTO=TCP SPT=52086 DPT=9100 SEQ=3320082136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD1C880000000001030307) Dec 6 04:29:40 localhost python3.9[130009]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:29:41 localhost python3.9[130101]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:29:41 localhost python3.9[130174]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013380.9447305-1343-174974276378949/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 6 04:29:42 localhost python3.9[130266]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:29:43 localhost python3.9[130358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:29:44 localhost python3.9[130433]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013382.9771645-1418-19303973362330/.source.json _original_basename=.lzwstypq follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:29:44 localhost python3.9[130525]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:29:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41516 DF PROTO=TCP SPT=52664 DPT=9101 SEQ=2057404460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD31880000000001030307) Dec 6 04:29:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5175 DF PROTO=TCP SPT=54916 
DPT=9105 SEQ=236907706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD33D50000000001030307) Dec 6 04:29:47 localhost python3.9[130782]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False Dec 6 04:29:48 localhost python3.9[130874]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Dec 6 04:29:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5177 DF PROTO=TCP SPT=54916 DPT=9105 SEQ=236907706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD3FC70000000001030307) Dec 6 04:29:49 localhost python3.9[130966]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 6 04:29:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5178 DF PROTO=TCP SPT=54916 DPT=9105 SEQ=236907706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD4F870000000001030307) Dec 6 04:29:53 localhost python3[131084]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Dec 6 04:29:53 localhost python3[131084]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c",#012 "Digest": "sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"#012 ],#012 "RepoDigests": [#012 
"quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-01T06:38:47.246477714Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 345722821,#012 "VirtualSize": 345722821,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 
"sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",#012 "sha256:ba9362d2aeb297e34b0679b2fc8168350c70a5b0ec414daf293bf2bc013e9088",#012 "sha256:aae3b8a85314314b9db80a043fdf3f3b1d0b69927faca0303c73969a23dddd0f"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-01T06:09:28.025707917Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025744608Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025767729Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": 
true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025791379Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.02581523Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025867611Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.469442331Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:10:02.029095017Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:10:05.672474685Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-l Dec 6 04:29:53 localhost podman[131131]: 2025-12-06 09:29:53.925040703 +0000 UTC m=+0.073648162 container remove 2e3f1281c5ef7bbf567fefddb3ce24d8526994f52b3df6b7c58660e223635120 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/pki/tls/certs/ovn_controller.crt:/etc/pki/tls/certs/ovn_controller.crt', '/etc/pki/tls/private/ovn_controller.key:/etc/pki/tls/private/ovn_controller.key']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12) Dec 6 04:29:53 localhost python3[131084]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller Dec 6 04:29:54 localhost podman[131143]: Dec 6 04:29:54 localhost podman[131143]: 2025-12-06 09:29:54.038985032 +0000 UTC m=+0.097283124 container create da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 04:29:54 localhost 
podman[131143]: 2025-12-06 09:29:53.979468859 +0000 UTC m=+0.037767031 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Dec 6 04:29:54 localhost python3[131084]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume 
/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Dec 6 04:29:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24969 DF PROTO=TCP SPT=53384 DPT=9882 SEQ=601353872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD57870000000001030307) Dec 6 04:29:55 localhost python3.9[131269]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:29:56 localhost python3.9[131363]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:29:57 localhost python3.9[131409]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:29:57 localhost python3.9[131500]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765013397.3951306-1682-232401890849810/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False 
_original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:29:58 localhost python3.9[131546]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:29:58 localhost systemd[1]: Reloading. Dec 6 04:29:58 localhost systemd-rc-local-generator[131574]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:29:58 localhost systemd-sysv-generator[131577]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:29:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:29:59 localhost python3.9[131628]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:29:59 localhost systemd[1]: Reloading. Dec 6 04:29:59 localhost systemd-rc-local-generator[131653]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:29:59 localhost systemd-sysv-generator[131656]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:29:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:29:59 localhost systemd[1]: Starting ovn_controller container... 
Dec 6 04:29:59 localhost systemd[1]: Started libcrun container. Dec 6 04:29:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ac91a182dea8792257b83e46114031ded18b602c492920e3e2aed2fc342d8/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Dec 6 04:29:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:29:59 localhost podman[131670]: 2025-12-06 09:29:59.84351537 +0000 UTC m=+0.143389981 container init da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3) Dec 6 04:29:59 localhost ovn_controller[131684]: + sudo -E kolla_set_configs Dec 6 04:29:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:29:59 localhost podman[131670]: 2025-12-06 09:29:59.880451394 +0000 UTC m=+0.180325955 container start da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:29:59 localhost edpm-start-podman-container[131670]: ovn_controller Dec 6 04:29:59 localhost systemd[1]: Created slice User Slice of UID 0. Dec 6 04:29:59 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Dec 6 04:29:59 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Dec 6 04:29:59 localhost systemd[1]: Starting User Manager for UID 0... Dec 6 04:29:59 localhost podman[131691]: 2025-12-06 09:29:59.954332573 +0000 UTC m=+0.072163996 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller) Dec 6 04:29:59 localhost podman[131691]: 2025-12-06 09:29:59.967301564 +0000 UTC m=+0.085133007 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 04:29:59 localhost podman[131691]: unhealthy 
Dec 6 04:29:59 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:29:59 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Failed with result 'exit-code'. Dec 6 04:30:00 localhost edpm-start-podman-container[131669]: Creating additional drop-in dependency for "ovn_controller" (da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436) Dec 6 04:30:00 localhost systemd[1]: Reloading. Dec 6 04:30:00 localhost systemd[131715]: Queued start job for default target Main User Target. Dec 6 04:30:00 localhost systemd[131715]: Created slice User Application Slice. Dec 6 04:30:00 localhost systemd[131715]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Dec 6 04:30:00 localhost systemd[131715]: Started Daily Cleanup of User's Temporary Directories. Dec 6 04:30:00 localhost systemd[131715]: Reached target Paths. Dec 6 04:30:00 localhost systemd[131715]: Reached target Timers. Dec 6 04:30:00 localhost systemd[131715]: Starting D-Bus User Message Bus Socket... Dec 6 04:30:00 localhost systemd[131715]: Starting Create User's Volatile Files and Directories... Dec 6 04:30:00 localhost systemd[131715]: Finished Create User's Volatile Files and Directories. Dec 6 04:30:00 localhost systemd[131715]: Listening on D-Bus User Message Bus Socket. Dec 6 04:30:00 localhost systemd[131715]: Reached target Sockets. Dec 6 04:30:00 localhost systemd[131715]: Reached target Basic System. Dec 6 04:30:00 localhost systemd[131715]: Reached target Main User Target. Dec 6 04:30:00 localhost systemd[131715]: Startup finished in 110ms. Dec 6 04:30:00 localhost systemd-rc-local-generator[131769]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 04:30:00 localhost systemd-sysv-generator[131775]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:30:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:30:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53524 DF PROTO=TCP SPT=59344 DPT=9101 SEQ=4190135346 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD6AA70000000001030307) Dec 6 04:30:00 localhost systemd[1]: Started User Manager for UID 0. Dec 6 04:30:00 localhost systemd[1]: Started ovn_controller container. Dec 6 04:30:00 localhost systemd[1]: Started Session c13 of User root. Dec 6 04:30:00 localhost ovn_controller[131684]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 6 04:30:00 localhost ovn_controller[131684]: INFO:__main__:Validating config file Dec 6 04:30:00 localhost ovn_controller[131684]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 6 04:30:00 localhost ovn_controller[131684]: INFO:__main__:Writing out command to execute Dec 6 04:30:00 localhost systemd[1]: session-c13.scope: Deactivated successfully. 
Dec 6 04:30:00 localhost ovn_controller[131684]: ++ cat /run_command Dec 6 04:30:00 localhost ovn_controller[131684]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt ' Dec 6 04:30:00 localhost ovn_controller[131684]: + ARGS= Dec 6 04:30:00 localhost ovn_controller[131684]: + sudo kolla_copy_cacerts Dec 6 04:30:00 localhost systemd[1]: Started Session c14 of User root. Dec 6 04:30:00 localhost systemd[1]: session-c14.scope: Deactivated successfully. Dec 6 04:30:00 localhost ovn_controller[131684]: + [[ ! -n '' ]] Dec 6 04:30:00 localhost ovn_controller[131684]: + . kolla_extend_start Dec 6 04:30:00 localhost ovn_controller[131684]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt ' Dec 6 04:30:00 localhost ovn_controller[131684]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\''' Dec 6 04:30:00 localhost ovn_controller[131684]: + umask 0022 Dec 6 04:30:00 localhost ovn_controller[131684]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting... 
Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8] Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00004|main|INFO|OVS IDL reconnected, force recompute. Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting... Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00006|main|INFO|OVNSB IDL reconnected, force recompute. Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00011|features|INFO|OVS Feature: ct_flush, state: supported Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting... Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00013|main|INFO|OVS feature set changed, force recompute. Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... 
Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00017|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms) Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00018|main|INFO|OVS OpenFlow connection reconnected,force recompute. Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00021|main|INFO|OVS feature set changed, force recompute. Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-d3c7df-0 Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-ded858-0 Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-719bf6-0 Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4 Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00026|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00027|binding|INFO|Claiming lport 227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 for this chassis. 
Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00028|binding|INFO|227fe5b2-a5a7-4043-b641-32b6e7c7a7c1: Claiming fa:16:3e:91:02:64 192.168.0.189 Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00029|binding|INFO|Removing lport 227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 ovn-installed in OVS Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00030|ovn_bfd|INFO|Enabled BFD on interface ovn-d3c7df-0 Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-ded858-0 Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-719bf6-0 Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00033|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00034|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis 
(sb_readonly=0) Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00035|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00036|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00037|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:30:00 localhost ovn_controller[131684]: 2025-12-06T09:30:00Z|00038|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:30:01 localhost python3.9[131885]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:30:01 localhost ovs-vsctl[131886]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload Dec 6 04:30:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53525 DF PROTO=TCP SPT=59344 DPT=9101 SEQ=4190135346 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD6EC70000000001030307) Dec 6 04:30:01 localhost ovn_controller[131684]: 2025-12-06T09:30:01Z|00039|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:30:01 localhost python3.9[131978]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . 
external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:30:01 localhost ovs-vsctl[131980]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids Dec 6 04:30:02 localhost ovn_controller[131684]: 2025-12-06T09:30:02Z|00040|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:30:02 localhost ovn_controller[131684]: 2025-12-06T09:30:02Z|00041|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:30:02 localhost python3.9[132073]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:30:02 localhost ovs-vsctl[132074]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options Dec 6 04:30:03 localhost systemd[1]: session-35.scope: Deactivated successfully. Dec 6 04:30:03 localhost systemd[1]: session-35.scope: Consumed 40.316s CPU time. Dec 6 04:30:03 localhost systemd-logind[760]: Session 35 logged out. Waiting for processes to exit. Dec 6 04:30:03 localhost systemd-logind[760]: Removed session 35. 
Dec 6 04:30:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64008 DF PROTO=TCP SPT=52086 DPT=9100 SEQ=3320082136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD7D870000000001030307) Dec 6 04:30:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53527 DF PROTO=TCP SPT=59344 DPT=9101 SEQ=4190135346 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD86870000000001030307) Dec 6 04:30:08 localhost ovn_controller[131684]: 2025-12-06T09:30:08Z|00042|binding|INFO|Setting lport 227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 ovn-installed in OVS Dec 6 04:30:08 localhost ovn_controller[131684]: 2025-12-06T09:30:08Z|00043|binding|INFO|Setting lport 227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 up in Southbound Dec 6 04:30:09 localhost sshd[132089]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:30:09 localhost systemd-logind[760]: New session 37 of user zuul. Dec 6 04:30:09 localhost systemd[1]: Started Session 37 of User zuul. Dec 6 04:30:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30007 DF PROTO=TCP SPT=48774 DPT=9100 SEQ=3248353749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAD91C80000000001030307) Dec 6 04:30:10 localhost systemd[1]: Stopping User Manager for UID 0... Dec 6 04:30:10 localhost systemd[131715]: Activating special unit Exit the Session... Dec 6 04:30:10 localhost systemd[131715]: Stopped target Main User Target. Dec 6 04:30:10 localhost systemd[131715]: Stopped target Basic System. Dec 6 04:30:10 localhost systemd[131715]: Stopped target Paths. Dec 6 04:30:10 localhost systemd[131715]: Stopped target Sockets. 
Dec 6 04:30:10 localhost systemd[131715]: Stopped target Timers. Dec 6 04:30:10 localhost systemd[131715]: Stopped Daily Cleanup of User's Temporary Directories. Dec 6 04:30:10 localhost systemd[131715]: Closed D-Bus User Message Bus Socket. Dec 6 04:30:10 localhost systemd[131715]: Stopped Create User's Volatile Files and Directories. Dec 6 04:30:10 localhost systemd[131715]: Removed slice User Application Slice. Dec 6 04:30:10 localhost systemd[131715]: Reached target Shutdown. Dec 6 04:30:10 localhost systemd[131715]: Finished Exit the Session. Dec 6 04:30:10 localhost systemd[131715]: Reached target Exit the Session. Dec 6 04:30:10 localhost systemd[1]: user@0.service: Deactivated successfully. Dec 6 04:30:10 localhost systemd[1]: Stopped User Manager for UID 0. Dec 6 04:30:10 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Dec 6 04:30:10 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Dec 6 04:30:10 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Dec 6 04:30:10 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Dec 6 04:30:10 localhost systemd[1]: Removed slice User Slice of UID 0. 
Dec 6 04:30:10 localhost python3.9[132182]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 6 04:30:11 localhost ovn_controller[131684]: 2025-12-06T09:30:11Z|00044|memory|INFO|18792 kB peak resident set size after 10.9 seconds Dec 6 04:30:11 localhost ovn_controller[131684]: 2025-12-06T09:30:11Z|00045|memory|INFO|idl-cells-OVN_Southbound:4081 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:76 lflow-cache-entries-cache-matches:199 lflow-cache-size-KB:290 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:157 ofctrl_installed_flow_usage-KB:114 ofctrl_sb_flow_ref_usage-KB:68 Dec 6 04:30:11 localhost python3.9[132280]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:12 localhost python3.9[132372]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:12 localhost python3.9[132464]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:13 localhost python3.9[132556]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:14 localhost python3.9[132648]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:14 localhost python3.9[132738]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 6 04:30:15 localhost python3.9[132830]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False Dec 6 04:30:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53528 DF PROTO=TCP SPT=59344 DPT=9101 SEQ=4190135346 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DADA7880000000001030307) Dec 6 04:30:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=5900 DF PROTO=TCP SPT=34834 DPT=9105 SEQ=2037202156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DADA9050000000001030307) Dec 6 04:30:16 localhost python3.9[132920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:17 localhost python3.9[132993]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013415.9690838-218-1266339461748/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:18 localhost python3.9[133084]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5902 DF PROTO=TCP SPT=34834 DPT=9105 SEQ=2037202156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DADB5070000000001030307) Dec 6 04:30:19 localhost python3.9[133157]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013417.8940213-263-109384219303317/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False 
unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:20 localhost python3.9[133249]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 6 04:30:21 localhost python3.9[133303]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 6 04:30:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5903 DF PROTO=TCP SPT=34834 DPT=9105 SEQ=2037202156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DADC4C70000000001030307) Dec 6 04:30:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59304 DF PROTO=TCP SPT=33750 DPT=9102 SEQ=1228625756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DADCD870000000001030307) Dec 6 04:30:25 localhost python3.9[133397]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Dec 6 04:30:26 localhost python3.9[133490]: ansible-ansible.legacy.stat Invoked with 
path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:27 localhost python3.9[133561]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013426.017839-374-156232193341594/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:27 localhost python3.9[133651]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:28 localhost python3.9[133722]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013427.2125082-374-165553206171368/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:29 localhost python3.9[133812]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False 
get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:29 localhost python3.9[133883]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013428.878994-506-158430377699289/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52632 DF PROTO=TCP SPT=48622 DPT=9101 SEQ=1825126643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DADDFD70000000001030307) Dec 6 04:30:30 localhost python3.9[133973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:30:30 localhost podman[133974]: 2025-12-06 09:30:30.426396062 +0000 UTC m=+0.079915620 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 04:30:30 localhost podman[133974]: 2025-12-06 09:30:30.468076248 +0000 UTC m=+0.121595836 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller) Dec 6 04:30:30 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:30:30 localhost python3.9[134069]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013429.901425-506-71355936467549/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52633 DF PROTO=TCP SPT=48622 DPT=9101 SEQ=1825126643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DADE3C80000000001030307) Dec 6 04:30:31 localhost python3.9[134159]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:30:32 localhost python3.9[134253]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:33 localhost python3.9[134345]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:33 localhost sshd[134394]: main: sshd: 
ssh-rsa algorithm is disabled Dec 6 04:30:33 localhost python3.9[134393]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41518 DF PROTO=TCP SPT=52664 DPT=9101 SEQ=2057404460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DADEF870000000001030307) Dec 6 04:30:34 localhost python3.9[134487]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:34 localhost python3.9[134535]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:35 localhost python3.9[134627]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:30:36 localhost python3.9[134719]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64009 DF PROTO=TCP SPT=52086 DPT=9100 SEQ=3320082136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DADFB870000000001030307) Dec 6 04:30:37 localhost python3.9[134767]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:30:37 localhost python3.9[134859]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:38 localhost ovn_controller[131684]: 2025-12-06T09:30:38Z|00046|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory Dec 6 04:30:38 localhost python3.9[134907]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset 
_original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:30:40 localhost python3.9[134999]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:30:40 localhost systemd[1]: Reloading. Dec 6 04:30:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19979 DF PROTO=TCP SPT=56182 DPT=9100 SEQ=906172445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE06C70000000001030307) Dec 6 04:30:40 localhost systemd-sysv-generator[135024]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:30:40 localhost systemd-rc-local-generator[135021]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:30:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:30:42 localhost python3.9[135129]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:42 localhost python3.9[135177]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:30:43 localhost python3.9[135269]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:43 localhost python3.9[135317]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:30:44 localhost python3.9[135409]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:30:44 localhost systemd[1]: Reloading. 
Dec 6 04:30:44 localhost systemd-sysv-generator[135437]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:30:44 localhost systemd-rc-local-generator[135434]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:30:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:30:44 localhost systemd[1]: Starting dnf makecache... Dec 6 04:30:44 localhost systemd[1]: Starting Create netns directory... Dec 6 04:30:44 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 6 04:30:44 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 6 04:30:44 localhost systemd[1]: Finished Create netns directory. Dec 6 04:30:44 localhost dnf[135447]: Updating Subscription Management repositories. 
Dec 6 04:30:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52636 DF PROTO=TCP SPT=48622 DPT=9101 SEQ=1825126643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE1B870000000001030307) Dec 6 04:30:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48452 DF PROTO=TCP SPT=33210 DPT=9105 SEQ=247397639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE1E350000000001030307) Dec 6 04:30:46 localhost python3.9[135545]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:46 localhost dnf[135447]: Metadata cache refreshed recently. Dec 6 04:30:46 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Dec 6 04:30:46 localhost systemd[1]: Finished dnf makecache. Dec 6 04:30:46 localhost systemd[1]: dnf-makecache.service: Consumed 1.992s CPU time. 
Dec 6 04:30:47 localhost python3.9[135637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:47 localhost python3.9[135711]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013446.6153903-959-114765255693890/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:48 localhost python3.9[135803]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:30:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48454 DF PROTO=TCP SPT=33210 DPT=9105 SEQ=247397639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE2A480000000001030307) Dec 6 04:30:49 localhost python3.9[135895]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:30:50 localhost python3.9[135970]: ansible-ansible.legacy.copy 
Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013449.1017-1034-91687017103027/.source.json _original_basename=.wxc2ashu follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:30:51 localhost python3.9[136062]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:30:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18294 DF PROTO=TCP SPT=37446 DPT=9882 SEQ=939165325 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE3A000000000001030307) Dec 6 04:30:53 localhost python3.9[136319]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False Dec 6 04:30:54 localhost python3.9[136411]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Dec 6 04:30:55 localhost python3.9[136503]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 6 04:30:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=54938 DF PROTO=TCP SPT=47464 DPT=9882 SEQ=1178515044 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE41870000000001030307) Dec 6 04:30:59 localhost python3[136621]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Dec 6 04:30:59 localhost python3[136621]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9",#012 "Digest": "sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-01T06:29:20.327314945Z",#012 "Config": {#012 "User": "neutron",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 784141054,#012 "VirtualSize": 
784141054,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53/diff:/var/lib/containers/storage/overlay/2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",#012 "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",#012 "sha256:75abaaa40a93c0e2bba524b6f8d4eb5f1c4c9a33db70c892c7582ec5b0827e5e",#012 "sha256:01f43f620d1ea2a9e584abe0cc14c336bedcf55765127c000d743f536dd36f25",#012 "sha256:0bf5bd378602f28be423f5e84abddff3b103396fae3c167031b6e3fcfcf6f120"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "neutron",#012 "History": [#012 
{#012 "created": "2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-01T06:09:28.025707917Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025744608Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025767729Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025791379Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.02581523Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025867611Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.469442331Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:10:02.029095017Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main 
override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf Dec 6 04:30:59 localhost podman[136670]: 2025-12-06 09:30:59.497164925 +0000 UTC m=+0.088679841 container remove 2af2360232242b90ffba8227148de64156d2fe65b1670d09dde4032219bb7368 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '78ca993e795bb2768fe880e03926b595'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/etc/pki/tls/certs/ovn_metadata.crt:/etc/pki/tls/certs/ovn_metadata.crt', '/etc/pki/tls/private/ovn_metadata.key:/etc/pki/tls/private/ovn_metadata.key']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Dec 6 04:30:59 localhost python3[136621]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent Dec 6 04:30:59 localhost podman[136683]: Dec 6 04:30:59 localhost podman[136683]: 2025-12-06 
09:30:59.603377437 +0000 UTC m=+0.087338028 container create 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 04:30:59 localhost podman[136683]: 2025-12-06 09:30:59.559815182 +0000 UTC m=+0.043775823 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 6 04:30:59 localhost python3[136621]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 6 04:31:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10571 DF PROTO=TCP SPT=41628 DPT=9101 SEQ=3112103628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE55070000000001030307) Dec 6 04:31:00 localhost python3.9[136813]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in 
follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:31:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:31:01 localhost podman[136908]: 2025-12-06 09:31:01.140336275 +0000 UTC m=+0.090380636 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:31:01 
localhost podman[136908]: 2025-12-06 09:31:01.177248177 +0000 UTC m=+0.127292508 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0) Dec 6 04:31:01 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:31:01 localhost python3.9[136907]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10572 DF PROTO=TCP SPT=41628 DPT=9101 SEQ=3112103628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE59070000000001030307) Dec 6 04:31:02 localhost python3.9[136977]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:31:03 localhost python3.9[137068]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765013462.3826702-1298-176203382255443/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:03 localhost python3.9[137114]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:31:03 localhost systemd[1]: Reloading. Dec 6 04:31:03 localhost systemd-sysv-generator[137140]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:31:03 localhost systemd-rc-local-generator[137137]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:31:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:31:04 localhost python3.9[137196]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:31:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53530 DF PROTO=TCP SPT=59344 DPT=9101 SEQ=4190135346 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE65880000000001030307) Dec 6 04:31:04 localhost systemd[1]: Reloading. Dec 6 04:31:04 localhost systemd-rc-local-generator[137223]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:31:04 localhost systemd-sysv-generator[137226]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:31:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:31:04 localhost systemd[1]: Starting ovn_metadata_agent container... Dec 6 04:31:04 localhost systemd[1]: tmp-crun.ZaN80X.mount: Deactivated successfully. Dec 6 04:31:04 localhost systemd[1]: Started libcrun container. 
Dec 6 04:31:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9897644607eff602831ae82a833b8826d7dc18407bf982dc8adb65104170d6b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Dec 6 04:31:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9897644607eff602831ae82a833b8826d7dc18407bf982dc8adb65104170d6b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 04:31:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:31:05 localhost podman[137238]: 2025-12-06 09:31:05.000172866 +0000 UTC m=+0.209026228 container init 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: + sudo -E kolla_set_configs Dec 6 04:31:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 04:31:05 localhost podman[137238]: 2025-12-06 09:31:05.03995665 +0000 UTC m=+0.248810002 container start 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:31:05 localhost edpm-start-podman-container[137238]: ovn_metadata_agent Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Validating config file Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Copying service configuration files Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Writing out command to execute Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron/external Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy 
Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/20509a6a-c438-4c5e-82a7-fe0ea272b309.pid.haproxy Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/20509a6a-c438-4c5e-82a7-fe0ea272b309.conf Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: ++ cat /run_command Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: + CMD=neutron-ovn-metadata-agent Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: + ARGS= Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: + sudo kolla_copy_cacerts Dec 6 04:31:05 localhost podman[137261]: 2025-12-06 09:31:05.116326276 +0000 UTC m=+0.072209684 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: + [[ ! -n '' ]] Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: + . 
kolla_extend_start Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: Running command: 'neutron-ovn-metadata-agent' Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\''' Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: + umask 0022 Dec 6 04:31:05 localhost ovn_metadata_agent[137254]: + exec neutron-ovn-metadata-agent Dec 6 04:31:05 localhost podman[137261]: 2025-12-06 09:31:05.202522998 +0000 UTC m=+0.158406436 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 04:31:05 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:31:05 localhost edpm-start-podman-container[137237]: Creating additional drop-in dependency for "ovn_metadata_agent" (34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3) Dec 6 04:31:05 localhost systemd[1]: Reloading. Dec 6 04:31:05 localhost systemd-sysv-generator[137326]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:31:05 localhost systemd-rc-local-generator[137323]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:31:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:31:05 localhost systemd[1]: Started ovn_metadata_agent container. Dec 6 04:31:06 localhost systemd[1]: session-37.scope: Deactivated successfully. Dec 6 04:31:06 localhost systemd[1]: session-37.scope: Consumed 31.429s CPU time. Dec 6 04:31:06 localhost systemd-logind[760]: Session 37 logged out. Waiting for processes to exit. Dec 6 04:31:06 localhost systemd-logind[760]: Removed session 37. 
Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.621 137259 INFO neutron.common.config [-] Logging enabled!#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.622 137259 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.622 137259 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.622 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.622 137259 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.622 137259 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.622 137259 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.622 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.623 137259 DEBUG neutron.agent.ovn.metadata_agent [-] 
agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.623 137259 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.623 137259 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.623 137259 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.623 137259 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.623 137259 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.623 137259 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.623 137259 DEBUG neutron.agent.ovn.metadata_agent [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.623 137259 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 
2025-12-06 09:31:06.623 137259 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.624 137259 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.624 137259 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.624 137259 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.624 137259 DEBUG neutron.agent.ovn.metadata_agent [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.624 137259 DEBUG neutron.agent.ovn.metadata_agent [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.624 137259 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.624 137259 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.624 137259 DEBUG neutron.agent.ovn.metadata_agent [-] debug = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.624 137259 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.624 137259 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.624 137259 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.625 137259 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.625 137259 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.625 137259 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain = openstacklocal 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.625 137259 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.625 137259 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.625 137259 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.625 137259 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.625 137259 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.625 137259 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.626 137259 DEBUG neutron.agent.ovn.metadata_agent [-] host = np0005548798.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.626 137259 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost 
ovn_metadata_agent[137254]: 2025-12-06 09:31:06.626 137259 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.626 137259 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.626 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.626 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.626 137259 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.626 137259 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.626 137259 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.626 137259 DEBUG neutron.agent.ovn.metadata_agent [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.626 137259 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval 
= 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.627 137259 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.627 137259 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.627 137259 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.627 137259 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.627 137259 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.627 137259 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.627 137259 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format = %(user)s 
%(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.627 137259 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.627 137259 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.628 137259 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.628 137259 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.628 137259 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.628 137259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.628 137259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.628 137259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.628 137259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.628 137259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.628 137259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.628 137259 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.629 137259 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.629 137259 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.629 137259 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.629 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 
localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.629 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.629 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.629 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.629 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.629 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.629 137259 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.630 137259 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.630 137259 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.630 137259 DEBUG 
neutron.agent.ovn.metadata_agent [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.630 137259 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.630 137259 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.630 137259 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.630 137259 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.630 137259 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.630 137259 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.630 137259 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.631 137259 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.631 137259 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.631 137259 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.631 137259 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.631 137259 DEBUG neutron.agent.ovn.metadata_agent [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.631 137259 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.631 137259 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.631 137259 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.631 137259 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.631 137259 DEBUG 
neutron.agent.ovn.metadata_agent [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.631 137259 DEBUG neutron.agent.ovn.metadata_agent [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.632 137259 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.632 137259 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.632 137259 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.632 137259 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.632 137259 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.632 137259 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.632 137259 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost 
ovn_metadata_agent[137254]: 2025-12-06 09:31:06.632 137259 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.632 137259 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.632 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.632 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.633 137259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.633 137259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.633 137259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.633 137259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 
6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.633 137259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.633 137259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.633 137259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.633 137259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.633 137259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.633 137259 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.634 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.634 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 
09:31:06.634 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.634 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.634 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.634 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.634 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.634 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.634 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.634 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost 
ovn_metadata_agent[137254]: 2025-12-06 09:31:06.635 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.635 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.635 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.635 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.635 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.635 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.635 137259 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.635 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.635 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.635 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.636 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.636 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.636 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.636 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.636 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.636 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 
localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.636 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.636 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.636 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.636 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.637 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.637 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.637 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.637 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost 
ovn_metadata_agent[137254]: 2025-12-06 09:31:06.637 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.637 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.637 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.637 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.637 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.637 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.638 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.638 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 
09:31:06.638 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.638 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.638 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.638 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.638 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.638 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.638 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.638 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.639 137259 DEBUG 
neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.639 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.639 137259 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.639 137259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.639 137259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.639 137259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.639 137259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.639 137259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.639 137259 DEBUG 
neutron.agent.ovn.metadata_agent [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.639 137259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.640 137259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.640 137259 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.640 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.640 137259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.640 137259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.640 137259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.640 
137259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.640 137259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.640 137259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.640 137259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.640 137259 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.641 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.641 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.641 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.641 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.641 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.641 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.641 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.641 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.641 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.641 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.642 137259 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.642 137259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.642 
137259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.642 137259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.642 137259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.642 137259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.642 137259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.642 137259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.642 137259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.642 137259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.643 137259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.643 137259 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.643 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.643 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.643 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.643 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.643 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.643 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.643 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 
09:31:06.643 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.644 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.644 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.644 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.644 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.644 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.644 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.644 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.644 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.644 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.644 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.644 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.645 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.645 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.645 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.645 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.645 137259 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost 
ovn_metadata_agent[137254]: 2025-12-06 09:31:06.645 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.645 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.645 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.645 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.645 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.646 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.646 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.646 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.646 
137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.646 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.646 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.646 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.646 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.646 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.646 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.647 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.647 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert = /etc/pki/tls/certs/ovndbca.crt 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.647 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.647 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.647 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.647 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.647 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.647 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.647 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.647 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir = /var/run/openvswitch 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.648 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.648 137259 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.648 137259 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.648 137259 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.648 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.648 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.648 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.648 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.649 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.649 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.649 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.649 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.649 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.649 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.649 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.649 137259 DEBUG neutron.agent.ovn.metadata_agent [-] 
oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.649 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.649 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.650 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.650 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.650 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.650 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.650 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost 
ovn_metadata_agent[137254]: 2025-12-06 09:31:06.650 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.650 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.650 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.650 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.650 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.651 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.651 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.651 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.651 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.651 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.651 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.651 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.651 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.651 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.651 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.652 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.652 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.652 137259 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.652 137259 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.661 137259 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.661 137259 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.661 137259 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.661 137259 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.661 137259 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m Dec 6 04:31:06 localhost 
ovn_metadata_agent[137254]: 2025-12-06 09:31:06.677 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name a1cf5a35-de45-4f36-ac91-02296203a661 (UUID: a1cf5a35-de45-4f36-ac91-02296203a661) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.698 137259 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.698 137259 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.698 137259 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.698 137259 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.701 137259 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.709 137259 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.717 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:02:64 192.168.0.189'], 
port_security=['fa:16:3e:91:02:64 192.168.0.189'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.189/24', 'neutron:device_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005548798.ooo.test', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '47835b89168945138751a4b216280589', 'neutron:revision_number': '7', 'neutron:security_group_ids': '2bda9e92-c0a1-4c1d-90ae-f2e7495954f8 db4a6c1e-fda3-423f-866c-b4772bef83b6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=66aef1d5-ef14-49e3-b4b5-f1e89f0f9ee8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.718 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'a1cf5a35-de45-4f36-ac91-02296203a661'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[], external_ids={'neutron:ovn-metadata-id': 'fece9e00-8023-5a55-9474-efd3dfc1603e', 'neutron:ovn-metadata-sb-cfg': '1'}, name=a1cf5a35-de45-4f36-ac91-02296203a661, nb_cfg_timestamp=1765013410325, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.719 137259 INFO 
neutron.agent.ovn.metadata.agent [-] Port 227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 in datapath 20509a6a-c438-4c5e-82a7-fe0ea272b309 bound to our chassis on insert#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.719 137259 DEBUG neutron_lib.callbacks.manager [-] Subscribe: > process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.720 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.720 137259 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.720 137259 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.720 137259 INFO oslo_service.service [-] Starting 1 workers#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.723 137259 DEBUG oslo_service.service [-] Started child 137355 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.725 137355 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-479389'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.725 137259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 
20509a6a-c438-4c5e-82a7-fe0ea272b309#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.727 137259 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpod7xcmi2/privsep.sock']#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.747 137355 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.747 137355 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.747 137355 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.752 137355 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.758 137355 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Dec 6 04:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:06.765 137355 INFO eventlet.wsgi.server [-] (137355) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m Dec 6 04:31:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10574 DF PROTO=TCP SPT=41628 DPT=9101 SEQ=3112103628 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE70C70000000001030307) Dec 6 04:31:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:07.343 137259 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Dec 6 04:31:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:07.344 137259 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpod7xcmi2/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Dec 6 04:31:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:07.228 137360 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 6 04:31:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:07.232 137360 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 6 04:31:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:07.234 137360 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Dec 6 04:31:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:07.234 137360 INFO oslo.privsep.daemon [-] privsep daemon running as pid 137360#033[00m Dec 6 04:31:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:07.346 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[75a8950b-ffbf-4735-9181-17f9345cd777]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:31:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:07.758 137360 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:31:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:07.758 137360 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by 
"neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:31:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:07.758 137360 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:31:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:08.234 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[a2b64ecf-3d63-4509-a8a6-7fa44902c619]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:31:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:08.236 137259 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp1ev86fyu/privsep.sock']#033[00m Dec 6 04:31:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:08.818 137259 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Dec 6 04:31:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:08.819 137259 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp1ev86fyu/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Dec 6 04:31:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:08.710 137371 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 6 04:31:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:08.716 137371 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 6 04:31:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:08.719 137371 INFO oslo.privsep.daemon [-] privsep process running with capabilities 
(eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Dec 6 04:31:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:08.720 137371 INFO oslo.privsep.daemon [-] privsep daemon running as pid 137371#033[00m Dec 6 04:31:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:08.822 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[525640a2-a0bc-46e4-af2f-767568568909]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.264 137371 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.264 137371 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.264 137371 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.761 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[f35901b2-06db-4c54-8a92-463de89a4199]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.764 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[34f785d9-8af6-43bd-acac-0daee579223f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.784 137371 DEBUG oslo.privsep.daemon 
[-] privsep: reply[7495b2a1-b865-4272-832a-1167e9de5336]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.802 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[e5299be1-e5f2-4f1c-a842-7f9d489964fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20509a6a-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:3b:0a:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 83, 'rx_bytes': 8926, 'tx_bytes': 8133, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 83, 'rx_bytes': 8926, 'tx_bytes': 8133, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 
'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 670919, 'reachable_time': 20872, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 
'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 137381, 'error': None, 'target': 'ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.819 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[87199036-932a-4c9d-be0b-e68ede5447b7]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap20509a6a-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 670923, 'tstamp': 670923}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 137382, 'error': None, 'target': 'ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap20509a6a-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 
4294967295, 'cstamp': 670926, 'tstamp': 670926}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 137382, 'error': None, 'target': 'ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 670925, 'tstamp': 670925}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 137382, 'error': None, 'target': 'ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3b:a81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 670919, 'tstamp': 670919}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 137382, 'error': None, 'target': 'ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.870 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[9b922b98-781d-4aad-9283-aa8f1e8f8aab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.872 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20509a6a-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.882 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn 
n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20509a6a-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.882 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.883 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20509a6a-c0, col_values=(('external_ids', {'iface-id': 'dc760542-e03f-4d48-a573-fabb89636a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.884 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 04:31:09 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:09.888 137259 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmprh5kzaay/privsep.sock']#033[00m Dec 6 04:31:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5147 DF PROTO=TCP SPT=41712 DPT=9100 SEQ=2861710797 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE7C080000000001030307) Dec 6 04:31:10 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:10.465 137259 INFO oslo.privsep.daemon [-] Spawned new 
privsep daemon via rootwrap#033[00m Dec 6 04:31:10 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:10.466 137259 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmprh5kzaay/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Dec 6 04:31:10 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:10.363 137391 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 6 04:31:10 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:10.369 137391 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 6 04:31:10 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:10.372 137391 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Dec 6 04:31:10 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:10.372 137391 INFO oslo.privsep.daemon [-] privsep daemon running as pid 137391#033[00m Dec 6 04:31:10 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:10.470 137391 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3a43cf-1794-4245-8524-3b7eb4fca057]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:31:10 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:10.919 137391 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:31:10 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:10.919 137391 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:31:10 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:10.919 137391 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by 
"neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.392 137391 DEBUG oslo.privsep.daemon [-] privsep: reply[1c9f4137-4d34-4604-92be-ae012b01a151]: (4, ['ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.396 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, column=external_ids, values=({'neutron:ovn-metadata-id': 'fece9e00-8023-5a55-9474-efd3dfc1603e'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.397 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.398 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.413 137259 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.413 137259 DEBUG oslo_service.service [-] ******************************************************************************** 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.413 137259 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.414 137259 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.414 137259 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.415 137259 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.415 137259 DEBUG oslo_service.service [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.416 137259 DEBUG oslo_service.service [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.416 137259 DEBUG oslo_service.service [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.417 137259 DEBUG oslo_service.service [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.417 
137259 DEBUG oslo_service.service [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.418 137259 DEBUG oslo_service.service [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.418 137259 DEBUG oslo_service.service [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.418 137259 DEBUG oslo_service.service [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.419 137259 DEBUG oslo_service.service [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.419 137259 DEBUG oslo_service.service [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.420 137259 DEBUG oslo_service.service [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.420 137259 DEBUG oslo_service.service [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.421 137259 DEBUG oslo_service.service [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.421 137259 DEBUG oslo_service.service 
[-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.421 137259 DEBUG oslo_service.service [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.422 137259 DEBUG oslo_service.service [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.423 137259 DEBUG oslo_service.service [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.423 137259 DEBUG oslo_service.service [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.423 137259 DEBUG oslo_service.service [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.424 137259 DEBUG oslo_service.service [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.425 137259 DEBUG oslo_service.service [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.425 137259 DEBUG oslo_service.service [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.426 137259 DEBUG oslo_service.service [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.426 137259 DEBUG oslo_service.service [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.426 137259 DEBUG oslo_service.service [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.427 137259 DEBUG oslo_service.service [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.427 137259 DEBUG oslo_service.service [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.428 137259 DEBUG oslo_service.service [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.428 137259 DEBUG oslo_service.service [-] 
filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.428 137259 DEBUG oslo_service.service [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.429 137259 DEBUG oslo_service.service [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.429 137259 DEBUG oslo_service.service [-] host = np0005548798.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.430 137259 DEBUG oslo_service.service [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.430 137259 DEBUG oslo_service.service [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.431 137259 DEBUG oslo_service.service [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.431 137259 DEBUG oslo_service.service [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.431 137259 DEBUG oslo_service.service [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.432 137259 DEBUG 
oslo_service.service [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.432 137259 DEBUG oslo_service.service [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.433 137259 DEBUG oslo_service.service [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.433 137259 DEBUG oslo_service.service [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.433 137259 DEBUG oslo_service.service [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.434 137259 DEBUG oslo_service.service [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.434 137259 DEBUG oslo_service.service [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.435 137259 DEBUG oslo_service.service [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.435 137259 DEBUG oslo_service.service [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.435 137259 DEBUG oslo_service.service [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.436 137259 DEBUG oslo_service.service [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.436 137259 DEBUG oslo_service.service [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.437 137259 DEBUG oslo_service.service [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.437 137259 DEBUG oslo_service.service [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.438 137259 DEBUG oslo_service.service [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.438 137259 DEBUG oslo_service.service [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.438 137259 DEBUG 
oslo_service.service [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.439 137259 DEBUG oslo_service.service [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.439 137259 DEBUG oslo_service.service [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.440 137259 DEBUG oslo_service.service [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.440 137259 DEBUG oslo_service.service [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.440 137259 DEBUG oslo_service.service [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.441 137259 DEBUG oslo_service.service [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.441 137259 DEBUG oslo_service.service [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.442 137259 DEBUG oslo_service.service [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 
2025-12-06 09:31:11.442 137259 DEBUG oslo_service.service [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.443 137259 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.443 137259 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.443 137259 DEBUG oslo_service.service [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.444 137259 DEBUG oslo_service.service [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.444 137259 DEBUG oslo_service.service [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.445 137259 DEBUG oslo_service.service [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.445 137259 DEBUG oslo_service.service [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.446 137259 DEBUG oslo_service.service [-] nova_metadata_protocol = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m 
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.446 137259 DEBUG oslo_service.service [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.446 137259 DEBUG oslo_service.service [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.447 137259 DEBUG oslo_service.service [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.447 137259 DEBUG oslo_service.service [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.448 137259 DEBUG oslo_service.service [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.448 137259 DEBUG oslo_service.service [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.448 137259 DEBUG oslo_service.service [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.449 137259 DEBUG oslo_service.service [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.449 137259 DEBUG oslo_service.service [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 
04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.450 137259 DEBUG oslo_service.service [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.450 137259 DEBUG oslo_service.service [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.450 137259 DEBUG oslo_service.service [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.451 137259 DEBUG oslo_service.service [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.451 137259 DEBUG oslo_service.service [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.452 137259 DEBUG oslo_service.service [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.452 137259 DEBUG oslo_service.service [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.453 137259 DEBUG oslo_service.service [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.453 137259 DEBUG oslo_service.service [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost 
ovn_metadata_agent[137254]: 2025-12-06 09:31:11.454 137259 DEBUG oslo_service.service [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.454 137259 DEBUG oslo_service.service [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.455 137259 DEBUG oslo_service.service [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.455 137259 DEBUG oslo_service.service [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.455 137259 DEBUG oslo_service.service [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.455 137259 DEBUG oslo_service.service [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.456 137259 DEBUG oslo_service.service [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.456 137259 DEBUG oslo_service.service [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.456 137259 DEBUG oslo_service.service [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.456 137259 
DEBUG oslo_service.service [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.456 137259 DEBUG oslo_service.service [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.457 137259 DEBUG oslo_service.service [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.457 137259 DEBUG oslo_service.service [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.457 137259 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.457 137259 DEBUG oslo_service.service [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.458 137259 DEBUG oslo_service.service [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.458 137259 DEBUG oslo_service.service [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.458 137259 DEBUG oslo_service.service [-] profiler.es_doc_type = notification log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.458 137259 DEBUG oslo_service.service [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.459 137259 DEBUG oslo_service.service [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.459 137259 DEBUG oslo_service.service [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.459 137259 DEBUG oslo_service.service [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.459 137259 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.460 137259 DEBUG oslo_service.service [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.460 137259 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.460 137259 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.460 137259 DEBUG 
oslo_service.service [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.461 137259 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.461 137259 DEBUG oslo_service.service [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.461 137259 DEBUG oslo_service.service [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.461 137259 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.462 137259 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.462 137259 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.462 137259 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.462 137259 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.462 137259 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.463 137259 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.463 137259 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.463 137259 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.463 137259 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.464 137259 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.464 137259 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.464 137259 DEBUG oslo_service.service [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.464 137259 DEBUG oslo_service.service [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.465 137259 DEBUG oslo_service.service [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.465 137259 DEBUG oslo_service.service [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.465 137259 DEBUG oslo_service.service [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.465 137259 DEBUG oslo_service.service [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.465 137259 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.466 137259 DEBUG oslo_service.service [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.466 137259 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.466 137259 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.466 137259 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.466 137259 DEBUG oslo_service.service [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.467 137259 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.467 137259 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.467 137259 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.467 137259 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.468 137259 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.468 137259 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.468 137259 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.468 137259 DEBUG oslo_service.service [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.468 137259 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.469 137259 DEBUG oslo_service.service [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.469 137259 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.469 137259 DEBUG oslo_service.service [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.469 137259 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.469 137259 DEBUG oslo_service.service [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.470 137259 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.470 137259 DEBUG oslo_service.service [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.470 137259 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.470 137259 DEBUG oslo_service.service [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.470 137259 DEBUG oslo_service.service [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.471 137259 DEBUG oslo_service.service [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.471 137259 DEBUG oslo_service.service [-] privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.471 137259 DEBUG oslo_service.service [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.471 137259 DEBUG oslo_service.service [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.471 137259 DEBUG oslo_service.service [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.472 137259 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.472 137259 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.472 137259 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.472 137259 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.472 137259 DEBUG oslo_service.service [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.473 137259 DEBUG oslo_service.service [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.473 137259 DEBUG oslo_service.service [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.473 137259 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.473 137259 DEBUG oslo_service.service [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.474 137259 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.474 137259 DEBUG oslo_service.service [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.474 137259 DEBUG oslo_service.service [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.474 137259 DEBUG oslo_service.service [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.474 137259 DEBUG oslo_service.service [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.475 137259 DEBUG oslo_service.service [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.475 137259 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.475 137259 DEBUG oslo_service.service [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.475 137259 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.476 137259 DEBUG oslo_service.service [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.476 137259 DEBUG oslo_service.service [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.476 137259 DEBUG oslo_service.service [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.476 137259 DEBUG oslo_service.service [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.476 137259 DEBUG oslo_service.service [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.477 137259 DEBUG oslo_service.service [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.477 137259 DEBUG oslo_service.service [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.477 137259 DEBUG oslo_service.service [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.477 137259 DEBUG oslo_service.service [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.478 137259 DEBUG oslo_service.service [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.478 137259 DEBUG oslo_service.service [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.478 137259 DEBUG oslo_service.service [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.478 137259 DEBUG oslo_service.service [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.478 137259 DEBUG oslo_service.service [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.479 137259 DEBUG oslo_service.service [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.479 137259 DEBUG oslo_service.service [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.479 137259 DEBUG oslo_service.service [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.479 137259 DEBUG oslo_service.service [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.480 137259 DEBUG oslo_service.service [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.480 137259 DEBUG oslo_service.service [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.480 137259 DEBUG oslo_service.service [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.480 137259 DEBUG oslo_service.service [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.480 137259 DEBUG oslo_service.service [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.481 137259 DEBUG oslo_service.service [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.481 137259 DEBUG oslo_service.service [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.481 137259 DEBUG oslo_service.service [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.481 137259 DEBUG oslo_service.service [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.481 137259 DEBUG oslo_service.service [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.482 137259 DEBUG oslo_service.service [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.482 137259 DEBUG oslo_service.service [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.482 137259 DEBUG oslo_service.service [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.482 137259 DEBUG oslo_service.service [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.482 137259 DEBUG oslo_service.service [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.483 137259 DEBUG oslo_service.service [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.483 137259 DEBUG oslo_service.service [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.483 137259 DEBUG oslo_service.service [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.483 137259 DEBUG oslo_service.service [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.484 137259 DEBUG oslo_service.service [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.484 137259 DEBUG oslo_service.service [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.484 137259 DEBUG oslo_service.service [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.484 137259 DEBUG oslo_service.service [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.484 137259 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.485 137259 DEBUG oslo_service.service [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.485 137259 DEBUG oslo_service.service [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.485 137259 DEBUG oslo_service.service [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost sshd[137396]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.485 137259 DEBUG oslo_service.service [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.485 137259 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.486 137259 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.486 137259 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.486 137259 DEBUG oslo_service.service [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.486 137259 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.487 137259 DEBUG oslo_service.service [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.487 137259 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.487 137259 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.488 137259 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.488 137259 DEBUG oslo_service.service [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.488 137259 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.488 137259 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.488 137259 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.489 137259 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.489 137259 DEBUG oslo_service.service [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.489 137259 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.489 137259 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.490 137259 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.490 137259 DEBUG oslo_service.service [-] ovn.ovn_sb_connection = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.490 137259 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.490 137259 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.490 137259 DEBUG oslo_service.service [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.491 137259 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.491 137259 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.491 137259 DEBUG oslo_service.service [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.491 137259 DEBUG oslo_service.service [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.491 137259 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.492 137259 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.492 137259 DEBUG oslo_service.service [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.492 137259 DEBUG oslo_service.service [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.492 137259 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.493 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.493 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.493 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.493 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.494 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.494 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.494 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.494 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.494 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.495 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.495 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.495 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.495 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.496 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.496 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.496 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.496 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.496 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.497 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.497 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.497 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.497 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.497 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.498 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.498 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.498 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.498 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.499 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.499 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.499 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.500 137259 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.500 137259 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.500 137259 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.500 137259 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.501 137259 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:31:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:31:11.501 137259 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 6 04:31:11 localhost systemd-logind[760]: New session 38 of user zuul.
Dec 6 04:31:11 localhost systemd[1]: Started Session 38 of User zuul.
Dec 6 04:31:12 localhost python3.9[137489]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:31:14 localhost python3.9[137585]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:31:15 localhost python3.9[137690]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:31:15 localhost systemd[1]: libpod-45487e39c92e1400b369c663be6072374515d7a57a0da87095a834cd0e4fd196.scope: Deactivated successfully.
Dec 6 04:31:15 localhost podman[137691]: 2025-12-06 09:31:15.505617312 +0000 UTC m=+0.081178572 container died 45487e39c92e1400b369c663be6072374515d7a57a0da87095a834cd0e4fd196 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack
TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4) Dec 6 04:31:15 localhost podman[137691]: 2025-12-06 09:31:15.539190198 +0000 UTC m=+0.114751408 container cleanup 45487e39c92e1400b369c663be6072374515d7a57a0da87095a834cd0e4fd196 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 6 04:31:15 localhost podman[137705]: 2025-12-06 09:31:15.602248517 +0000 UTC m=+0.084233289 container remove 45487e39c92e1400b369c663be6072374515d7a57a0da87095a834cd0e4fd196 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1) Dec 6 04:31:15 localhost systemd[1]: libpod-conmon-45487e39c92e1400b369c663be6072374515d7a57a0da87095a834cd0e4fd196.scope: Deactivated successfully. 
Dec 6 04:31:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10575 DF PROTO=TCP SPT=41628 DPT=9101 SEQ=3112103628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE91880000000001030307) Dec 6 04:31:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65184 DF PROTO=TCP SPT=45676 DPT=9105 SEQ=2029494251 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE93650000000001030307) Dec 6 04:31:16 localhost systemd[1]: var-lib-containers-storage-overlay-f0eb3924bf90abe79532c5d63ba0c64dcad199823811b739cedec44dad043767-merged.mount: Deactivated successfully. Dec 6 04:31:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-45487e39c92e1400b369c663be6072374515d7a57a0da87095a834cd0e4fd196-userdata-shm.mount: Deactivated successfully. Dec 6 04:31:16 localhost python3.9[137811]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:31:16 localhost systemd[1]: Reloading. Dec 6 04:31:16 localhost systemd-rc-local-generator[137837]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:31:16 localhost systemd-sysv-generator[137840]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:31:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:31:17 localhost python3.9[137936]: ansible-ansible.builtin.service_facts Invoked Dec 6 04:31:17 localhost network[137953]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 6 04:31:17 localhost network[137954]: 'network-scripts' will be removed from distribution in near future. Dec 6 04:31:17 localhost network[137955]: It is advised to switch to 'NetworkManager' instead for network management. Dec 6 04:31:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65186 DF PROTO=TCP SPT=45676 DPT=9105 SEQ=2029494251 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAE9F870000000001030307) Dec 6 04:31:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:31:22 localhost python3.9[138157]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:31:22 localhost systemd[1]: Reloading. Dec 6 04:31:22 localhost systemd-rc-local-generator[138177]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:31:22 localhost systemd-sysv-generator[138184]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:31:22 localhost systemd[1]: Stopped target tripleo_nova_libvirt.target. 
Dec 6 04:31:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64707 DF PROTO=TCP SPT=56482 DPT=9882 SEQ=3582759098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAEAF2F0000000001030307) Dec 6 04:31:23 localhost python3.9[138289]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:31:24 localhost python3.9[138382]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:31:25 localhost python3.9[138475]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:31:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33296 DF PROTO=TCP SPT=59840 DPT=9102 SEQ=4102946437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAEB9870000000001030307) Dec 6 04:31:26 localhost python3.9[138568]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:31:27 localhost python3.9[138661]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:31:27 localhost python3.9[138754]: 
ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:31:29 localhost python3.9[138847]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:29 localhost python3.9[138939]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36294 DF PROTO=TCP SPT=43500 DPT=9101 SEQ=2615309278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAECA370000000001030307) Dec 6 04:31:30 localhost python3.9[139031]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:31 
localhost python3.9[139123]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36295 DF PROTO=TCP SPT=43500 DPT=9101 SEQ=2615309278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAECE470000000001030307) Dec 6 04:31:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:31:31 localhost podman[139216]: 2025-12-06 09:31:31.542645862 +0000 UTC m=+0.095448966 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, tcib_managed=true) Dec 6 04:31:31 localhost podman[139216]: 2025-12-06 09:31:31.608184178 +0000 UTC m=+0.160987222 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 04:31:31 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:31:31 localhost python3.9[139215]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:32 localhost python3.9[139332]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:32 localhost python3.9[139424]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 
04:31:33 localhost python3.9[139516]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:34 localhost python3.9[139608]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5149 DF PROTO=TCP SPT=41712 DPT=9100 SEQ=2861710797 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAEDB870000000001030307) Dec 6 04:31:34 localhost python3.9[139700]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:35 localhost python3.9[139792]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:31:35 localhost podman[139839]: 2025-12-06 09:31:35.552946703 +0000 UTC m=+0.078298636 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Dec 6 04:31:35 localhost podman[139839]: 2025-12-06 09:31:35.58828907 +0000 UTC m=+0.113640993 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent) Dec 6 04:31:35 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 04:31:35 localhost python3.9[139902]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:36 localhost python3.9[139994]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:37 localhost python3.9[140086]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:31:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19982 DF PROTO=TCP SPT=56182 DPT=9100 SEQ=906172445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAEE5870000000001030307) Dec 6 04:31:37 localhost python3.9[140178]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service 
|| systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:31:38 localhost python3.9[140270]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Dec 6 04:31:39 localhost python3.9[140362]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:31:39 localhost systemd[1]: Reloading. Dec 6 04:31:39 localhost systemd-rc-local-generator[140390]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:31:39 localhost systemd-sysv-generator[140393]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:31:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:31:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34386 DF PROTO=TCP SPT=56556 DPT=9100 SEQ=1545815538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAEF1470000000001030307) Dec 6 04:31:40 localhost python3.9[140490]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:31:41 localhost python3.9[140583]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:31:42 localhost python3.9[140676]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:31:43 localhost python3.9[140769]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:31:43 localhost python3.9[140862]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None 
removes=None stdin=None Dec 6 04:31:44 localhost python3.9[140955]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:31:45 localhost python3.9[141048]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:31:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36298 DF PROTO=TCP SPT=43500 DPT=9101 SEQ=2615309278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF05870000000001030307) Dec 6 04:31:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=117 DF PROTO=TCP SPT=40948 DPT=9105 SEQ=4278816680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF08940000000001030307) Dec 6 04:31:46 localhost python3.9[141141]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None Dec 6 04:31:47 localhost python3.9[141234]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Dec 6 04:31:48 localhost python3.9[141332]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False 
move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548798.ooo.test update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Dec 6 04:31:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=119 DF PROTO=TCP SPT=40948 DPT=9105 SEQ=4278816680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF14870000000001030307) Dec 6 04:31:50 localhost python3.9[141432]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 6 04:31:50 localhost python3.9[141486]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 6 04:31:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=120 DF PROTO=TCP SPT=40948 DPT=9105 SEQ=4278816680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF24470000000001030307) Dec 6 04:31:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4420 DF PROTO=TCP SPT=44534 DPT=9102 SEQ=3732836371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF2D880000000001030307) Dec 6 04:31:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33297 DF PROTO=TCP SPT=59840 DPT=9102 SEQ=4102946437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF37880000000001030307) Dec 6 04:32:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22305 DF PROTO=TCP SPT=43354 DPT=9101 SEQ=2254165420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF43870000000001030307) Dec 6 04:32:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:32:02 localhost podman[141554]: 2025-12-06 09:32:02.561203862 +0000 UTC m=+0.081399172 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125) Dec 6 04:32:02 localhost podman[141554]: 2025-12-06 09:32:02.633243862 +0000 UTC m=+0.153439152 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller) Dec 6 04:32:02 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:32:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10577 DF PROTO=TCP SPT=41628 DPT=9101 SEQ=3112103628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF4F870000000001030307) Dec 6 04:32:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:32:06 localhost systemd[1]: tmp-crun.fTzVOL.mount: Deactivated successfully. Dec 6 04:32:06 localhost podman[141583]: 2025-12-06 09:32:06.551194963 +0000 UTC m=+0.084751687 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true) Dec 6 04:32:06 localhost podman[141583]: 2025-12-06 09:32:06.58623339 +0000 UTC m=+0.119790134 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent) Dec 6 04:32:06 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 04:32:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:32:06.653 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:32:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:32:06.654 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:32:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:32:06.655 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:32:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22307 DF PROTO=TCP SPT=43354 DPT=9101 SEQ=2254165420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF5B480000000001030307) Dec 6 04:32:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61476 DF PROTO=TCP SPT=56498 DPT=9100 SEQ=4149970072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF66870000000001030307) Dec 6 04:32:15 localhost kernel: SELinux: Converting 2788 SID table entries... Dec 6 04:32:15 localhost kernel: SELinux: Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped). 
Dec 6 04:32:15 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 6 04:32:15 localhost kernel: SELinux: policy capability open_perms=1 Dec 6 04:32:15 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 6 04:32:15 localhost kernel: SELinux: policy capability always_check_network=0 Dec 6 04:32:15 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 6 04:32:15 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 6 04:32:15 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 6 04:32:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22308 DF PROTO=TCP SPT=43354 DPT=9101 SEQ=2254165420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF7B870000000001030307) Dec 6 04:32:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41282 DF PROTO=TCP SPT=48648 DPT=9105 SEQ=3484442075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF7DC50000000001030307) Dec 6 04:32:17 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=20 res=1 Dec 6 04:32:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41284 DF PROTO=TCP SPT=48648 DPT=9105 SEQ=3484442075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAF89C70000000001030307) Dec 6 04:32:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41285 DF PROTO=TCP SPT=48648 DPT=9105 SEQ=3484442075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1DAF99870000000001030307) Dec 6 04:32:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52058 DF PROTO=TCP SPT=35596 DPT=9102 SEQ=2948752655 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAFA3870000000001030307) Dec 6 04:32:26 localhost kernel: SELinux: Converting 2793 SID table entries... Dec 6 04:32:26 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 6 04:32:26 localhost kernel: SELinux: policy capability open_perms=1 Dec 6 04:32:26 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 6 04:32:26 localhost kernel: SELinux: policy capability always_check_network=0 Dec 6 04:32:26 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 6 04:32:26 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 6 04:32:26 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 6 04:32:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37986 DF PROTO=TCP SPT=54256 DPT=9101 SEQ=3350709855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAFB4970000000001030307) Dec 6 04:32:30 localhost sshd[142628]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:32:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37987 DF PROTO=TCP SPT=54256 DPT=9101 SEQ=3350709855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAFB8870000000001030307) Dec 6 04:32:33 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=21 res=1 Dec 6 04:32:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:32:33 localhost podman[142630]: 2025-12-06 09:32:33.556011762 +0000 UTC m=+0.079933071 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller) Dec 6 04:32:33 localhost podman[142630]: 2025-12-06 09:32:33.593240094 +0000 UTC m=+0.117161463 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:32:33 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:32:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61478 DF PROTO=TCP SPT=56498 DPT=9100 SEQ=4149970072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAFC7880000000001030307) Dec 6 04:32:35 localhost kernel: SELinux: Converting 2793 SID table entries... Dec 6 04:32:35 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 6 04:32:35 localhost kernel: SELinux: policy capability open_perms=1 Dec 6 04:32:35 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 6 04:32:35 localhost kernel: SELinux: policy capability always_check_network=0 Dec 6 04:32:35 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 6 04:32:35 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 6 04:32:35 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 6 04:32:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37989 DF PROTO=TCP SPT=54256 DPT=9101 SEQ=3350709855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAFD0470000000001030307) Dec 6 04:32:37 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=22 res=1 Dec 6 04:32:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 04:32:37 localhost podman[142664]: 2025-12-06 09:32:37.555932437 +0000 UTC m=+0.079187814 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 04:32:37 localhost podman[142664]: 2025-12-06 09:32:37.593260823 +0000 UTC m=+0.116516210 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:32:37 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:32:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45880 DF PROTO=TCP SPT=38120 DPT=9100 SEQ=2557091045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAFDBC70000000001030307) Dec 6 04:32:43 localhost kernel: SELinux: Converting 2793 SID table entries... 
Dec 6 04:32:43 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 6 04:32:43 localhost kernel: SELinux: policy capability open_perms=1 Dec 6 04:32:43 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 6 04:32:43 localhost kernel: SELinux: policy capability always_check_network=0 Dec 6 04:32:43 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 6 04:32:43 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 6 04:32:43 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 6 04:32:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37990 DF PROTO=TCP SPT=54256 DPT=9101 SEQ=3350709855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAFF1870000000001030307) Dec 6 04:32:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8790 DF PROTO=TCP SPT=41650 DPT=9105 SEQ=3402468018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAFF2F50000000001030307) Dec 6 04:32:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8792 DF PROTO=TCP SPT=41650 DPT=9105 SEQ=3402468018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DAFFF080000000001030307) Dec 6 04:32:53 localhost kernel: SELinux: Converting 2794 SID table entries... 
Dec 6 04:32:53 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 6 04:32:53 localhost kernel: SELinux: policy capability open_perms=1 Dec 6 04:32:53 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 6 04:32:53 localhost kernel: SELinux: policy capability always_check_network=0 Dec 6 04:32:53 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 6 04:32:53 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 6 04:32:53 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 6 04:32:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39354 DF PROTO=TCP SPT=46206 DPT=9882 SEQ=1257205126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB00EBE0000000001030307) Dec 6 04:32:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60530 DF PROTO=TCP SPT=50926 DPT=9102 SEQ=383753921 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB017870000000001030307) Dec 6 04:33:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24970 DF PROTO=TCP SPT=37090 DPT=9101 SEQ=954368252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB029CB0000000001030307) Dec 6 04:33:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24971 DF PROTO=TCP SPT=37090 DPT=9101 SEQ=954368252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB02E0D0000000001030307) Dec 6 04:33:01 localhost kernel: SELinux: Converting 2794 SID table entries... 
Dec 6 04:33:01 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 6 04:33:01 localhost kernel: SELinux: policy capability open_perms=1 Dec 6 04:33:01 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 6 04:33:01 localhost kernel: SELinux: policy capability always_check_network=0 Dec 6 04:33:01 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 6 04:33:01 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 6 04:33:01 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 6 04:33:02 localhost systemd[1]: Reloading. Dec 6 04:33:02 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=25 res=1 Dec 6 04:33:02 localhost systemd-rc-local-generator[142810]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:33:02 localhost systemd-sysv-generator[142813]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:33:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:33:02 localhost systemd[1]: Reloading. Dec 6 04:33:02 localhost systemd-rc-local-generator[142847]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:33:02 localhost systemd-sysv-generator[142850]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:33:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:33:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22310 DF PROTO=TCP SPT=43354 DPT=9101 SEQ=2254165420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB039870000000001030307) Dec 6 04:33:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:33:04 localhost podman[142862]: 2025-12-06 09:33:04.555357026 +0000 UTC m=+0.080351456 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3) Dec 6 04:33:04 localhost podman[142862]: 2025-12-06 09:33:04.595141329 +0000 UTC m=+0.120135799 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:33:04 localhost systemd[1]: 
da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:33:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:33:06.654 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:33:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:33:06.654 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:33:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:33:06.656 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:33:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61479 DF PROTO=TCP SPT=56498 DPT=9100 SEQ=4149970072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB045870000000001030307) Dec 6 04:33:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:33:08 localhost systemd[1]: tmp-crun.dHJXlN.mount: Deactivated successfully. 
Dec 6 04:33:08 localhost podman[142887]: 2025-12-06 09:33:08.558220465 +0000 UTC m=+0.090335143 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:33:08 localhost podman[142887]: 2025-12-06 09:33:08.563904118 +0000 UTC m=+0.096018806 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 04:33:08 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:33:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50038 DF PROTO=TCP SPT=56834 DPT=9100 SEQ=510802148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB050C70000000001030307) Dec 6 04:33:11 localhost kernel: SELinux: Converting 2795 SID table entries... 
Dec 6 04:33:11 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 6 04:33:11 localhost kernel: SELinux: policy capability open_perms=1 Dec 6 04:33:11 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 6 04:33:11 localhost kernel: SELinux: policy capability always_check_network=0 Dec 6 04:33:11 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 6 04:33:11 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 6 04:33:11 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 6 04:33:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24974 DF PROTO=TCP SPT=37090 DPT=9101 SEQ=954368252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB065870000000001030307) Dec 6 04:33:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24338 DF PROTO=TCP SPT=33904 DPT=9105 SEQ=749169810 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB068250000000001030307) Dec 6 04:33:16 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload. 
Dec 6 04:33:16 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=26 res=1 Dec 6 04:33:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24340 DF PROTO=TCP SPT=33904 DPT=9105 SEQ=749169810 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB074470000000001030307) Dec 6 04:33:22 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:22 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:22 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:23 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:23 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32741 DF PROTO=TCP SPT=39388 DPT=9882 SEQ=2093421610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB083EE0000000001030307) Dec 6 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service 
type, ignoring: notify-reload Dec 6 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39359 DF PROTO=TCP SPT=46206 DPT=9882 SEQ=1257205126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB08B880000000001030307) Dec 6 04:33:25 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Dec 6 04:33:25 localhost systemd[1]: Starting man-db-cache-update.service... Dec 6 04:33:25 localhost systemd[1]: Reloading. Dec 6 04:33:25 localhost systemd-sysv-generator[143939]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:33:25 localhost systemd-rc-local-generator[143934]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 04:33:25 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:25 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:33:25 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:25 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:25 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:25 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:25 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:25 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:25 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:25 localhost systemd[1]: Queuing reload/restart jobs for marked units… Dec 6 04:33:26 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. 
Dec 6 04:33:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5855 DF PROTO=TCP SPT=45960 DPT=9101 SEQ=2930427603 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB09EF70000000001030307) Dec 6 04:33:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5856 DF PROTO=TCP SPT=45960 DPT=9101 SEQ=2930427603 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB0A3070000000001030307) Dec 6 04:33:32 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Dec 6 04:33:32 localhost systemd[1]: Finished man-db-cache-update.service. Dec 6 04:33:32 localhost systemd[1]: man-db-cache-update.service: Consumed 5.783s CPU time. Dec 6 04:33:32 localhost systemd[1]: run-rd6009606d37040c19c0c878c126320fc.service: Deactivated successfully. Dec 6 04:33:32 localhost systemd[1]: run-rf9a305575908478ebb7246bfd0d10979.service: Deactivated successfully. Dec 6 04:33:33 localhost python3.9[150056]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Dec 6 04:33:33 localhost systemd[1]: Reloading. Dec 6 04:33:33 localhost systemd-sysv-generator[150085]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:33:33 localhost systemd-rc-local-generator[150080]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 04:33:33 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:33:33 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:33 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:33 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:33 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:33 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:33 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:33 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37992 DF PROTO=TCP SPT=54256 DPT=9101 SEQ=3350709855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB0AF880000000001030307) Dec 6 04:33:34 localhost python3.9[150205]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Dec 6 04:33:34 localhost systemd[1]: Reloading. 
Dec 6 04:33:34 localhost systemd-rc-local-generator[150232]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:33:34 localhost systemd-sysv-generator[150237]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:33:34 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:33:34 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:34 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:34 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:34 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:34 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:34 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:34 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:33:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:33:34 localhost podman[150245]: 2025-12-06 09:33:34.971639887 +0000 UTC m=+0.095738125 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller) Dec 6 04:33:35 localhost podman[150245]: 2025-12-06 09:33:35.013214451 +0000 UTC m=+0.137312719 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true) Dec 6 04:33:35 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:33:35 localhost python3.9[150379]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Dec 6 04:33:35 localhost systemd[1]: Reloading. Dec 6 04:33:35 localhost systemd-rc-local-generator[150403]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 04:33:35 localhost systemd-sysv-generator[150409]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:33:35 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:33:35 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:35 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:35 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:35 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:35 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:35 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:35 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:36 localhost python3.9[150527]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 6 04:33:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5858 DF PROTO=TCP SPT=45960 DPT=9101 SEQ=2930427603 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB0BAC70000000001030307)
Dec 6 04:33:37 localhost systemd[1]: Reloading.
Dec 6 04:33:37 localhost systemd-rc-local-generator[150554]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:33:37 localhost systemd-sysv-generator[150561]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:33:37 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:33:37 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:37 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:37 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:37 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:37 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:37 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:37 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.
Dec 6 04:33:39 localhost podman[150584]: 2025-12-06 09:33:39.565916391 +0000 UTC m=+0.093902364 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 6 04:33:39 localhost podman[150584]: 2025-12-06 09:33:39.601604052 +0000 UTC m=+0.129589995 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 6 04:33:39 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully.
Dec 6 04:33:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26510 DF PROTO=TCP SPT=45256 DPT=9100 SEQ=4083966983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB0C6070000000001030307)
Dec 6 04:33:40 localhost python3.9[150694]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:40 localhost systemd[1]: Reloading.
Dec 6 04:33:40 localhost systemd-sysv-generator[150725]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:33:40 localhost systemd-rc-local-generator[150722]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:33:40 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:33:40 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:40 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:40 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:40 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:40 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:40 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:40 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:41 localhost python3.9[150843]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:41 localhost systemd[1]: Reloading.
Dec 6 04:33:41 localhost systemd-rc-local-generator[150868]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:33:41 localhost systemd-sysv-generator[150874]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:33:41 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:41 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:33:41 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:41 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:41 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:41 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:41 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:41 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:43 localhost python3.9[150991]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:43 localhost systemd[1]: Reloading.
Dec 6 04:33:43 localhost systemd-sysv-generator[151026]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:33:43 localhost systemd-rc-local-generator[151022]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:44 localhost python3.9[151140]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:45 localhost python3.9[151253]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5859 DF PROTO=TCP SPT=45960 DPT=9101 SEQ=2930427603 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB0DB880000000001030307)
Dec 6 04:33:45 localhost systemd[1]: Reloading.
Dec 6 04:33:45 localhost systemd-rc-local-generator[151282]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:33:45 localhost systemd-sysv-generator[151288]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64016 DF PROTO=TCP SPT=49362 DPT=9105 SEQ=3984235024 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB0DD560000000001030307)
Dec 6 04:33:47 localhost python3.9[151403]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 6 04:33:48 localhost systemd[1]: Reloading.
Dec 6 04:33:48 localhost systemd-rc-local-generator[151431]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:33:48 localhost systemd-sysv-generator[151434]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:33:48 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:48 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:48 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:48 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:33:48 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:48 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:48 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:48 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:33:48 localhost systemd[1]: Listening on libvirt proxy daemon socket.
Dec 6 04:33:48 localhost systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 6 04:33:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64018 DF PROTO=TCP SPT=49362 DPT=9105 SEQ=3984235024 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB0E9470000000001030307)
Dec 6 04:33:49 localhost python3.9[151555]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:51 localhost python3.9[151668]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:52 localhost python3.9[151781]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64019 DF PROTO=TCP SPT=49362 DPT=9105 SEQ=3984235024 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB0F9070000000001030307)
Dec 6 04:33:53 localhost python3.9[151894]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:55 localhost python3.9[152007]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17751 DF PROTO=TCP SPT=55550 DPT=9102 SEQ=1232860191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB103870000000001030307)
Dec 6 04:33:56 localhost python3.9[152120]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:57 localhost python3.9[152233]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:58 localhost python3.9[152346]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:58 localhost python3.9[152459]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:33:59 localhost sshd[152534]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:33:59 localhost python3.9[152573]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:34:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29029 DF PROTO=TCP SPT=45412 DPT=9101 SEQ=3944685242 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB114270000000001030307)
Dec 6 04:34:00 localhost python3.9[152686]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:34:01 localhost python3.9[152799]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:34:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29030 DF PROTO=TCP SPT=45412 DPT=9101 SEQ=3944685242 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB118470000000001030307)
Dec 6 04:34:02 localhost python3.9[152912]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:34:03 localhost python3.9[153025]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 6 04:34:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26512 DF PROTO=TCP SPT=45256 DPT=9100 SEQ=4083966983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB125870000000001030307)
Dec 6 04:34:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.
Dec 6 04:34:05 localhost podman[153046]: 2025-12-06 09:34:05.545999307 +0000 UTC m=+0.078644288 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec 6 04:34:05 localhost podman[153046]: 2025-12-06 09:34:05.583105422 +0000 UTC m=+0.115750433 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec 6 04:34:05 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully.
Dec 6 04:34:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:34:06.654 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 6 04:34:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:34:06.655 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 6 04:34:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:34:06.656 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 6 04:34:06 localhost python3.9[153164]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:34:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50041 DF PROTO=TCP SPT=56834 DPT=9100 SEQ=510802148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB12F870000000001030307)
Dec 6 04:34:07 localhost python3.9[153274]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:34:07 localhost python3.9[153384]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:34:08 localhost python3.9[153494]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:34:09 localhost python3.9[153604]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:34:09 localhost python3.9[153714]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:34:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58949 DF PROTO=TCP SPT=35364 DPT=9100 SEQ=3778148772 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB13B470000000001030307)
Dec 6 04:34:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.
Dec 6 04:34:10 localhost podman[153825]: 2025-12-06 09:34:10.437941614 +0000 UTC m=+0.083421947 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 6 04:34:10 localhost podman[153825]: 2025-12-06 09:34:10.441559098 +0000 UTC m=+0.087039491 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 04:34:10 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully.
Dec 6 04:34:10 localhost python3.9[153824]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:11 localhost python3.9[153931]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765013649.9273815-1643-181410287567795/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:11 localhost python3.9[154041]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:12 localhost python3.9[154131]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765013651.4702241-1643-84156481901152/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:13 localhost python3.9[154241]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:13 localhost python3.9[154331]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt 
mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765013652.603248-1643-189909519153837/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:14 localhost python3.9[154441]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:14 localhost python3.9[154531]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765013653.6867013-1643-250623543050620/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29033 DF PROTO=TCP SPT=45412 DPT=9101 SEQ=3944685242 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB14F880000000001030307) Dec 6 04:34:15 localhost python3.9[154641]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=21765 DF PROTO=TCP SPT=46976 DPT=9105 SEQ=857914173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB152850000000001030307) Dec 6 04:34:16 localhost python3.9[154731]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765013655.4069026-1643-56509658234562/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:17 localhost python3.9[154841]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:18 localhost python3.9[154931]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765013657.105144-1643-110600902309114/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:18 localhost python3.9[155041]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:19 localhost python3.9[155131]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765013658.25951-1643-113542846541990/.source.conf follow=False 
_original_basename=auth.conf checksum=baf2d067e38e8c4c12a355eaf63b5326efc72396 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21767 DF PROTO=TCP SPT=46976 DPT=9105 SEQ=857914173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB15E870000000001030307) Dec 6 04:34:19 localhost python3.9[155241]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:20 localhost python3.9[155331]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765013659.3866627-1643-155627885127086/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:20 localhost python3.9[155441]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=m4JQidYnUsrccwakQIVjPibbA _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None Dec 6 04:34:21 localhost python3.9[155552]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False 
follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:22 localhost python3.9[155662]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:22 localhost python3.9[155772]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21768 DF PROTO=TCP SPT=46976 DPT=9105 SEQ=857914173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB16E480000000001030307) Dec 6 04:34:23 localhost python3.9[155882]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 
04:34:24 localhost python3.9[155992]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:24 localhost python3.9[156102]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3528 DF PROTO=TCP SPT=36652 DPT=9102 SEQ=1169431000 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB177870000000001030307) Dec 6 04:34:25 localhost python3.9[156212]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:26 localhost python3.9[156322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:27 localhost python3.9[156432]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17752 DF PROTO=TCP SPT=55550 DPT=9102 SEQ=1232860191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB181880000000001030307) Dec 6 04:34:28 localhost python3.9[156542]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:28 localhost python3.9[156652]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:29 localhost python3.9[156762]: 
ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:30 localhost python3.9[156872]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:30 localhost python3.9[156982]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11247 DF PROTO=TCP SPT=40234 DPT=9101 SEQ=2971603907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB18D470000000001030307) Dec 6 04:34:31 localhost python3.9[157092]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:31 localhost 
sshd[157109]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:34:32 localhost python3.9[157182]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013671.0908864-2306-109808726260096/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:32 localhost python3.9[157292]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:33 localhost python3.9[157380]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013672.4783578-2306-4960905859391/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:33 localhost python3.9[157490]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5861 DF PROTO=TCP SPT=45960 
DPT=9101 SEQ=2930427603 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB199870000000001030307) Dec 6 04:34:34 localhost python3.9[157578]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013673.4455867-2306-196835999541806/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:34 localhost python3.9[157688]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:34:36 localhost podman[157777]: 2025-12-06 09:34:36.149015266 +0000 UTC m=+0.102396587 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:34:36 localhost python3.9[157776]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013674.4903514-2306-17532738137561/.source.conf 
follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:36 localhost podman[157777]: 2025-12-06 09:34:36.257320447 +0000 UTC m=+0.210701718 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, 
config_id=ovn_controller) Dec 6 04:34:36 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:34:36 localhost python3.9[157910]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11249 DF PROTO=TCP SPT=40234 DPT=9101 SEQ=2971603907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB1A5080000000001030307) Dec 6 04:34:37 localhost python3.9[157998]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013676.3394892-2306-264056690839804/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:38 localhost python3.9[158108]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:38 localhost python3.9[158196]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013677.8618329-2306-144733859961381/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 
checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:39 localhost python3.9[158306]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:40 localhost python3.9[158394]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013678.9608324-2306-244775022827317/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21866 DF PROTO=TCP SPT=41342 DPT=9100 SEQ=2229792109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB1B0480000000001030307) Dec 6 04:34:40 localhost python3.9[158504]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 04:34:41 localhost podman[158592]: 2025-12-06 09:34:41.017257784 +0000 UTC m=+0.081991313 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 04:34:41 localhost podman[158592]: 2025-12-06 09:34:41.025258079 +0000 UTC m=+0.089991588 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:34:41 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:34:41 localhost python3.9[158593]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013680.1400023-2306-70919701232526/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:41 localhost python3.9[158720]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:42 localhost python3.9[158808]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013681.2840524-2306-25350561528503/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:42 localhost python3.9[158918]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:43 localhost python3.9[159006]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013682.356874-2306-265873591645228/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:43 localhost python3.9[159116]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:44 localhost python3.9[159204]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013683.3961856-2306-156971657131849/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:44 localhost python3.9[159314]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False 
get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:45 localhost python3.9[159402]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013684.4726634-2306-29088910577100/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11250 DF PROTO=TCP SPT=40234 DPT=9101 SEQ=2971603907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB1C5870000000001030307) Dec 6 04:34:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3697 DF PROTO=TCP SPT=35470 DPT=9105 SEQ=1952184487 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB1C7B50000000001030307) Dec 6 04:34:46 localhost python3.9[159512]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:47 localhost python3.9[159600]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013685.5317018-2306-157459363076326/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 
checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:47 localhost python3.9[159710]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:34:48 localhost python3.9[159798]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013687.3177168-2306-75115136951709/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3699 DF PROTO=TCP SPT=35470 DPT=9105 SEQ=1952184487 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB1D3C80000000001030307) Dec 6 04:34:50 localhost python3.9[159906]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:34:50 localhost python3.9[160019]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False Dec 6 04:34:51 localhost 
python3.9[160129]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:52 localhost python3.9[160239]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:53 localhost python3.9[160349]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45047 DF PROTO=TCP SPT=42514 DPT=9882 SEQ=129392137 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB1E37E0000000001030307) Dec 6 04:34:53 localhost python3.9[160459]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False 
_original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:54 localhost python3.9[160569]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:54 localhost python3.9[160679]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:55 localhost python3.9[160789]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59776 DF PROTO=TCP SPT=47256 DPT=9102 SEQ=4266472916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB1ED880000000001030307) Dec 6 04:34:56 localhost python3.9[160899]: ansible-ansible.legacy.copy Invoked with 
dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:56 localhost python3.9[161009]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:58 localhost python3.9[161119]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:34:58 localhost python3.9[161229]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 6 04:34:58 localhost systemd[1]: Reloading. Dec 6 04:34:58 localhost systemd-rc-local-generator[161253]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:34:58 localhost systemd-sysv-generator[161258]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 04:34:58 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:34:58 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:34:58 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:34:58 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:34:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:34:58 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:34:58 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:34:58 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:34:58 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:34:59 localhost systemd[1]: Starting libvirt logging daemon socket... Dec 6 04:34:59 localhost systemd[1]: Listening on libvirt logging daemon socket. Dec 6 04:34:59 localhost systemd[1]: Starting libvirt logging daemon admin socket... Dec 6 04:34:59 localhost systemd[1]: Listening on libvirt logging daemon admin socket. Dec 6 04:34:59 localhost systemd[1]: Starting libvirt logging daemon... Dec 6 04:34:59 localhost systemd[1]: Started libvirt logging daemon. 
Dec 6 04:34:59 localhost python3.9[161381]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 6 04:34:59 localhost systemd[1]: Reloading. Dec 6 04:35:00 localhost systemd-rc-local-generator[161404]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:35:00 localhost systemd-sysv-generator[161411]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:35:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:35:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63357 DF PROTO=TCP SPT=45110 DPT=9101 SEQ=3216745641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB1FE870000000001030307) Dec 6 04:35:00 localhost systemd[1]: Starting libvirt nodedev daemon socket... Dec 6 04:35:00 localhost systemd[1]: Listening on libvirt nodedev daemon socket. Dec 6 04:35:00 localhost systemd[1]: Starting libvirt nodedev daemon admin socket... Dec 6 04:35:00 localhost systemd[1]: Starting libvirt nodedev daemon read-only socket... Dec 6 04:35:00 localhost systemd[1]: Listening on libvirt nodedev daemon admin socket. Dec 6 04:35:00 localhost systemd[1]: Listening on libvirt nodedev daemon read-only socket. Dec 6 04:35:00 localhost systemd[1]: Started libvirt nodedev daemon. Dec 6 04:35:01 localhost python3.9[161555]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 6 04:35:01 localhost systemd[1]: Reloading. Dec 6 04:35:01 localhost systemd-sysv-generator[161583]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:35:01 localhost systemd-rc-local-generator[161578]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63358 DF PROTO=TCP SPT=45110 DPT=9101 SEQ=3216745641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB202880000000001030307) Dec 6 04:35:01 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... 
Dec 6 04:35:01 localhost systemd[1]: Starting libvirt proxy daemon admin socket... Dec 6 04:35:01 localhost systemd[1]: Starting libvirt proxy daemon read-only socket... Dec 6 04:35:01 localhost systemd[1]: Listening on libvirt proxy daemon admin socket. Dec 6 04:35:01 localhost systemd[1]: Listening on libvirt proxy daemon read-only socket. Dec 6 04:35:01 localhost systemd[1]: Started libvirt proxy daemon. Dec 6 04:35:01 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Dec 6 04:35:01 localhost setroubleshoot[161592]: Deleting alert e72bc9e7-2b90-4cf6-b52d-719ab939eaf0, it is allowed in current policy Dec 6 04:35:01 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service. Dec 6 04:35:02 localhost python3.9[161732]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 6 04:35:02 localhost systemd[1]: Reloading. Dec 6 04:35:02 localhost systemd-rc-local-generator[161759]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:35:02 localhost systemd-sysv-generator[161764]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 04:35:02 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:02 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:02 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:02 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:35:02 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:02 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:02 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:02 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:02 localhost systemd[1]: Listening on libvirt locking daemon socket. Dec 6 04:35:02 localhost systemd[1]: Starting libvirt QEMU daemon socket... Dec 6 04:35:02 localhost systemd[1]: Listening on libvirt QEMU daemon socket. Dec 6 04:35:02 localhost systemd[1]: Starting libvirt QEMU daemon admin socket... Dec 6 04:35:02 localhost systemd[1]: Starting libvirt QEMU daemon read-only socket... Dec 6 04:35:02 localhost systemd[1]: Listening on libvirt QEMU daemon admin socket. Dec 6 04:35:02 localhost systemd[1]: Listening on libvirt QEMU daemon read-only socket. Dec 6 04:35:02 localhost systemd[1]: Started libvirt QEMU daemon. 
Dec 6 04:35:02 localhost setroubleshoot[161592]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 67b32f37-2d41-4208-8bbf-6e5a018a1a89 Dec 6 04:35:02 localhost setroubleshoot[161592]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012 Dec 6 04:35:02 localhost setroubleshoot[161592]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. 
For complete SELinux messages run: sealert -l 67b32f37-2d41-4208-8bbf-6e5a018a1a89 Dec 6 04:35:02 localhost setroubleshoot[161592]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012 Dec 6 04:35:03 localhost python3.9[161917]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 6 04:35:03 localhost systemd[1]: Reloading. Dec 6 04:35:03 localhost systemd-sysv-generator[161954]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:35:03 localhost systemd-rc-local-generator[161949]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 04:35:03 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:03 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:03 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:03 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:35:03 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:03 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:03 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:03 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:03 localhost systemd[1]: Starting libvirt secret daemon socket... Dec 6 04:35:03 localhost systemd[1]: Listening on libvirt secret daemon socket. Dec 6 04:35:03 localhost systemd[1]: Starting libvirt secret daemon admin socket... Dec 6 04:35:03 localhost systemd[1]: Starting libvirt secret daemon read-only socket... Dec 6 04:35:03 localhost systemd[1]: Listening on libvirt secret daemon admin socket. Dec 6 04:35:03 localhost systemd[1]: Listening on libvirt secret daemon read-only socket. Dec 6 04:35:03 localhost systemd[1]: Started libvirt secret daemon. 
Dec 6 04:35:04 localhost python3.9[162099]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21868 DF PROTO=TCP SPT=41342 DPT=9100 SEQ=2229792109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB211870000000001030307) Dec 6 04:35:05 localhost python3.9[162209]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Dec 6 04:35:06 localhost python3.9[162319]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:35:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:35:06 localhost podman[162336]: 2025-12-06 09:35:06.560071605 +0000 UTC m=+0.088776832 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 04:35:06 localhost podman[162336]: 2025-12-06 09:35:06.593751772 +0000 UTC m=+0.122456989 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:35:06 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:35:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:35:06.655 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:35:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:35:06.656 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:35:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:35:06.658 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:35:06 localhost python3.9[162432]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013705.8934183-3341-233713669672660/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63360 DF PROTO=TCP SPT=45110 DPT=9101 SEQ=3216745641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB21A470000000001030307) Dec 6 04:35:07 localhost python3.9[162542]: ansible-ansible.builtin.file Invoked with group=root 
mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:08 localhost python3.9[162652]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:35:09 localhost python3.9[162709]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:09 localhost python3.9[162819]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:35:10 localhost python3.9[162876]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.oogacl31 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None 
attributes=None Dec 6 04:35:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13205 DF PROTO=TCP SPT=38786 DPT=9100 SEQ=3456975703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB225870000000001030307) Dec 6 04:35:11 localhost python3.9[162986]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:35:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:35:11 localhost podman[163044]: 2025-12-06 09:35:11.498675464 +0000 UTC m=+0.079675467 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent) Dec 6 04:35:11 localhost podman[163044]: 2025-12-06 09:35:11.533221926 +0000 UTC m=+0.114221959 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Dec 6 04:35:11 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 04:35:11 localhost python3.9[163043]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:12 localhost python3.9[163170]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:35:12 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully. Dec 6 04:35:12 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Dec 6 04:35:13 localhost python3[163282]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Dec 6 04:35:13 localhost python3.9[163392]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:35:14 localhost python3.9[163449]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:14 localhost python3.9[163559]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False 
get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:35:15 localhost python3.9[163616]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63361 DF PROTO=TCP SPT=45110 DPT=9101 SEQ=3216745641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB23B870000000001030307) Dec 6 04:35:16 localhost python3.9[163726]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:35:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22550 DF PROTO=TCP SPT=33908 DPT=9105 SEQ=2716535835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB23CE50000000001030307) Dec 6 04:35:16 localhost python3.9[163783]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None 
selevel=None setype=None attributes=None Dec 6 04:35:17 localhost python3.9[163893]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:35:17 localhost python3.9[163950]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:18 localhost python3.9[164060]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:35:18 localhost python3.9[164150]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013717.7909207-3716-272022318081542/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22552 DF PROTO=TCP SPT=33908 DPT=9105 SEQ=2716535835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB249070000000001030307) Dec 6 04:35:19 localhost python3.9[164260]: ansible-ansible.builtin.file Invoked with 
group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:20 localhost python3.9[164370]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:35:21 localhost python3.9[164483]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:22 localhost python3.9[164593]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:35:23 localhost python3.9[164704]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True 
get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:35:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44501 DF PROTO=TCP SPT=60666 DPT=9882 SEQ=1077668606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB258AF0000000001030307) Dec 6 04:35:23 localhost python3.9[164816]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:35:24 localhost python3.9[164929]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:25 localhost python3.9[165039]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:35:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37302 DF PROTO=TCP SPT=46236 DPT=9102 SEQ=3226542933 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB261870000000001030307) Dec 6 04:35:25 localhost python3.9[165127]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 
src=/home/zuul/.ansible/tmp/ansible-tmp-1765013724.808337-3932-90386953049364/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:26 localhost python3.9[165237]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:35:27 localhost python3.9[165325]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013726.0616455-3977-55653937424392/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:27 localhost python3.9[165435]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:35:28 localhost python3.9[165523]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013727.2229562-4022-84498503954349/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:29 localhost python3.9[165633]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:35:29 localhost systemd[1]: Reloading. Dec 6 04:35:29 localhost systemd-rc-local-generator[165655]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:35:29 localhost systemd-sysv-generator[165661]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:35:29 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:29 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:29 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:29 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:35:29 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:29 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:29 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:29 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:29 localhost systemd[1]: Reached target edpm_libvirt.target. Dec 6 04:35:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50992 DF PROTO=TCP SPT=59312 DPT=9101 SEQ=2948210850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB273B80000000001030307) Dec 6 04:35:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50993 DF PROTO=TCP SPT=59312 DPT=9101 SEQ=2948210850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB277C70000000001030307) Dec 6 04:35:31 localhost python3.9[165783]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None Dec 6 04:35:31 localhost systemd[1]: Reloading. Dec 6 04:35:31 localhost systemd-sysv-generator[165812]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:35:31 localhost systemd-rc-local-generator[165807]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 04:35:31 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:31 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:31 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:31 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:35:31 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:31 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:31 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:31 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:31 localhost systemd[1]: Reloading. Dec 6 04:35:32 localhost systemd-rc-local-generator[165845]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:35:32 localhost systemd-sysv-generator[165851]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 04:35:32 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:32 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:32 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:32 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:35:32 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:32 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:32 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:32 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:35:33 localhost systemd-logind[760]: Session 38 logged out. Waiting for processes to exit. Dec 6 04:35:33 localhost systemd[1]: session-38.scope: Deactivated successfully. Dec 6 04:35:33 localhost systemd[1]: session-38.scope: Consumed 3min 3.640s CPU time. Dec 6 04:35:33 localhost systemd-logind[760]: Removed session 38. 
Dec 6 04:35:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11252 DF PROTO=TCP SPT=40234 DPT=9101 SEQ=2971603907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB283870000000001030307) Dec 6 04:35:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50995 DF PROTO=TCP SPT=59312 DPT=9101 SEQ=2948210850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB28F870000000001030307) Dec 6 04:35:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:35:37 localhost podman[165876]: 2025-12-06 09:35:37.57206523 +0000 UTC m=+0.099086414 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 04:35:37 localhost podman[165876]: 2025-12-06 09:35:37.656490363 +0000 UTC m=+0.183511557 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 04:35:37 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:35:38 localhost sshd[165900]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:35:38 localhost systemd-logind[760]: New session 39 of user zuul. Dec 6 04:35:38 localhost systemd[1]: Started Session 39 of User zuul. Dec 6 04:35:39 localhost python3.9[166011]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 6 04:35:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21445 DF PROTO=TCP SPT=42818 DPT=9100 SEQ=1712101469 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB29AC70000000001030307) Dec 6 04:35:40 localhost python3.9[166123]: ansible-ansible.builtin.service_facts Invoked Dec 6 04:35:40 localhost network[166140]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 6 04:35:40 localhost network[166141]: 'network-scripts' will be removed from distribution in near future. Dec 6 04:35:40 localhost network[166142]: It is advised to switch to 'NetworkManager' instead for network management. Dec 6 04:35:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 04:35:41 localhost podman[166163]: 2025-12-06 09:35:41.651110359 +0000 UTC m=+0.064943834 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 04:35:41 localhost podman[166163]: 2025-12-06 09:35:41.681973014 +0000 UTC m=+0.095806429 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:35:41 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:35:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:35:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50996 DF PROTO=TCP SPT=59312 DPT=9101 SEQ=2948210850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB2AF870000000001030307) Dec 6 04:35:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14699 DF PROTO=TCP SPT=56024 DPT=9105 SEQ=3037296067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB2B2150000000001030307) Dec 6 04:35:46 localhost python3.9[166394]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 6 04:35:47 localhost python3.9[166457]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False 
update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 6 04:35:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14701 DF PROTO=TCP SPT=56024 DPT=9105 SEQ=3037296067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB2BE070000000001030307) Dec 6 04:35:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14702 DF PROTO=TCP SPT=56024 DPT=9105 SEQ=3037296067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB2CDC70000000001030307) Dec 6 04:35:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44506 DF PROTO=TCP SPT=60666 DPT=9882 SEQ=1077668606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB2D5880000000001030307) Dec 6 04:35:55 localhost python3.9[166569]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:35:56 localhost python3.9[166681]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:35:57 localhost python3.9[166791]: ansible-ansible.legacy.command Invoked with 
_raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:35:58 localhost python3.9[166902]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:35:58 localhost python3.9[167013]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:35:59 localhost python3.9[167124]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:36:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17268 DF PROTO=TCP SPT=50042 DPT=9101 SEQ=23040163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB2E8E70000000001030307) Dec 6 04:36:00 localhost python3.9[167236]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Dec 6 04:36:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17269 DF PROTO=TCP SPT=50042 DPT=9101 SEQ=23040163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB2ED070000000001030307) Dec 6 04:36:01 localhost python3.9[167346]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:36:01 localhost systemd[1]: Listening on Open-iSCSI iscsid Socket. Dec 6 04:36:04 localhost python3.9[167460]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:36:04 localhost systemd[1]: Reloading. Dec 6 04:36:04 localhost systemd-rc-local-generator[167487]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:36:04 localhost systemd-sysv-generator[167491]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
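The `ansible-ansible.builtin.lineinfile` entry above sets `node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5` in `/etc/iscsi/iscsid.conf`, replacing any existing setting or adding it if absent. A minimal sketch of that replace-or-append semantics, pointed at a scratch path (`CONF` is an assumption for illustration; the real target is `/etc/iscsi/iscsid.conf` and editing it needs root):

```shell
# CONF is a hypothetical scratch copy of iscsid.conf for illustration.
CONF="${CONF:-/tmp/iscsid.conf}"
# Seed it with the commented-out default, as shipped by open-iscsi.
printf '%s\n' '#node.session.auth.chap_algs = MD5' > "$CONF"

LINE='node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5'
if grep -q '^node.session.auth.chap_algs' "$CONF"; then
    # An active setting already exists: rewrite it in place.
    sed -i "s|^node.session.auth.chap_algs.*|$LINE|" "$CONF"
else
    # No active setting: append one (the real lineinfile task inserts it
    # after the commented default via insertafter; appending is a
    # simplification).
    printf '%s\n' "$LINE" >> "$CONF"
fi
```

Running the snippet twice is idempotent: the second pass takes the `sed` branch and leaves the file unchanged.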
Dec 6 04:36:04 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:04 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:04 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:04 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:36:04 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:04 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:04 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:04 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63363 DF PROTO=TCP SPT=45110 DPT=9101 SEQ=3216745641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB2F9870000000001030307) Dec 6 04:36:04 localhost systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi). Dec 6 04:36:04 localhost systemd[1]: Starting Open-iSCSI... 
Dec 6 04:36:04 localhost iscsid[167501]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Dec 6 04:36:04 localhost iscsid[167501]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]. Dec 6 04:36:04 localhost iscsid[167501]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Dec 6 04:36:04 localhost iscsid[167501]: If using hardware iscsi like qla4xxx this message can be ignored. Dec 6 04:36:04 localhost iscsid[167501]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Dec 6 04:36:04 localhost iscsid[167501]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Dec 6 04:36:04 localhost iscsid[167501]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf Dec 6 04:36:04 localhost systemd[1]: Started Open-iSCSI. Dec 6 04:36:04 localhost systemd[1]: Starting Logout off all iSCSI sessions on shutdown... Dec 6 04:36:04 localhost systemd[1]: Finished Logout off all iSCSI sessions on shutdown. Dec 6 04:36:06 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... 
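The iscsid warning above spells out the fix: create `/etc/iscsi/initiatorname.iscsi` containing a single `InitiatorName=iqn.…` line. A hedged sketch of that remediation, writing to a scratch path (`TARGET` is an assumption for illustration; on a real host the file is `/etc/iscsi/initiatorname.iscsi`, the write needs root, and deployments typically generate a unique IQN with `iscsi-iname` rather than reusing the example value):

```shell
# TARGET is a hypothetical stand-in for /etc/iscsi/initiatorname.iscsi.
TARGET="${TARGET:-/tmp/initiatorname.iscsi}"

# Use the example IQN from the log message; format is
# iqn.yyyy-mm.<reversed domain name>[:identifier].
echo 'InitiatorName=iqn.2001-04.com.redhat:fc6' > "$TARGET"

# Sanity-check the iqn.yyyy-mm. prefix before (re)starting iscsid.
grep -Eq '^InitiatorName=iqn\.[0-9]{4}-[0-9]{2}\.' "$TARGET" && echo 'format ok'
```

With the file in place, the `ConditionPathExists=!/etc/iscsi/initiatorname.iscsi` check logged for iscsi.service would also flip, so the one-time configuration step would no longer be skipped.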
Dec 6 04:36:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:36:06.656 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:36:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:36:06.657 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:36:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:36:06.659 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:36:06 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Dec 6 04:36:06 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service. Dec 6 04:36:06 localhost python3.9[167613]: ansible-ansible.builtin.service_facts Invoked Dec 6 04:36:06 localhost network[167643]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 6 04:36:06 localhost network[167644]: 'network-scripts' will be removed from distribution in near future. Dec 6 04:36:06 localhost network[167645]: It is advised to switch to 'NetworkManager' instead for network management. 
Dec 6 04:36:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17271 DF PROTO=TCP SPT=50042 DPT=9101 SEQ=23040163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB304C80000000001030307) Dec 6 04:36:07 localhost setroubleshoot[167524]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b29df016-5323-4ad8-87ae-441cc3e84a9c Dec 6 04:36:07 localhost setroubleshoot[167524]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Dec 6 04:36:07 localhost setroubleshoot[167524]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b29df016-5323-4ad8-87ae-441cc3e84a9c Dec 6 04:36:07 localhost setroubleshoot[167524]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. 
confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Dec 6 04:36:07 localhost setroubleshoot[167524]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b29df016-5323-4ad8-87ae-441cc3e84a9c Dec 6 04:36:07 localhost setroubleshoot[167524]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Dec 6 04:36:07 localhost setroubleshoot[167524]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b29df016-5323-4ad8-87ae-441cc3e84a9c Dec 6 04:36:07 localhost setroubleshoot[167524]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. 
confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Dec 6 04:36:07 localhost setroubleshoot[167524]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b29df016-5323-4ad8-87ae-441cc3e84a9c Dec 6 04:36:07 localhost setroubleshoot[167524]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Dec 6 04:36:07 localhost setroubleshoot[167524]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b29df016-5323-4ad8-87ae-441cc3e84a9c Dec 6 04:36:07 localhost setroubleshoot[167524]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. 
confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Dec 6 04:36:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:36:08 localhost podman[167654]: 2025-12-06 09:36:08.053279919 +0000 UTC m=+0.079178698 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:36:08 localhost podman[167654]: 2025-12-06 09:36:08.099214682 +0000 UTC m=+0.125113421 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Dec 6 04:36:08 
localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:36:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:36:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57340 DF PROTO=TCP SPT=43560 DPT=9100 SEQ=3883040421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB311870000000001030307) Dec 6 04:36:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:36:12 localhost podman[167813]: 2025-12-06 09:36:12.55570852 +0000 UTC m=+0.083344864 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 04:36:12 localhost podman[167813]: 2025-12-06 09:36:12.589258142 +0000 UTC m=+0.116894516 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 04:36:12 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 04:36:13 localhost python3.9[167923]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Dec 6 04:36:14 localhost python3.9[168033]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Dec 6 04:36:15 localhost python3.9[168147]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:36:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17272 DF PROTO=TCP SPT=50042 DPT=9101 SEQ=23040163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB325880000000001030307) Dec 6 04:36:15 localhost python3.9[168235]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013774.8382726-455-245873141626976/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1089 DF PROTO=TCP SPT=52722 DPT=9105 SEQ=3421745597 
ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB327450000000001030307) Dec 6 04:36:16 localhost python3.9[168345]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:17 localhost python3.9[168455]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 6 04:36:17 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 6 04:36:17 localhost systemd[1]: Stopped Load Kernel Modules. Dec 6 04:36:17 localhost systemd[1]: Stopping Load Kernel Modules... Dec 6 04:36:17 localhost systemd[1]: Starting Load Kernel Modules... Dec 6 04:36:17 localhost systemd-modules-load[168459]: Module 'msr' is built in Dec 6 04:36:17 localhost systemd[1]: Finished Load Kernel Modules. Dec 6 04:36:17 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully. Dec 6 04:36:17 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. 
Dec 6 04:36:18 localhost python3.9[168569]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:36:19 localhost python3.9[168679]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:36:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1091 DF PROTO=TCP SPT=52722 DPT=9105 SEQ=3421745597 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB333470000000001030307) Dec 6 04:36:19 localhost python3.9[168789]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:36:20 localhost python3.9[168899]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:36:21 localhost python3.9[168987]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013780.113469-629-162903149929787/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None 
attributes=None Dec 6 04:36:21 localhost python3.9[169097]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:36:22 localhost python3.9[169208]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:23 localhost python3.9[169318]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:23 localhost systemd-journald[38691]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. Dec 6 04:36:23 localhost systemd-journald[38691]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 6 04:36:23 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 04:36:23 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 04:36:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1092 DF PROTO=TCP SPT=52722 DPT=9105 SEQ=3421745597 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB343070000000001030307) Dec 6 04:36:24 localhost python3.9[169429]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:25 localhost python3.9[169539]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58946 DF PROTO=TCP SPT=58732 DPT=9102 SEQ=2223770421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB34D870000000001030307) Dec 6 04:36:26 localhost python3.9[169649]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:26 localhost 
sshd[169760]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:36:26 localhost python3.9[169759]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:27 localhost python3.9[169871]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:28 localhost python3.9[169981]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:36:28 localhost python3.9[170093]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:29 localhost python3.9[170203]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:36:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22176 DF PROTO=TCP SPT=45398 DPT=9101 SEQ=1149477693 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB35E170000000001030307) Dec 6 04:36:30 localhost python3.9[170313]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:36:30 localhost python3.9[170370]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:36:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22177 DF PROTO=TCP SPT=45398 DPT=9101 SEQ=1149477693 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB362070000000001030307) Dec 6 04:36:31 localhost python3.9[170480]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:36:32 localhost python3.9[170537]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t 
dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:36:32 localhost python3.9[170647]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:33 localhost python3.9[170757]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:36:33 localhost python3.9[170814]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:34 localhost python3.9[170924]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:36:35 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57342 DF PROTO=TCP SPT=43560 DPT=9100 SEQ=3883040421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB371880000000001030307) Dec 6 04:36:35 localhost python3.9[170981]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:36 localhost python3.9[171091]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:36:36 localhost systemd[1]: Reloading. Dec 6 04:36:36 localhost systemd-sysv-generator[171121]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:36:36 localhost systemd-rc-local-generator[171114]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 04:36:36 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:36 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:36 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:36 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:36:36 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:36 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:36 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:36 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21448 DF PROTO=TCP SPT=42818 DPT=9100 SEQ=1712101469 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB379880000000001030307) Dec 6 04:36:37 localhost python3.9[171238]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:36:37 localhost python3.9[171295]: ansible-ansible.legacy.file Invoked with 
group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:36:38 localhost podman[171406]: 2025-12-06 09:36:38.725020266 +0000 UTC m=+0.064198032 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:36:38 localhost podman[171406]: 2025-12-06 09:36:38.764369814 +0000 UTC m=+0.103547580 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 04:36:38 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:36:38 localhost python3.9[171405]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:36:39 localhost python3.9[171487]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:40 localhost python3.9[171597]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:36:40 localhost systemd[1]: Reloading. Dec 6 04:36:40 localhost systemd-rc-local-generator[171623]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:36:40 localhost systemd-sysv-generator[171627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 04:36:40 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:40 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:40 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:40 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:36:40 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:40 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:40 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:40 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:36:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55981 DF PROTO=TCP SPT=59668 DPT=9100 SEQ=2743299997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB385070000000001030307) Dec 6 04:36:40 localhost systemd[1]: Starting Create netns directory... Dec 6 04:36:40 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 6 04:36:40 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 6 04:36:40 localhost systemd[1]: Finished Create netns directory. 
Dec 6 04:36:41 localhost python3.9[171748]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:36:41 localhost python3.9[171858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:36:42 localhost python3.9[171946]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013801.4337811-1250-277586786616399/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 6 04:36:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:36:43 localhost systemd[1]: tmp-crun.eNgf6Q.mount: Deactivated successfully. 
Dec 6 04:36:43 localhost podman[172057]: 2025-12-06 09:36:43.509019204 +0000 UTC m=+0.118811183 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 04:36:43 localhost podman[172057]: 2025-12-06 09:36:43.514726067 +0000 UTC m=+0.124518026 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:36:43 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:36:43 localhost python3.9[172056]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:36:44 localhost python3.9[172182]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:36:44 localhost python3.9[172270]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013803.7672303-1325-43100738491955/.source.json _original_basename=.6_cxauaa follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:45 localhost 
kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22180 DF PROTO=TCP SPT=45398 DPT=9101 SEQ=1149477693 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB399870000000001030307) Dec 6 04:36:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49861 DF PROTO=TCP SPT=51400 DPT=9105 SEQ=3682242111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB39C750000000001030307) Dec 6 04:36:46 localhost python3.9[172380]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:36:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49863 DF PROTO=TCP SPT=51400 DPT=9105 SEQ=3682242111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB3A8870000000001030307) Dec 6 04:36:49 localhost python3.9[172688]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False Dec 6 04:36:50 localhost python3.9[172798]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Dec 6 04:36:51 localhost python3.9[172908]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 6 04:36:53 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26918 DF PROTO=TCP SPT=52764 DPT=9882 SEQ=2736088779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB3B8400000000001030307) Dec 6 04:36:55 localhost python3[173045]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Dec 6 04:36:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16905 DF PROTO=TCP SPT=60334 DPT=9102 SEQ=3439243574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB3C1870000000001030307) Dec 6 04:36:57 localhost podman[173060]: 2025-12-06 09:36:55.754191921 +0000 UTC m=+0.044241987 image pull quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Dec 6 04:36:57 localhost podman[173109]: Dec 6 04:36:57 localhost podman[173109]: 2025-12-06 09:36:57.636269832 +0000 UTC m=+0.064920965 container create b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 04:36:57 localhost podman[173109]: 2025-12-06 09:36:57.598202898 +0000 UTC m=+0.026854091 image pull quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Dec 6 04:36:57 localhost python3[173045]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Dec 6 04:36:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58947 DF PROTO=TCP SPT=58732 DPT=9102 SEQ=2223770421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB3CB870000000001030307) Dec 6 
04:36:58 localhost python3.9[173254]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:36:59 localhost python3.9[173366]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:00 localhost python3.9[173421]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:37:00 localhost systemd[1]: virtnodedevd.service: Deactivated successfully. 
Dec 6 04:37:00 localhost python3.9[173531]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765013820.0757635-1589-159793496025137/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:01 localhost python3.9[173586]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:37:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1816 DF PROTO=TCP SPT=48428 DPT=9101 SEQ=2935520763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB3D7470000000001030307) Dec 6 04:37:01 localhost systemd[1]: Reloading. Dec 6 04:37:01 localhost systemd-rc-local-generator[173610]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:37:01 localhost systemd-sysv-generator[173616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 04:37:01 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:01 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:01 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:01 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:37:01 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:01 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:01 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:01 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:01 localhost systemd[1]: virtproxyd.service: Deactivated successfully. Dec 6 04:37:02 localhost python3.9[173678]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:37:03 localhost systemd[1]: Reloading. Dec 6 04:37:03 localhost systemd-rc-local-generator[173705]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:37:03 localhost systemd-sysv-generator[173711]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:37:03 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:03 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:03 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:03 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:37:03 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:03 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:03 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:03 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:03 localhost systemd[1]: virtsecretd.service: Deactivated successfully. Dec 6 04:37:03 localhost systemd[1]: Starting multipathd container... Dec 6 04:37:03 localhost systemd[1]: Started libcrun container. 
Dec 6 04:37:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/524921af210c02fdf193fb51de5ceda63cb949d9cbf1b15c17f3e72d28de9174/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Dec 6 04:37:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/524921af210c02fdf193fb51de5ceda63cb949d9cbf1b15c17f3e72d28de9174/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 6 04:37:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:37:03 localhost podman[173720]: 2025-12-06 09:37:03.762062767 +0000 UTC m=+0.152344162 container init b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2) Dec 6 04:37:03 localhost multipathd[173736]: + sudo -E kolla_set_configs Dec 6 04:37:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:37:03 localhost podman[173720]: 2025-12-06 09:37:03.795959923 +0000 UTC m=+0.186241268 container start b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:37:03 localhost podman[173720]: multipathd Dec 6 04:37:03 localhost systemd[1]: Started multipathd container. Dec 6 04:37:03 localhost multipathd[173736]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 6 04:37:03 localhost multipathd[173736]: INFO:__main__:Validating config file Dec 6 04:37:03 localhost multipathd[173736]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 6 04:37:03 localhost multipathd[173736]: INFO:__main__:Writing out command to execute Dec 6 04:37:03 localhost multipathd[173736]: ++ cat /run_command Dec 6 04:37:03 localhost multipathd[173736]: + CMD='/usr/sbin/multipathd -d' Dec 6 04:37:03 localhost multipathd[173736]: + ARGS= Dec 6 04:37:03 localhost multipathd[173736]: + sudo kolla_copy_cacerts Dec 6 04:37:03 localhost podman[173743]: 2025-12-06 09:37:03.880019867 +0000 UTC m=+0.080336600 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:37:03 localhost multipathd[173736]: + [[ ! -n '' ]] Dec 6 04:37:03 localhost multipathd[173736]: + . 
kolla_extend_start Dec 6 04:37:03 localhost multipathd[173736]: Running command: '/usr/sbin/multipathd -d' Dec 6 04:37:03 localhost multipathd[173736]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\''' Dec 6 04:37:03 localhost multipathd[173736]: + umask 0022 Dec 6 04:37:03 localhost multipathd[173736]: + exec /usr/sbin/multipathd -d Dec 6 04:37:03 localhost podman[173743]: 2025-12-06 09:37:03.888965723 +0000 UTC m=+0.089282476 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 04:37:03 localhost podman[173743]: unhealthy Dec 6 04:37:03 localhost multipathd[173736]: 10211.107145 | --------start up-------- Dec 6 04:37:03 localhost multipathd[173736]: 10211.107166 | read /etc/multipath.conf Dec 6 04:37:03 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:37:03 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Failed with result 'exit-code'. Dec 6 04:37:03 localhost multipathd[173736]: 10211.111178 | path checkers start up Dec 6 04:37:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17274 DF PROTO=TCP SPT=50042 DPT=9101 SEQ=23040163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB3E3880000000001030307) Dec 6 04:37:04 localhost python3.9[173883]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:37:05 localhost python3.9[173995]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:37:05 localhost python3.9[174118]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 6 04:37:05 localhost systemd[1]: Stopping multipathd 
container... Dec 6 04:37:06 localhost multipathd[173736]: 10213.275698 | exit (signal) Dec 6 04:37:06 localhost multipathd[173736]: 10213.276790 | --------shut down------- Dec 6 04:37:06 localhost systemd[1]: libpod-b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.scope: Deactivated successfully. Dec 6 04:37:06 localhost podman[174122]: 2025-12-06 09:37:06.106432516 +0000 UTC m=+0.112906915 container died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd) Dec 6 04:37:06 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.timer: Deactivated successfully. Dec 6 04:37:06 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:37:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b-userdata-shm.mount: Deactivated successfully. Dec 6 04:37:06 localhost systemd[1]: var-lib-containers-storage-overlay-524921af210c02fdf193fb51de5ceda63cb949d9cbf1b15c17f3e72d28de9174-merged.mount: Deactivated successfully. Dec 6 04:37:06 localhost podman[174122]: 2025-12-06 09:37:06.353909323 +0000 UTC m=+0.360383662 container cleanup b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true) Dec 6 04:37:06 localhost podman[174122]: multipathd Dec 6 04:37:06 localhost podman[174149]: 2025-12-06 09:37:06.454550289 +0000 UTC m=+0.067818554 container cleanup b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd) Dec 6 04:37:06 localhost podman[174149]: multipathd Dec 6 04:37:06 localhost systemd[1]: edpm_multipathd.service: Deactivated successfully. Dec 6 04:37:06 localhost systemd[1]: Stopped multipathd container. Dec 6 04:37:06 localhost systemd[1]: Starting multipathd container... Dec 6 04:37:06 localhost systemd[1]: Started libcrun container. Dec 6 04:37:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/524921af210c02fdf193fb51de5ceda63cb949d9cbf1b15c17f3e72d28de9174/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Dec 6 04:37:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/524921af210c02fdf193fb51de5ceda63cb949d9cbf1b15c17f3e72d28de9174/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 6 04:37:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:37:06 localhost podman[174162]: 2025-12-06 09:37:06.621297495 +0000 UTC m=+0.131233081 container init b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 6 04:37:06 localhost multipathd[174177]: + sudo -E kolla_set_configs
Dec 6 04:37:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.
Dec 6 04:37:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:37:06.657 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 6 04:37:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:37:06.659 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 6 04:37:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:37:06.660 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 6 04:37:06 localhost podman[174162]: 2025-12-06 09:37:06.661471835 +0000 UTC m=+0.171407411 container start b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd)
Dec 6 04:37:06 localhost podman[174162]: multipathd
Dec 6 04:37:06 localhost systemd[1]: Started multipathd container.
Dec 6 04:37:06 localhost multipathd[174177]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 6 04:37:06 localhost multipathd[174177]: INFO:__main__:Validating config file
Dec 6 04:37:06 localhost multipathd[174177]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 6 04:37:06 localhost multipathd[174177]: INFO:__main__:Writing out command to execute
Dec 6 04:37:06 localhost multipathd[174177]: ++ cat /run_command
Dec 6 04:37:06 localhost multipathd[174177]: + CMD='/usr/sbin/multipathd -d'
Dec 6 04:37:06 localhost multipathd[174177]: + ARGS=
Dec 6 04:37:06 localhost multipathd[174177]: + sudo kolla_copy_cacerts
Dec 6 04:37:06 localhost multipathd[174177]: + [[ ! -n '' ]]
Dec 6 04:37:06 localhost multipathd[174177]: + . kolla_extend_start
Dec 6 04:37:06 localhost multipathd[174177]: Running command: '/usr/sbin/multipathd -d'
Dec 6 04:37:06 localhost multipathd[174177]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 6 04:37:06 localhost multipathd[174177]: + umask 0022
Dec 6 04:37:06 localhost multipathd[174177]: + exec /usr/sbin/multipathd -d
Dec 6 04:37:06 localhost multipathd[174177]: 10213.967535 | --------start up--------
Dec 6 04:37:06 localhost multipathd[174177]: 10213.967557 | read /etc/multipath.conf
Dec 6 04:37:06 localhost multipathd[174177]: 10213.972307 | path checkers start up
Dec 6 04:37:06 localhost podman[174184]: 2025-12-06 09:37:06.75983133 +0000 UTC m=+0.092344771 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 6 04:37:06 localhost podman[174184]: 2025-12-06 09:37:06.769165488 +0000 UTC m=+0.101678879 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0)
Dec 6 04:37:06 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully.
Dec 6 04:37:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1818 DF PROTO=TCP SPT=48428 DPT=9101 SEQ=2935520763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB3EF080000000001030307)
Dec 6 04:37:08 localhost python3.9[174326]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:37:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.
Dec 6 04:37:09 localhost podman[174382]: 2025-12-06 09:37:09.557174678 +0000 UTC m=+0.081425324 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 6 04:37:09 localhost podman[174382]: 2025-12-06 09:37:09.62239941 +0000 UTC m=+0.146650086 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 6 04:37:09 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully.
Dec 6 04:37:10 localhost python3.9[174462]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 6 04:37:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59538 DF PROTO=TCP SPT=43116 DPT=9100 SEQ=3013635061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB3FA470000000001030307)
Dec 6 04:37:10 localhost python3.9[174572]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 6 04:37:11 localhost python3.9[174690]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:37:12 localhost python3.9[174778]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013831.1640596-1829-170529974634564/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:37:12 localhost python3.9[174888]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:37:13 localhost python3.9[174998]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 04:37:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.
Dec 6 04:37:13 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 6 04:37:13 localhost systemd[1]: Stopped Load Kernel Modules.
Dec 6 04:37:13 localhost systemd[1]: Stopping Load Kernel Modules...
Dec 6 04:37:13 localhost systemd[1]: Starting Load Kernel Modules...
Dec 6 04:37:13 localhost systemd-modules-load[175009]: Module 'msr' is built in
Dec 6 04:37:13 localhost systemd[1]: Finished Load Kernel Modules.
Dec 6 04:37:13 localhost podman[175000]: 2025-12-06 09:37:13.857208439 +0000 UTC m=+0.095030944 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 6 04:37:13 localhost podman[175000]: 2025-12-06 09:37:13.868212639 +0000 UTC m=+0.106035124 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 6 04:37:13 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully.
Dec 6 04:37:14 localhost python3.9[175130]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 6 04:37:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1819 DF PROTO=TCP SPT=48428 DPT=9101 SEQ=2935520763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB40F870000000001030307)
Dec 6 04:37:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3131 DF PROTO=TCP SPT=32832 DPT=9105 SEQ=1867094525 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB411A50000000001030307)
Dec 6 04:37:18 localhost systemd[1]: Reloading.
Dec 6 04:37:18 localhost systemd-rc-local-generator[175164]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:37:18 localhost systemd-sysv-generator[175170]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: Reloading.
Dec 6 04:37:18 localhost systemd-sysv-generator[175204]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:37:18 localhost systemd-rc-local-generator[175201]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:18 localhost systemd-logind[760]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 6 04:37:18 localhost systemd-logind[760]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 6 04:37:18 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 6 04:37:18 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 6 04:37:18 localhost systemd[1]: Reloading.
Dec 6 04:37:19 localhost systemd-rc-local-generator[175293]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:37:19 localhost systemd-sysv-generator[175297]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:37:19 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:19 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:19 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:19 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:37:19 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:19 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:19 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:19 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:19 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 6 04:37:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3133 DF PROTO=TCP SPT=32832 DPT=9105 SEQ=1867094525 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB41DC80000000001030307)
Dec 6 04:37:19 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 6 04:37:19 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 6 04:37:19 localhost systemd[1]: man-db-cache-update.service: Consumed 1.066s CPU time.
Dec 6 04:37:19 localhost systemd[1]: run-r1a534251d81646d9baf0f04454ddd8fa.service: Deactivated successfully.
Dec 6 04:37:21 localhost python3.9[176542]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 6 04:37:22 localhost python3.9[176656]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:37:23 localhost python3.9[176766]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 6 04:37:23 localhost systemd[1]: Reloading.
Dec 6 04:37:23 localhost systemd-sysv-generator[176790]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:37:23 localhost systemd-rc-local-generator[176787]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:37:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50069 DF PROTO=TCP SPT=41032 DPT=9882 SEQ=1060774817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB42D6E0000000001030307)
Dec 6 04:37:23 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:23 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:23 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:23 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:37:23 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:23 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:23 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:23 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:37:24 localhost python3.9[176910]: ansible-ansible.builtin.service_facts Invoked
Dec 6 04:37:24 localhost network[176927]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 6 04:37:24 localhost network[176928]: 'network-scripts' will be removed from distribution in near future.
Dec 6 04:37:24 localhost network[176929]: It is advised to switch to 'NetworkManager' instead for network management. Dec 6 04:37:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:37:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51999 DF PROTO=TCP SPT=34122 DPT=9102 SEQ=799464198 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB437880000000001030307) Dec 6 04:37:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27379 DF PROTO=TCP SPT=40374 DPT=9101 SEQ=439588984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB448770000000001030307) Dec 6 04:37:30 localhost python3.9[177165]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:37:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27380 DF PROTO=TCP SPT=40374 DPT=9101 SEQ=439588984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB44C880000000001030307) Dec 6 04:37:32 localhost python3.9[177276]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:37:33 localhost python3.9[177387]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False 
daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:37:34 localhost python3.9[177498]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:37:35 localhost python3.9[177609]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:37:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59540 DF PROTO=TCP SPT=43116 DPT=9100 SEQ=3013635061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB45B870000000001030307) Dec 6 04:37:35 localhost python3.9[177720]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:37:36 localhost python3.9[177831]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:37:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:37:36 localhost podman[177943]: 2025-12-06 09:37:36.960804579 +0000 UTC m=+0.083737626 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true) Dec 6 04:37:36 localhost podman[177943]: 2025-12-06 09:37:36.997242403 +0000 UTC m=+0.120175380 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 04:37:37 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:37:37 localhost python3.9[177942]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:37:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27382 DF PROTO=TCP SPT=40374 DPT=9101 SEQ=439588984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB464470000000001030307) Dec 6 04:37:38 localhost python3.9[178073]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:38 localhost python3.9[178183]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:39 localhost python3.9[178293]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None 
attributes=None Dec 6 04:37:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:37:39 localhost podman[178404]: 2025-12-06 09:37:39.892356367 +0000 UTC m=+0.082700442 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller) Dec 6 04:37:39 localhost podman[178404]: 2025-12-06 09:37:39.93909472 +0000 UTC m=+0.129438785 container exec_died 
da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Dec 6 04:37:39 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:37:39 localhost python3.9[178403]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52877 DF PROTO=TCP SPT=32926 DPT=9100 SEQ=2749801707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB46F880000000001030307) Dec 6 04:37:40 localhost python3.9[178539]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:41 localhost python3.9[178649]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:42 localhost python3.9[178759]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:42 localhost python3.9[178869]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:44 localhost python3.9[178979]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:37:44 localhost systemd[1]: tmp-crun.TqqnUk.mount: Deactivated successfully. 
Dec 6 04:37:44 localhost podman[179089]: 2025-12-06 09:37:44.555840524 +0000 UTC m=+0.085999075 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:37:44 localhost podman[179089]: 2025-12-06 09:37:44.565222063 +0000 UTC m=+0.095357153 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:37:44 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:37:44 localhost python3.9[179090]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:45 localhost python3.9[179217]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:45 localhost python3.9[179327]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Dec 6 04:37:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27383 DF PROTO=TCP SPT=40374 DPT=9101 SEQ=439588984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB485870000000001030307) Dec 6 04:37:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29520 DF PROTO=TCP SPT=43852 DPT=9105 SEQ=4142371317 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB486D60000000001030307) Dec 6 04:37:46 localhost python3.9[179437]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:46 localhost python3.9[179547]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:47 localhost python3.9[179657]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:48 localhost python3.9[179767]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:37:49 localhost python3.9[179877]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:37:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29522 DF PROTO=TCP SPT=43852 DPT=9105 SEQ=4142371317 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB492C70000000001030307) Dec 6 04:37:49 localhost python3.9[179987]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Dec 6 04:37:50 localhost python3.9[180097]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None 
enabled=None force=None masked=None Dec 6 04:37:50 localhost systemd[1]: Reloading. Dec 6 04:37:50 localhost systemd-rc-local-generator[180122]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:37:50 localhost systemd-sysv-generator[180127]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:37:51 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:51 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:51 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:51 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:37:51 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:51 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:51 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:51 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:37:52 localhost python3.9[180243]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:37:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29523 DF PROTO=TCP SPT=43852 DPT=9105 SEQ=4142371317 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB4A2870000000001030307) Dec 6 04:37:53 localhost python3.9[180354]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:37:53 localhost python3.9[180465]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:37:54 localhost python3.9[180576]: ansible-ansible.legacy.command Invoked with 
cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:37:55 localhost python3.9[180687]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:37:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41532 DF PROTO=TCP SPT=49076 DPT=9102 SEQ=2716656950 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB4AB880000000001030307) Dec 6 04:37:55 localhost python3.9[180798]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:37:57 localhost python3.9[180909]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:37:57 localhost python3.9[181020]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:37:59 localhost python3.9[181131]: ansible-ansible.builtin.file Invoked 
with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51776 DF PROTO=TCP SPT=54354 DPT=9101 SEQ=801853975 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB4BDA70000000001030307) Dec 6 04:38:00 localhost python3.9[181241]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:01 localhost python3.9[181351]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51777 DF PROTO=TCP SPT=54354 DPT=9101 SEQ=801853975 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1DB4C1C80000000001030307) Dec 6 04:38:02 localhost python3.9[181461]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:02 localhost python3.9[181571]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:03 localhost python3.9[181681]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:03 localhost python3.9[181791]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1821 DF PROTO=TCP SPT=48428 DPT=9101 SEQ=2935520763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB4CD880000000001030307) Dec 6 04:38:04 localhost python3.9[181901]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:05 localhost python3.9[182011]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:05 localhost python3.9[182121]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:38:06.658 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:38:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:38:06.659 137259 DEBUG oslo_concurrency.lockutils [-] Lock 
"_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:38:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:38:06.661 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:38:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59541 DF PROTO=TCP SPT=43116 DPT=9100 SEQ=3013635061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB4D9870000000001030307) Dec 6 04:38:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:38:07 localhost podman[182139]: 2025-12-06 09:38:07.558497717 +0000 UTC m=+0.090507951 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd) Dec 6 04:38:07 localhost podman[182139]: 2025-12-06 09:38:07.572452659 +0000 UTC m=+0.104462933 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 04:38:07 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:38:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21982 DF PROTO=TCP SPT=42970 DPT=9100 SEQ=1064973763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB4E4C70000000001030307) Dec 6 04:38:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:38:10 localhost podman[182159]: 2025-12-06 09:38:10.588001836 +0000 UTC m=+0.122973050 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 04:38:10 localhost podman[182159]: 2025-12-06 09:38:10.654832064 +0000 UTC m=+0.189803278 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 04:38:10 localhost systemd[1]: 
da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:38:11 localhost python3.9[182276]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Dec 6 04:38:12 localhost python3.9[182387]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Dec 6 04:38:13 localhost python3.9[182503]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548798.ooo.test update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Dec 6 04:38:14 localhost sshd[182529]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:38:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:38:14 localhost systemd-logind[760]: New session 40 of user zuul. Dec 6 04:38:14 localhost systemd[1]: Started Session 40 of User zuul. 
Dec 6 04:38:14 localhost podman[182531]: 2025-12-06 09:38:14.971589108 +0000 UTC m=+0.081213716 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true) Dec 6 04:38:15 localhost podman[182531]: 2025-12-06 09:38:15.003006365 +0000 UTC m=+0.112630983 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 04:38:15 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:38:15 localhost systemd[1]: session-40.scope: Deactivated successfully. Dec 6 04:38:15 localhost systemd-logind[760]: Session 40 logged out. Waiting for processes to exit. Dec 6 04:38:15 localhost systemd-logind[760]: Removed session 40. Dec 6 04:38:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51780 DF PROTO=TCP SPT=54354 DPT=9101 SEQ=801853975 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB4F9880000000001030307) Dec 6 04:38:15 localhost python3.9[182658]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:38:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24715 DF PROTO=TCP SPT=58190 DPT=9105 SEQ=1526083756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB4FC050000000001030307) Dec 6 04:38:16 localhost python3.9[182744]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t 
src=/home/zuul/.ansible/tmp/ansible-tmp-1765013895.2617335-3368-280756885907072/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:16 localhost python3.9[182852]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:38:17 localhost python3.9[182907]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:17 localhost python3.9[183015]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:38:18 localhost python3.9[183101]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013897.3670769-3368-103320822763770/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None 
serole=None selevel=None attributes=None Dec 6 04:38:19 localhost python3.9[183209]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:38:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24717 DF PROTO=TCP SPT=58190 DPT=9105 SEQ=1526083756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB508070000000001030307) Dec 6 04:38:19 localhost python3.9[183295]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013898.4478872-3368-48287077806495/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=4a38887ad65f37f06de8a6f0571c8572a75472b7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:20 localhost python3.9[183403]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:38:20 localhost python3.9[183489]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013899.841908-3368-241356968855146/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:21 localhost python3.9[183597]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:38:21 localhost python3.9[183683]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013900.900649-3368-168834398676572/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:22 localhost python3.9[183793]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:38:23 localhost python3.9[183903]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:38:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24718 DF PROTO=TCP SPT=58190 DPT=9105 SEQ=1526083756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB517C80000000001030307) Dec 6 04:38:23 localhost python3.9[184013]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:38:24 localhost python3.9[184125]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:38:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56844 DF PROTO=TCP SPT=40794 DPT=9882 SEQ=139562569 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB51F870000000001030307) Dec 6 04:38:25 localhost python3.9[184233]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:38:26 localhost python3.9[184343]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:38:26 localhost python3.9[184429]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t 
src=/home/zuul/.ansible/tmp/ansible-tmp-1765013905.721446-3743-86922507726717/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:27 localhost sshd[184501]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:38:27 localhost python3.9[184539]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:38:28 localhost python3.9[184625]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013907.407351-3788-159442871485351/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:38:29 localhost python3.9[184735]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Dec 6 04:38:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55712 DF PROTO=TCP SPT=37250 DPT=9101 SEQ=608649544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB532D70000000001030307) Dec 6 04:38:30 localhost python3.9[184845]: ansible-container_config_hash Invoked 
with check_mode=False config_vol_prefix=/var/lib/config-data Dec 6 04:38:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55713 DF PROTO=TCP SPT=37250 DPT=9101 SEQ=608649544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB536C70000000001030307) Dec 6 04:38:31 localhost python3[184955]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False Dec 6 04:38:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27385 DF PROTO=TCP SPT=40374 DPT=9101 SEQ=439588984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB543870000000001030307) Dec 6 04:38:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55715 DF PROTO=TCP SPT=37250 DPT=9101 SEQ=608649544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB54E870000000001030307) Dec 6 04:38:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:38:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43400 DF PROTO=TCP SPT=53802 DPT=9100 SEQ=1794764078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB559C70000000001030307) Dec 6 04:38:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:38:42 localhost podman[185007]: 2025-12-06 09:38:42.903552555 +0000 UTC m=+4.576640715 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125) Dec 6 04:38:42 localhost podman[185007]: 2025-12-06 09:38:42.919241933 +0000 UTC m=+4.592330163 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 04:38:42 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:38:42 localhost podman[184970]: 2025-12-06 09:38:31.496527795 +0000 UTC m=+0.035806506 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Dec 6 04:38:43 localhost podman[185020]: 2025-12-06 09:38:43.016355332 +0000 UTC m=+1.440253888 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125) Dec 6 04:38:43 localhost podman[185020]: 2025-12-06 09:38:43.05319356 +0000 UTC 
m=+1.477092146 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible) Dec 6 04:38:43 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:38:43 localhost podman[185077]: Dec 6 04:38:43 localhost podman[185077]: 2025-12-06 09:38:43.176762969 +0000 UTC m=+0.090159921 container create 9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 6 04:38:43 localhost podman[185077]: 2025-12-06 09:38:43.134152847 +0000 UTC m=+0.047549859 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Dec 6 04:38:43 localhost python3[184955]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 
'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init Dec 6 04:38:43 localhost python3.9[185224]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:38:45 localhost python3.9[185336]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Dec 6 04:38:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 04:38:45 localhost podman[185354]: 2025-12-06 09:38:45.588364324 +0000 UTC m=+0.112962193 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible) Dec 6 04:38:45 localhost podman[185354]: 2025-12-06 09:38:45.620299457 +0000 UTC m=+0.144897356 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 04:38:45 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:38:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55716 DF PROTO=TCP SPT=37250 DPT=9101 SEQ=608649544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB56F870000000001030307) Dec 6 04:38:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32622 DF PROTO=TCP SPT=59208 DPT=9105 SEQ=3450820357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB571350000000001030307) Dec 6 04:38:46 localhost python3.9[185462]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Dec 6 04:38:47 localhost python3[185572]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False Dec 6 04:38:47 localhost python3[185572]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3",#012 "Digest": 
"sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-01T06:31:10.62653219Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1211779450,#012 "VirtualSize": 1211779450,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/diff",#012 "WorkDir": 
"/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",#012 "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",#012 "sha256:baa8e0bc73d6b505f07c40d4f69a464312cc41ae2045c7975dd4759c27721a22",#012 "sha256:d0cde44181262e43c105085c32a5af158b232f2e2ce4fe4b50530d7cdc5126cd"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-01T06:09:28.025707917Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM 
quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025744608Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025767729Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025791379Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.02581523Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025867611Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.469442331Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:10:02.029095017Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Dec 6 04:38:47 localhost podman[185623]: 2025-12-06 09:38:47.826009655 +0000 UTC 
m=+0.115252686 container remove 3d4b499a7c3a2b55794a82c3d23218bdd4391e5e9b3a6e8b5192bc3a275ceba6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '4e7ada4fab3991cc27fb5f75a09b7e0f-f81b1d391c9b63868054d7733e636be7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute) Dec 6 04:38:47 localhost python3[185572]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute Dec 6 04:38:47 localhost podman[185637]: Dec 6 04:38:47 localhost podman[185637]: 2025-12-06 09:38:47.992049589 +0000 UTC m=+0.148267831 container create 5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_data={'image': 
'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 04:38:47 localhost podman[185637]: 2025-12-06 09:38:47.895383915 +0000 UTC m=+0.051602247 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Dec 6 04:38:47 localhost python3[185572]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start Dec 6 04:38:48 localhost python3.9[185783]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:38:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32624 DF PROTO=TCP SPT=59208 DPT=9105 SEQ=3450820357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB57D470000000001030307) Dec 6 04:38:49 
localhost python3.9[185895]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:38:50 localhost python3.9[186004]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765013930.1071389-4064-26640266801661/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:38:51 localhost python3.9[186059]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:38:51 localhost systemd[1]: Reloading. Dec 6 04:38:51 localhost systemd-rc-local-generator[186087]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:38:51 localhost systemd-sysv-generator[186091]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 04:38:51 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:51 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:51 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:51 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:38:51 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:51 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:51 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:51 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:52 localhost python3.9[186151]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:38:52 localhost systemd[1]: Reloading. Dec 6 04:38:52 localhost systemd-rc-local-generator[186180]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:38:52 localhost systemd-sysv-generator[186183]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:52 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:38:52 localhost systemd[1]: Starting nova_compute container... Dec 6 04:38:52 localhost systemd[1]: Started libcrun container. 
Dec 6 04:38:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Dec 6 04:38:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Dec 6 04:38:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 6 04:38:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 04:38:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 6 04:38:52 localhost podman[186191]: 2025-12-06 09:38:52.521096475 +0000 UTC m=+0.115968058 container init 5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 04:38:52 localhost podman[186191]: 2025-12-06 09:38:52.530188903 +0000 UTC m=+0.125060476 container start 5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:38:52 localhost podman[186191]: nova_compute Dec 6 04:38:52 localhost systemd[1]: Started nova_compute container. Dec 6 04:38:52 localhost nova_compute[186205]: + sudo -E kolla_set_configs Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Validating config file Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Copying service configuration files Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Deleting /etc/nova/nova.conf Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Setting permission for /etc/nova/nova.conf Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 6 04:38:52 localhost 
nova_compute[186205]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Deleting /etc/ceph Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Creating directory /etc/ceph Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Setting permission for /etc/ceph Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Deleting /usr/sbin/iscsiadm Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Writing out command to execute Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Dec 6 04:38:52 localhost nova_compute[186205]: 
INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Dec 6 04:38:52 localhost nova_compute[186205]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Dec 6 04:38:52 localhost nova_compute[186205]: ++ cat /run_command Dec 6 04:38:52 localhost nova_compute[186205]: + CMD=nova-compute Dec 6 04:38:52 localhost nova_compute[186205]: + ARGS= Dec 6 04:38:52 localhost nova_compute[186205]: + sudo kolla_copy_cacerts Dec 6 04:38:52 localhost nova_compute[186205]: Running command: 'nova-compute' Dec 6 04:38:52 localhost nova_compute[186205]: + [[ ! -n '' ]] Dec 6 04:38:52 localhost nova_compute[186205]: + . kolla_extend_start Dec 6 04:38:52 localhost nova_compute[186205]: + echo 'Running command: '\''nova-compute'\''' Dec 6 04:38:52 localhost nova_compute[186205]: + umask 0022 Dec 6 04:38:52 localhost nova_compute[186205]: + exec nova-compute Dec 6 04:38:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52514 DF PROTO=TCP SPT=56096 DPT=9882 SEQ=431551764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB58CFE0000000001030307) Dec 6 04:38:54 localhost nova_compute[186205]: 2025-12-06 09:38:54.330 186209 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 6 04:38:54 localhost nova_compute[186205]: 2025-12-06 09:38:54.330 186209 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 6 04:38:54 localhost nova_compute[186205]: 2025-12-06 09:38:54.330 186209 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 6 04:38:54 localhost nova_compute[186205]: 2025-12-06 09:38:54.330 186209 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, 
ovs#033[00m Dec 6 04:38:54 localhost nova_compute[186205]: 2025-12-06 09:38:54.450 186209 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:38:54 localhost nova_compute[186205]: 2025-12-06 09:38:54.460 186209 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:38:54 localhost nova_compute[186205]: 2025-12-06 09:38:54.460 186209 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.033 186209 INFO nova.virt.driver [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Dec 6 04:38:55 localhost python3.9[186329]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.150 186209 INFO nova.compute.provider_config [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] No provider configs found in /etc/nova/provider_config/. 
If files are present, ensure the Nova process has access.#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.167 186209 WARNING nova.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.167 186209 DEBUG oslo_concurrency.lockutils [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.168 186209 DEBUG oslo_concurrency.lockutils [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.168 186209 DEBUG oslo_concurrency.lockutils [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.168 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.168 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ******************************************************************************** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.168 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.168 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.169 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.169 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.169 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.169 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.169 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] backdoor_port = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.169 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.169 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.170 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.170 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.170 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.170 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.170 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.170 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.171 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.171 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.171 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] console_host = np0005548798.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.171 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.171 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.172 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] daemon = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.172 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.172 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.172 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.172 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.173 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost 
nova_compute[186205]: 2025-12-06 09:38:55.173 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.173 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.173 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.173 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.174 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.174 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.174 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.174 186209 DEBUG oslo_service.service 
[None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.175 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.175 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.175 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] host = np0005548798.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.175 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.175 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.176 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.176 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - 
- -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.176 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.176 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.176 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.177 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.177 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.177 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.177 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc 
- - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.177 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.178 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.178 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.178 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.178 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.178 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.178 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] log_dir = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.179 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.179 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.179 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.179 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.179 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.180 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.180 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.180 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.180 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.180 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.180 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.181 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.181 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] max_concurrent_live_migrations = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.181 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.181 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.181 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.182 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.182 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.182 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.182 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost 
nova_compute[186205]: 2025-12-06 09:38:55.182 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.182 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.183 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.183 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.183 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.183 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.183 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.184 
186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.184 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.184 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.184 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.184 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.185 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.185 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.185 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc 
- - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.185 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.185 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.185 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.186 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.186 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.186 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.186 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] rate_limit_interval = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.186 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.187 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.187 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.187 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.187 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.187 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.187 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 
09:38:55.188 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.188 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.188 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.188 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.188 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.189 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.189 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.189 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.189 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.189 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.190 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.190 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.190 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.190 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.190 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.191 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.191 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.191 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.191 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.191 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.192 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.192 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] sync_power_state_interval = 600 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.192 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.192 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.192 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.193 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.193 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.193 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.193 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 
2025-12-06 09:38:55.193 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.194 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.194 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.194 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.194 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.194 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.194 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.195 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plugging_is_fatal 
= True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.195 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.195 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.195 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.195 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.195 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.195 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.196 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.196 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.196 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.196 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.196 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.196 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.196 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.197 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - 
- -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.197 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.197 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.197 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.197 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.197 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.198 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.198 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.198 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.198 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.198 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.198 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.198 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.199 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.199 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.199 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.199 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.199 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.199 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.199 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.200 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.200 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.200 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.200 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.200 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.200 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.201 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.201 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.201 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.201 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.201 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.201 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.201 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.202 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.202 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.202 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.202 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.202 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.202 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.202 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.203 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.203 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.203 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.memcache_username = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.203 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.203 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.203 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.204 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.204 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.204 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.204 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 
localhost nova_compute[186205]: 2025-12-06 09:38:55.204 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.204 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.204 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.205 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.205 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.205 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.205 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.205 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.205 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.205 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.206 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.206 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.206 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.206 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.206 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.insecure = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.206 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.206 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.206 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.207 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.207 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.207 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.207 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute.cpu_shared_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.207 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.207 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.207 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.207 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.208 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.208 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.208 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.208 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.208 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.208 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.208 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.209 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.209 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.209 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.209 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.209 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.209 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.209 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.210 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.210 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.210 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.210 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.210 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.210 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.210 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.210 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.211 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.211 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost 
nova_compute[186205]: 2025-12-06 09:38:55.211 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.211 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.211 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.211 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.211 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.212 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.212 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.212 186209 DEBUG 
oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.212 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.212 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.212 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.212 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.212 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.213 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.213 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.213 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.213 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.213 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.213 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.213 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.214 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.214 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.214 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.214 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.214 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.214 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.214 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.215 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.215 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.connection_parameters = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.215 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.215 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.215 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.215 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.215 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.216 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.216 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.max_overflow = 50 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.216 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.216 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.216 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.216 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.216 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.216 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.217 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.retry_interval = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.217 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.217 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.217 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.217 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.217 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.217 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.218 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.api_servers = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.218 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.218 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.218 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.218 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.218 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.218 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.219 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 
localhost nova_compute[186205]: 2025-12-06 09:38:55.219 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.219 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.219 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.219 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.219 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.220 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.220 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.220 186209 DEBUG 
oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.220 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.220 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.220 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.220 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.221 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.221 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.221 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.service_type = image 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.221 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.221 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.221 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.221 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.222 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.222 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.222 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] glance.version = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.222 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.222 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.222 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.222 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.223 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.223 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.223 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.instances_path_share = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.223 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.223 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.223 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.223 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.223 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.224 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.224 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.224 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.224 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.224 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.224 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.224 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.224 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.225 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.225 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.225 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.225 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.225 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.225 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.226 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.226 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.226 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.226 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.226 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.226 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.226 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.226 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.227 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.connect_retries = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.227 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.227 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.227 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.227 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.227 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.227 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.228 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost 
nova_compute[186205]: 2025-12-06 09:38:55.228 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.228 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.228 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.228 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.228 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.228 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.228 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.229 186209 DEBUG 
oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.229 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.229 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.229 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.229 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.230 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.230 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.230 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.230 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.230 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.230 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.230 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.231 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.231 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.231 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.231 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.231 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.231 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.231 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.232 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.232 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.232 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.verify_ssl = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.232 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.232 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.232 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.232 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.232 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.233 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.233 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican_service_user.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.233 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.233 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.233 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.233 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.233 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.234 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.234 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.234 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.234 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.234 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.234 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.234 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.235 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.235 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.235 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.235 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.235 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.235 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.235 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.236 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.236 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.236 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.collect_timing = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.236 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.236 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.236 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.236 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.236 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.237 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.237 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.min_version = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.237 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.237 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.237 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.237 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.238 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.238 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.238 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.238 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.238 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.238 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.238 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.238 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.239 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.239 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.239 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.239 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.239 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.239 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.239 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.240 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.240 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.disk_prefix = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.240 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.240 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.240 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.240 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.240 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.240 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.images_rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.241 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.241 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.241 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.images_rbd_glance_store_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.241 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.images_rbd_pool = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.241 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.images_type = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.241 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.241 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.242 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.inject_partition = -2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.242 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.242 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.242 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.242 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.242 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.242 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.243 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.243 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.243 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.243 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.243 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.243 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.243 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.243 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.244 186209 WARNING oslo_config.cfg [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Dec 6 04:38:55 localhost nova_compute[186205]: live_migration_uri is deprecated for removal in favor of two other options that Dec 6 04:38:55 localhost nova_compute[186205]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Dec 6 04:38:55 localhost nova_compute[186205]: and ``live_migration_inbound_addr`` respectively. Dec 6 04:38:55 localhost nova_compute[186205]: ). Its value may be silently ignored in the future.#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.244 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.live_migration_uri = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.244 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.244 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.244 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost 
nova_compute[186205]: 2025-12-06 09:38:55.244 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.245 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.245 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.245 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.245 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.245 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.245 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 
2025-12-06 09:38:55.246 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.246 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.246 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.246 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.246 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.246 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.246 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 
09:38:55.247 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.rbd_secret_uuid = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.247 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.rbd_user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.247 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.247 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.247 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.247 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.247 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.248 186209 DEBUG oslo_service.service 
[None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.248 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.248 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.248 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.248 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.248 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.249 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.249 186209 DEBUG oslo_service.service 
[None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.249 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.249 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.249 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.249 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.249 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.250 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.250 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.250 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.250 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.250 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.250 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.250 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.250 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.251 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.251 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.251 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.251 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.251 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.251 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.251 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.252 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.auth_type 
= password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.252 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.252 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.252 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.252 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.252 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.252 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.252 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.endpoint_override = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.253 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.253 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.253 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.253 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.253 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.253 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.254 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 
04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.254 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.254 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.254 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.254 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.254 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.254 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.255 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.255 186209 
DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.255 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.255 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.255 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.255 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.255 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.255 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.256 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.256 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.256 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.256 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.256 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.256 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.256 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.257 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.257 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.auth_url = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.257 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.257 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.257 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.257 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.258 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.258 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.258 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.258 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.258 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.258 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.258 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.259 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.259 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.259 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.259 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.259 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.259 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.259 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.259 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.260 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.region_name = regionOne log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.260 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.260 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.260 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.260 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.260 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.260 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.261 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.261 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.261 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.261 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.261 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.261 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.261 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.262 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 
04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.262 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.262 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.262 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.262 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.262 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.262 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.262 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 
09:38:55.263 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.263 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.263 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.263 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.263 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.263 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.263 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.264 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.264 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.264 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.264 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.264 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.265 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.265 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.265 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.265 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.265 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.265 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.265 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.266 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.266 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.266 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.266 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.266 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.266 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.266 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.267 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.267 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.267 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.267 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.267 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.267 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.267 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.267 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.268 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.268 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.268 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.268 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.268 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.268 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.268 186209 DEBUG 
oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.269 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.269 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.269 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.269 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.269 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.269 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 
09:38:55.270 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.270 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.270 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.270 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.270 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.270 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.270 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 
09:38:55.271 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.271 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.271 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.271 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.271 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.271 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.271 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.271 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.272 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.272 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.272 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.272 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.272 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.272 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.273 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - 
- - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.273 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.273 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.273 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.273 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.273 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.273 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.274 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] upgrade_levels.baseapi = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.274 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.274 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.274 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.274 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.274 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.275 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.275 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.275 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.275 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.275 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.275 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.275 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.276 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.276 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.api_retry_count = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.276 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.276 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.276 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.276 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.276 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.276 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.277 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 
localhost nova_compute[186205]: 2025-12-06 09:38:55.277 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.277 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.277 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.277 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.277 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.277 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.278 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.278 186209 DEBUG 
oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.278 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.278 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.278 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.278 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.278 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.278 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.279 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.279 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.279 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.279 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.279 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.279 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vnc.novncproxy_base_url = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.280 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.280 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.280 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.280 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.280 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.280 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.280 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.281 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.281 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.281 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.281 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.281 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.281 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.281 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.282 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 
09:38:55.282 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.282 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.282 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.282 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.282 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.282 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.282 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.283 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.283 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.283 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.283 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.283 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.283 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.283 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.284 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.284 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.284 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.284 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.284 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.284 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.284 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.284 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.285 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.285 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.285 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.285 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.285 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.285 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - 
- - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.286 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.286 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.286 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.286 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.286 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.286 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.286 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc 
- - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.287 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.287 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.287 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.287 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.287 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.287 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.287 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.287 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.288 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.288 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.288 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.288 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.288 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.288 186209 
DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.288 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.289 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.289 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.289 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.289 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.289 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.289 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.289 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.289 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.290 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.290 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.290 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.290 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - 
- - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.290 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.290 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.290 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.291 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.291 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.291 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.291 186209 DEBUG 
oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.291 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.291 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.291 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.291 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.292 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.292 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.292 186209 
DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.292 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.292 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.292 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.292 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.auth_url = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.293 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.293 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.293 
186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.293 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.293 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.293 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.293 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.293 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.294 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.294 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.294 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.294 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.294 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.294 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.294 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.294 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.295 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] 
oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.295 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.295 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.295 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.295 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.295 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.295 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.296 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.split_loggers = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.296 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.296 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.296 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.296 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.296 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.296 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.296 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.user_domain_name = Default log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.297 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.297 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.297 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.297 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.297 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.297 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.297 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] oslo_reports.log_dir = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.297 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.298 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.298 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.298 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.298 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.298 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.298 186209 DEBUG oslo_service.service [None 
req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.298 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.299 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.299 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.299 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.299 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.299 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.299 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.299 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.300 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.300 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.300 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.300 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.300 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.301 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.301 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.301 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.301 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.302 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.302 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.302 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.302 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.302 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.303 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.303 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.303 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.303 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.303 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.304 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.304 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.304 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.304 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.304 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.305 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.305 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.305 186209 DEBUG oslo_service.service [None req-a7882c41-d321-40ee-9126-5eb70c5dcfdc - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.306 186209 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.322 186209 INFO nova.virt.node [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Determined node identity db8b39ad-af52-43e3-99e2-f3c431f03241 from /var/lib/nova/compute_id
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.323 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.323 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.323 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.323 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.332 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Registering for lifecycle events
_get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.334 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.335 186209 INFO nova.virt.libvirt.driver [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Connection event '1' reason 'None'
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.349 186209 DEBUG nova.virt.libvirt.volume.mount [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.351 186209 INFO nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Libvirt host capabilities
Dec 6 04:38:55 localhost nova_compute[186205]: [host capabilities XML logged here; markup was stripped in extraction. Recoverable values: host UUID 3134f11d-a070-482e-9899-7eb324eccfc9, arch x86_64, CPU model EPYC-Rome-v4 (vendor AMD), migration transports tcp and rdma, 16116612 KiB of memory in 4029153 pages (further page-size counts 0), security models selinux (contexts system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (+107:+107), hvm guest support for wordsize 32 and 64 via emulator /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (alias pc) and pc-q35-rhel7.6.0 through pc-q35-rhel9.8.0 (alias q35)]
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.361 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.372 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 6 04:38:55 localhost nova_compute[186205]: [domain capabilities XML logged here; markup likewise stripped. Recoverable values: emulator /usr/libexec/qemu-kvm, domain type kvm, machine pc-i440fx-rhel7.6.0, arch i686, firmware loader /usr/share/OVMF/OVMF_CODE.secboot.fd (loader types rom and pflash, readonly yes/no, secure no), host-model CPU EPYC-Rome (vendor AMD), and a supported CPU model list beginning: 486, 486-v1, Broadwell (plus -IBRS, -noTSX, -noTSX-IBRS, -v1 through -v4), Cascadelake-Server (plus -noTSX, -v1 through -v5), Conroe, Conroe-v1, Cooperlake (plus -v1, -v2), Denverton (plus -v1 through -v3), Dhyana (plus -v1, -v2), EPYC, EPYC-Genoa (plus -v1), EPYC-IBPB, EPYC-Milan (plus -v1, -v2); the model list continues below]
Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome-v4 Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-v1 Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-v2 Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-v4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: GraniteRapids Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 
04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: GraniteRapids-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: GraniteRapids-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-noTSX Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: Haswell-noTSX-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 
04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-noTSX Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v5 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v6 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 
6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v7 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 
04:38:55 localhost nova_compute[186205]: IvyBridge Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: IvyBridge-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: IvyBridge-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: IvyBridge-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: KnightsMill Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: KnightsMill-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: 
Nehalem Dec 6 04:38:55 localhost nova_compute[186205]: Nehalem-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Nehalem-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Nehalem-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G1 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G1-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G2 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G2-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G3 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G3-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G4-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G5 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G5-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Penryn Dec 6 04:38:55 localhost nova_compute[186205]: Penryn-v1 Dec 6 04:38:55 localhost nova_compute[186205]: SandyBridge Dec 6 04:38:55 localhost nova_compute[186205]: SandyBridge-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: SandyBridge-v1 Dec 6 04:38:55 localhost nova_compute[186205]: SandyBridge-v2 Dec 6 04:38:55 
[libvirt domainCapabilities XML dump; the XML markup was stripped by the log capture, leaving only element values. Recoverable values, condensed:]
Dec 6 04:38:55 localhost nova_compute[186205]: CPU models (cont.): SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Dec 6 04:38:55 localhost nova_compute[186205]: memory backing source types: file anonymous memfd
Dec 6 04:38:55 localhost nova_compute[186205]: disk devices: disk cdrom floppy lun; buses: ide fdc scsi virtio usb sata; models: virtio virtio-transitional virtio-non-transitional
Dec 6 04:38:55 localhost nova_compute[186205]: graphics types: vnc egl-headless dbus
Dec 6 04:38:55 localhost nova_compute[186205]: hostdev: mode subsystem; startupPolicy default mandatory requisite optional; subsystem types usb pci scsi
Dec 6 04:38:55 localhost nova_compute[186205]: rng: models virtio virtio-transitional virtio-non-transitional; backends random egd builtin
Dec 6 04:38:55 localhost nova_compute[186205]: filesystem driver types: path handle virtiofs
Dec 6 04:38:55 localhost nova_compute[186205]: tpm: models tpm-tis tpm-crb; backends emulator external; backend version 2.0
Dec 6 04:38:55 localhost nova_compute[186205]: redirdev bus: usb; channel types: pty unix; crypto backends: qemu builtin; interface backends: default passt
Dec 6 04:38:55 localhost nova_compute[186205]: panic models: isa hyperv; char device types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Dec 6 04:38:55 localhost nova_compute[186205]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input; unlabeled values: 4095 on off off "Linux KVM Hv"
Dec 6 04:38:55 localhost nova_compute[186205]: launchSecurity: tdx
Dec 6 04:38:55 localhost nova_compute[186205]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.382 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 6 04:38:55 localhost nova_compute[186205]: [second domainCapabilities dump, markup likewise stripped:] path /usr/libexec/qemu-kvm; domain kvm; machine pc-q35-rhel9.8.0; arch i686
Dec 6 04:38:55 localhost nova_compute[186205]: firmware loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; types rom pflash; readonly yes no; secure no; unlabeled values: on off, on off
Dec 6 04:38:55 localhost nova_compute[186205]: host CPU model: EPYC-Rome, vendor AMD
Dec 6 04:38:55 localhost nova_compute[186205]: CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Dec 6
04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Cascadelake-Server-noTSX Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Cascadelake-Server-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Cascadelake-Server-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 
04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Cascadelake-Server-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Cascadelake-Server-v4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Cascadelake-Server-v5 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Conroe Dec 6 04:38:55 localhost nova_compute[186205]: Conroe-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Cooperlake Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Cooperlake-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Cooperlake-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Denverton Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Denverton-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Denverton-v2 
Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Denverton-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dhyana Dec 6 04:38:55 localhost nova_compute[186205]: Dhyana-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dhyana-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Genoa Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Genoa-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-IBPB Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Milan Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 
04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Milan-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Milan-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: 
Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome-v4 Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-v1 Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-v2 Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-v4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: GraniteRapids Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 
04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: GraniteRapids-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: GraniteRapids-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-noTSX Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-noTSX-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-noTSX Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: [libvirt domain-capabilities output; the XML markup and repeated syslog prefixes were mangled in the log pipeline — surviving enum values, grouped as in the original records:]
Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Dec 6 04:38:55 localhost nova_compute[186205]: file anonymous memfd
Dec 6 04:38:55 localhost nova_compute[186205]: disk cdrom floppy lun
Dec 6 04:38:55 localhost nova_compute[186205]: fdc scsi virtio usb sata
Dec 6 04:38:55 localhost nova_compute[186205]: virtio virtio-transitional virtio-non-transitional
Dec 6 04:38:55 localhost nova_compute[186205]: vnc egl-headless dbus
Dec 6 04:38:55 localhost nova_compute[186205]: subsystem
Dec 6 04:38:55 localhost nova_compute[186205]: default mandatory requisite optional
Dec 6 04:38:55 localhost nova_compute[186205]: usb pci scsi
Dec 6 04:38:55 localhost nova_compute[186205]: virtio virtio-transitional virtio-non-transitional
Dec 6 04:38:55 localhost nova_compute[186205]: random egd builtin
Dec 6 04:38:55 localhost nova_compute[186205]: path handle virtiofs
Dec 6 04:38:55 localhost nova_compute[186205]: tpm-tis tpm-crb
Dec 6 04:38:55 localhost nova_compute[186205]: emulator external
Dec 6 04:38:55 localhost nova_compute[186205]: 2.0
Dec 6 04:38:55 localhost nova_compute[186205]: usb
Dec 6 04:38:55 localhost nova_compute[186205]: pty unix
Dec 6 04:38:55 localhost nova_compute[186205]: qemu
Dec 6 04:38:55 localhost nova_compute[186205]: builtin
Dec 6 04:38:55 localhost nova_compute[186205]: default passt
Dec 6 04:38:55 localhost nova_compute[186205]: isa hyperv
Dec 6 04:38:55 localhost nova_compute[186205]: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: relaxed Dec 6 04:38:55 localhost nova_compute[186205]: vapic Dec 6 04:38:55 localhost nova_compute[186205]: spinlocks Dec 6 04:38:55 localhost nova_compute[186205]: vpindex Dec 6 04:38:55 localhost nova_compute[186205]: runtime Dec 6 04:38:55 localhost nova_compute[186205]: synic Dec 6 04:38:55 localhost nova_compute[186205]: stimer Dec 6 04:38:55 localhost nova_compute[186205]: reset Dec 6 04:38:55 localhost nova_compute[186205]: vendor_id Dec 6 04:38:55 localhost nova_compute[186205]: frequencies Dec 6 04:38:55 localhost nova_compute[186205]: reenlightenment Dec 6 04:38:55 localhost nova_compute[186205]: tlbflush Dec 6 04:38:55 localhost nova_compute[186205]: ipi Dec 6 04:38:55 localhost nova_compute[186205]: avic Dec 6 04:38:55 localhost nova_compute[186205]: emsr_bitmap Dec 6 04:38:55 localhost nova_compute[186205]: xmm_input Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: 4095 Dec 6 04:38:55 localhost nova_compute[186205]: on Dec 6 04:38:55 localhost nova_compute[186205]: off Dec 6 04:38:55 localhost nova_compute[186205]: off Dec 6 04:38:55 localhost nova_compute[186205]: Linux KVM Hv Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: tdx Dec 6 
Dec 6 04:38:55 localhost nova_compute[186205]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.417 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.421 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 6 04:38:55 localhost nova_compute[186205]: path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-i440fx-rhel7.6.0; arch: x86_64
Dec 6 04:38:55 localhost nova_compute[186205]: os loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; type: rom pflash; readonly: yes no; secure: no
Dec 6 04:38:55 localhost nova_compute[186205]: enum values: on off; on off
Dec 6 04:38:55 localhost nova_compute[186205]: host CPU model: EPYC-Rome; vendor: AMD
Dec 6 04:38:55 localhost nova_compute[186205]: CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3 Cascadelake-Server-v4 Cascadelake-Server-v5 Conroe Conroe-v1 Cooperlake Cooperlake-v1 Cooperlake-v2 Denverton Denverton-v1 Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4 GraniteRapids GraniteRapids-v1 GraniteRapids-v2 Haswell Haswell-IBRS Haswell-noTSX [list continues]
04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-noTSX-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: 
Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-noTSX Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Icelake-Server-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v5 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v6 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 
6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v7 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 
04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: IvyBridge Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: IvyBridge-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: IvyBridge-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: IvyBridge-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: KnightsMill Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: KnightsMill-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: 
Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Nehalem Dec 6 04:38:55 localhost nova_compute[186205]: Nehalem-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Nehalem-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Nehalem-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G1 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G1-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G2 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G2-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G3 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G3-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G4-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G5 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G5-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Penryn Dec 6 04:38:55 localhost nova_compute[186205]: Penryn-v1 Dec 6 04:38:55 localhost nova_compute[186205]: SandyBridge Dec 6 04:38:55 localhost nova_compute[186205]: SandyBridge-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: 
SandyBridge-v1 Dec 6 04:38:55 localhost nova_compute[186205]: SandyBridge-v2 Dec 6 04:38:55 localhost nova_compute[186205]: SapphireRapids Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: SapphireRapids-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: SapphireRapids-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: SapphireRapids-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: SierraForest Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: SierraForest-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Skylake-Client Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Skylake-Client-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Skylake-Client-noTSX-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Skylake-Client-v1 Dec 6 
04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Skylake-Client-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Skylake-Client-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Skylake-Client-v4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Skylake-Server Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 
04:38:55 localhost nova_compute[186205]: [libvirt domainCapabilities XML dump; tags lost to log line-wrapping, recoverable values follow]
Dec 6 04:38:55 localhost nova_compute[186205]: CPU models (continued): Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Dec 6 04:38:55 localhost nova_compute[186205]: memory backing source types: file, anonymous, memfd; disk device types: disk, cdrom, floppy, lun; disk buses: ide, fdc, scsi, virtio, usb, sata; disk models: virtio, virtio-transitional, virtio-non-transitional
Dec 6 04:38:55 localhost nova_compute[186205]: graphics types: vnc, egl-headless, dbus; hostdev mode: subsystem; startupPolicy: default, mandatory, requisite, optional; subsys types: usb, pci, scsi; interface models: virtio, virtio-transitional, virtio-non-transitional
Dec 6 04:38:55 localhost nova_compute[186205]: rng backend models: random, egd, builtin; filesystem drivers: path, handle, virtiofs; tpm models: tpm-tis, tpm-crb; tpm backends: emulator (version 2.0), external; redirdev bus: usb; channel types: pty, unix; crypto backends: qemu, builtin; interface backends: default, passt; panic models: isa, hyperv
Dec 6 04:38:55 localhost nova_compute[186205]: console/serial types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Dec 6 04:38:55 localhost nova_compute[186205]: hyperv features: relaxed, vapic, spinlocks (retries 4095, on/off), vpindex, runtime, synic, stimer, reset, vendor_id (Linux KVM Hv), frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; launchSecurity: tdx
Dec 6 04:38:55 localhost nova_compute[186205]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.462 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 6 04:38:55 localhost nova_compute[186205]: [second domainCapabilities dump, tags likewise stripped] emulator: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: x86_64
Dec 6 04:38:55 localhost nova_compute[186205]: os firmware: efi; loader values: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; loader types: rom, pflash; readonly: yes, no; secure: yes, no; enum values: on, off
Dec 6 04:38:55 localhost nova_compute[186205]: host CPU model: EPYC-Rome, vendor: AMD
Dec 6 04:38:55 localhost nova_compute[186205]: CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Rome, EPYC-Rome-v1 Dec 6 04:38:55 localhost
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-Rome-v4 Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-v1 Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-v2 Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: EPYC-v4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: GraniteRapids Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 
04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: GraniteRapids-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: GraniteRapids-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-noTSX Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-noTSX-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Haswell-v4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: 
Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-noTSX Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v3 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v4 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v5 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 
6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v6 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 
04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Icelake-Server-v7 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: IvyBridge Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: IvyBridge-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: IvyBridge-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 
localhost nova_compute[186205]: IvyBridge-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: KnightsMill Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: KnightsMill-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Nehalem Dec 6 04:38:55 localhost nova_compute[186205]: Nehalem-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: Nehalem-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Nehalem-v2 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G1 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G1-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G2 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G2-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G3 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G3-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G4 Dec 6 
04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G4-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G5 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Opteron_G5-v1 Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Penryn Dec 6 04:38:55 localhost nova_compute[186205]: Penryn-v1 Dec 6 04:38:55 localhost nova_compute[186205]: SandyBridge Dec 6 04:38:55 localhost nova_compute[186205]: SandyBridge-IBRS Dec 6 04:38:55 localhost nova_compute[186205]: SandyBridge-v1 Dec 6 04:38:55 localhost nova_compute[186205]: SandyBridge-v2 Dec 6 04:38:55 localhost nova_compute[186205]: SapphireRapids Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost nova_compute[186205]: Dec 6 04:38:55 localhost 
Dec 6 04:38:55 localhost nova_compute[186205]: [libvirt domain capabilities output; XML markup lost in log capture. Recoverable values — CPU models: SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1; memory backing: file, anonymous, memfd; disk devices: disk, cdrom, floppy, lun; disk buses: fdc, scsi, virtio, usb, sata; virtio models: virtio, virtio-transitional, virtio-non-transitional; graphics: vnc, egl-headless, dbus; hostdev mode: subsystem; startup policies: default, mandatory, requisite, optional; hostdev subsystem types: usb, pci, scsi; rng backends: random, egd, builtin; filesystem drivers: path, handle, virtiofs; TPM models: tpm-tis, tpm-crb; TPM backends: emulator, external; TPM version: 2.0; redirdev bus: usb; char types: pty, unix; backend tokens: qemu, builtin; interface backends: default, passt; panic models: isa, hyperv; serial/console char types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus; Hyper-V enlightenments: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; other recoverable tokens: 4095, on, off, off, Linux KVM Hv; launch security: tdx] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.506 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.507 186209 INFO nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Secure Boot support detected
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.508 186209 INFO nova.virt.libvirt.driver [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.518 186209 DEBUG nova.virt.libvirt.driver [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.574 186209 INFO nova.virt.node [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Determined node identity db8b39ad-af52-43e3-99e2-f3c431f03241 from /var/lib/nova/compute_id
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.592 186209 DEBUG nova.compute.manager [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Verified node db8b39ad-af52-43e3-99e2-f3c431f03241 matches my host np0005548798.ooo.test _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.637 186209 DEBUG nova.compute.manager [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.645 186209 DEBUG nova.virt.libvirt.vif [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T08:38:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005548798.ooo.test',hostname='test',id=2,image_ref='c6562616-bf77-48e6-bb05-431e64af083a',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-06T08:38:42Z,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005548798.ooo.test',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='47835b89168945138751a4b216280589',ramdisk_id='',reservation_id='r-h8mij0z5',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-12-06T08:38:43Z,user_data=None,user_id='5220ceda9e4145d395f52fc9fd0365c0',uuid=a5070ada-6b60-4992-a1bf-9e83aaccac93,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": 
"47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.645 186209 DEBUG nova.network.os_vif_util [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Converting VIF {"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.645 186209 DEBUG nova.network.os_vif_util [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] 
Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.646 186209 DEBUG os_vif [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.671 186209 DEBUG ovsdbapp.backend.ovs_idl [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.671 186209 DEBUG ovsdbapp.backend.ovs_idl [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.671 186209 DEBUG ovsdbapp.backend.ovs_idl [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.672 186209 DEBUG ovsdbapp.backend.ovs_idl.vlog [None 
req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.672 186209 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.672 186209 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.673 186209 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.674 186209 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.677 186209 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.688 186209 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.688 186209 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.688 186209 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 04:38:55 localhost nova_compute[186205]: 2025-12-06 09:38:55.689 186209 INFO oslo.privsep.daemon [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp29qs22hk/privsep.sock']#033[00m Dec 6 04:38:55 localhost python3.9[186465]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:38:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7291 DF PROTO=TCP SPT=52832 DPT=9102 SEQ=1463454176 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB597870000000001030307) Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.291 186209 INFO oslo.privsep.daemon [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.179 186488 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.184 186488 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 
09:38:56.188 186488 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.188 186488 INFO oslo.privsep.daemon [-] privsep daemon running as pid 186488#033[00m Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.550 186209 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.550 186209 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap227fe5b2-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.551 186209 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap227fe5b2-a5, col_values=(('external_ids', {'iface-id': '227fe5b2-a5a7-4043-b641-32b6e7c7a7c1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:02:64', 'vm-uuid': 'a5070ada-6b60-4992-a1bf-9e83aaccac93'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.552 186209 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.553 186209 INFO os_vif [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Successfully plugged vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5')#033[00m Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.553 186209 DEBUG nova.compute.manager [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.559 186209 DEBUG nova.compute.manager [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.560 186209 INFO nova.compute.manager [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Dec 6 04:38:56 localhost python3.9[186579]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:38:56 localhost nova_compute[186205]: 2025-12-06 09:38:56.799 186209 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.058 186209 INFO nova.service [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Updating service version for nova-compute on np0005548798.ooo.test from 57 to 66#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 
09:38:57.090 186209 DEBUG oslo_concurrency.lockutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.090 186209 DEBUG oslo_concurrency.lockutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.091 186209 DEBUG oslo_concurrency.lockutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.091 186209 DEBUG nova.compute.resource_tracker [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.156 186209 DEBUG oslo_concurrency.processutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.225 
186209 DEBUG oslo_concurrency.processutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.226 186209 DEBUG oslo_concurrency.processutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.276 186209 DEBUG oslo_concurrency.processutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.278 186209 DEBUG oslo_concurrency.processutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.348 186209 DEBUG oslo_concurrency.processutils 
[None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.349 186209 DEBUG oslo_concurrency.processutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.424 186209 DEBUG oslo_concurrency.processutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:38:57 localhost systemd[1]: Started libvirt nodedev daemon. Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.757 186209 WARNING nova.virt.libvirt.driver [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.759 186209 DEBUG nova.compute.resource_tracker [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=13290MB free_disk=387.46332931518555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.759 186209 DEBUG oslo_concurrency.lockutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.759 186209 DEBUG oslo_concurrency.lockutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.883 186209 DEBUG nova.compute.resource_tracker [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.883 186209 DEBUG nova.compute.resource_tracker [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.884 186209 DEBUG nova.compute.resource_tracker [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.945 186209 DEBUG nova.scheduler.client.report [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Refreshing inventories for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.965 186209 DEBUG nova.scheduler.client.report [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Updating ProviderTree inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 0, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 6 04:38:57 
localhost nova_compute[186205]: 2025-12-06 09:38:57.965 186209 DEBUG nova.compute.provider_tree [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 0, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 04:38:57 localhost nova_compute[186205]: 2025-12-06 09:38:57.982 186209 DEBUG nova.scheduler.client.report [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Refreshing aggregate associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.005 186209 DEBUG nova.scheduler.client.report [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Refreshing trait associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, traits: 
HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI,HW_CPU_X86_ABM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI2,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE4A,COMPUTE_DEVICE_TAGGING,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_ACCELERATORS,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.048 186209 DEBUG nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Dec 6 04:38:58 localhost nova_compute[186205]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.048 186209 INFO nova.virt.libvirt.host [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] kernel doesn't support AMD SEV#033[00m Dec 6 04:38:58 localhost 
nova_compute[186205]: 2025-12-06 09:38:58.050 186209 DEBUG nova.compute.provider_tree [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 399, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.050 186209 DEBUG nova.virt.libvirt.driver [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.099 186209 DEBUG nova.scheduler.client.report [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Updated inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 399, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.099 186209 DEBUG nova.compute.provider_tree [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Updating resource provider 
db8b39ad-af52-43e3-99e2-f3c431f03241 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.100 186209 DEBUG nova.compute.provider_tree [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.199 186209 DEBUG nova.compute.provider_tree [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Updating resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.714 186209 DEBUG nova.compute.resource_tracker [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.715 186209 DEBUG oslo_concurrency.lockutils [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.955s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.715 186209 DEBUG nova.service [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.834 186209 DEBUG nova.service [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Dec 6 04:38:58 localhost nova_compute[186205]: 2025-12-06 09:38:58.835 186209 DEBUG nova.servicegroup.drivers.db [None req-45eceb3c-cf24-4882-927c-5b6888c7b4fe - - - - - -] DB_Driver: join new ServiceGroup member np0005548798.ooo.test to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Dec 6 04:39:00 localhost python3.9[186953]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None 
gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Dec 6 04:39:00 localhost systemd-journald[38691]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 115.9 (386 of 333 items), suggesting rotation. 
Dec 6 04:39:00 localhost systemd-journald[38691]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 6 04:39:00 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 04:39:00 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 04:39:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44601 DF PROTO=TCP SPT=53438 DPT=9101 SEQ=2163712551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB5A8070000000001030307) Dec 6 04:39:00 localhost nova_compute[186205]: 2025-12-06 09:39:00.693 186209 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44602 DF PROTO=TCP SPT=53438 DPT=9101 SEQ=2163712551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB5AC070000000001030307) Dec 6 04:39:01 localhost nova_compute[186205]: 2025-12-06 09:39:01.851 186209 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:02 localhost python3.9[187110]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 6 04:39:02 localhost systemd[1]: Stopping nova_compute container... 
Dec 6 04:39:02 localhost systemd[1]: libpod-5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486.scope: Deactivated successfully. Dec 6 04:39:02 localhost journal[161777]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, ) Dec 6 04:39:02 localhost journal[161777]: hostname: np0005548798.ooo.test Dec 6 04:39:02 localhost journal[161777]: End of file while reading data: Input/output error Dec 6 04:39:02 localhost systemd[1]: libpod-5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486.scope: Consumed 3.850s CPU time. Dec 6 04:39:02 localhost podman[187114]: 2025-12-06 09:39:02.5854547 +0000 UTC m=+0.091451348 container died 5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:39:02 localhost systemd[1]: tmp-crun.xqDnit.mount: Deactivated successfully. Dec 6 04:39:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486-userdata-shm.mount: Deactivated successfully. Dec 6 04:39:02 localhost podman[187114]: 2025-12-06 09:39:02.664030544 +0000 UTC m=+0.170027192 container cleanup 5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:39:02 localhost podman[187114]: nova_compute Dec 6 04:39:02 localhost podman[187156]: error opening file `/run/crun/5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486/status`: No such file or directory Dec 6 04:39:02 localhost podman[187142]: 2025-12-06 09:39:02.757809273 +0000 UTC m=+0.060547172 container cleanup 5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:39:02 localhost podman[187142]: 
nova_compute Dec 6 04:39:02 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Dec 6 04:39:02 localhost systemd[1]: Stopped nova_compute container. Dec 6 04:39:02 localhost systemd[1]: Starting nova_compute container... Dec 6 04:39:02 localhost systemd[1]: Started libcrun container. Dec 6 04:39:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Dec 6 04:39:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Dec 6 04:39:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 6 04:39:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 04:39:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 6 04:39:02 localhost podman[187158]: 2025-12-06 09:39:02.906802128 +0000 UTC m=+0.108041286 container init 5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes 
Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.schema-version=1.0, container_name=nova_compute) Dec 6 04:39:02 localhost nova_compute[187174]: + sudo -E kolla_set_configs Dec 6 04:39:02 localhost podman[187158]: 2025-12-06 09:39:02.921036183 +0000 UTC m=+0.122275281 container start 5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', 
'/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 04:39:02 localhost podman[187158]: nova_compute Dec 6 04:39:02 localhost systemd[1]: Started nova_compute container. Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Validating config file Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Copying service configuration files Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Deleting /etc/nova/nova.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Setting permission for /etc/nova/nova.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Dec 6 04:39:03 localhost 
nova_compute[187174]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Deleting /etc/ceph Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Creating directory /etc/ceph Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Setting permission for /etc/ceph Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Setting permission for 
/var/lib/nova/.ssh/ssh-privatekey Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Deleting /usr/sbin/iscsiadm Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Writing out command to execute Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Dec 6 04:39:03 localhost nova_compute[187174]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Dec 6 04:39:03 localhost nova_compute[187174]: ++ cat /run_command Dec 6 04:39:03 localhost nova_compute[187174]: + CMD=nova-compute Dec 6 04:39:03 localhost nova_compute[187174]: + ARGS= Dec 6 04:39:03 localhost nova_compute[187174]: + sudo kolla_copy_cacerts Dec 6 04:39:03 localhost nova_compute[187174]: + [[ ! -n '' ]] Dec 6 04:39:03 localhost nova_compute[187174]: + . 
kolla_extend_start Dec 6 04:39:03 localhost nova_compute[187174]: Running command: 'nova-compute' Dec 6 04:39:03 localhost nova_compute[187174]: + echo 'Running command: '\''nova-compute'\''' Dec 6 04:39:03 localhost nova_compute[187174]: + umask 0022 Dec 6 04:39:03 localhost nova_compute[187174]: + exec nova-compute Dec 6 04:39:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51782 DF PROTO=TCP SPT=54354 DPT=9101 SEQ=801853975 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB5B7880000000001030307) Dec 6 04:39:04 localhost nova_compute[187174]: 2025-12-06 09:39:04.728 187178 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 6 04:39:04 localhost nova_compute[187174]: 2025-12-06 09:39:04.729 187178 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 6 04:39:04 localhost nova_compute[187174]: 2025-12-06 09:39:04.729 187178 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 6 04:39:04 localhost nova_compute[187174]: 2025-12-06 09:39:04.729 187178 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Dec 6 04:39:04 localhost nova_compute[187174]: 2025-12-06 09:39:04.845 187178 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:39:04 localhost nova_compute[187174]: 2025-12-06 09:39:04.866 187178 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 
04:39:04 localhost nova_compute[187174]: 2025-12-06 09:39:04.866 187178 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.296 187178 INFO nova.virt.driver [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.420 187178 INFO nova.compute.provider_config [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.428 187178 WARNING nova.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.429 187178 DEBUG oslo_concurrency.lockutils [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.429 187178 DEBUG oslo_concurrency.lockutils [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.429 187178 DEBUG oslo_concurrency.lockutils [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.429 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.429 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.430 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.430 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.430 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.430 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Dec 6 04:39:05 localhost 
nova_compute[187174]: 2025-12-06 09:39:05.430 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.430 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.430 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.431 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.431 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.431 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.431 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.431 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.431 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.431 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.432 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.432 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.432 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.432 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] console_host = np0005548798.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.432 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.432 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.432 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.433 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.433 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.433 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.433 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.433 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.433 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.434 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.434 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.434 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.434 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.434 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.434 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.434 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.435 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.435 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.435 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] host = np0005548798.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.435 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.435 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.436 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.436 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.436 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.436 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.436 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.436 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.436 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.437 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.437 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.437 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.437 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.437 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.437 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.438 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.438 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.438 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.438 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.438 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.438 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.438 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.439 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.439 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.439 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.439 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.439 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.439 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.439 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.440 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.440 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.440 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.440 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.440 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.440 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.440 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.441 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.441 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.441 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.441 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.441 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.441 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.441 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.442 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.442 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.442 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.442 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.442 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.442 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.442 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.443 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.443 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.443 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.443 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.443 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.443 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.443 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.444 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.444 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.444 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.444 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.444 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.444 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.444 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.445 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.445 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.445 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.445 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.445 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.445 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.445 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.445 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.446 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.446 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.446 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.446 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.446 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.446 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.447 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.447 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.447 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.447 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.447 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.447 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.447 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.448 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.448 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.448 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.448 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.448 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.448 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.448 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.449 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.449 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.449 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.449 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.449 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.449 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.449 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.450 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.450 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.450 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.450 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.450 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.450 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.450 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.450 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.451 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.451 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.451 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.451 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.451 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.451 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.452 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.452 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.452 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.452 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.452 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.452 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.452 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.453 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.453 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.453 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.453 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.453 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.453 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.453 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.454 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.454 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.454 187178
DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.454 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.454 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.454 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.454 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.455 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.455 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.455 187178 DEBUG 
oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.455 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.455 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.455 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.455 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.456 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.456 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.456 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.456 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.456 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.456 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.456 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.457 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.457 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.457 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] 
cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.457 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.457 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.457 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.457 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.458 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.458 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.458 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.memcache_servers 
= ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.458 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.458 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.458 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.458 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.458 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.459 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.459 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.459 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.459 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.459 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.459 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.459 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.460 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.460 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 
localhost nova_compute[187174]: 2025-12-06 09:39:05.460 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.460 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.460 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.460 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.461 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.461 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.461 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.461 187178 DEBUG 
oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.461 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.461 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.461 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.462 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.462 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.462 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.462 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] 
compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.462 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.462 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.462 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.462 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.463 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.463 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.463 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] 
compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.463 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.463 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.463 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.463 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.464 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.464 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.464 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.464 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.464 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.464 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.464 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.465 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.465 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.465 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.connect_retry_delay = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.465 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.465 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.465 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.465 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.466 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.466 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.466 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 
localhost nova_compute[187174]: 2025-12-06 09:39:05.466 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.466 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.466 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.466 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.467 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.467 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.467 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.467 
187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.467 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.467 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.467 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.468 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.468 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.468 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.468 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.468 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.468 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.468 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.469 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.469 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.469 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.469 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] 
database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.469 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.469 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.469 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.470 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.470 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.470 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.470 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.connection = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.470 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.470 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.470 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.470 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.471 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.471 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.471 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.471 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.471 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.471 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.471 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.472 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.472 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.472 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.472 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.472 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.472 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.472 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.473 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.473 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.473 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.473 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.473 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.473 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.473 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.474 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.474 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.474 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 
04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.474 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.474 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.474 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.474 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.475 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.475 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.475 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 
09:39:05.475 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.475 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.475 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.475 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.476 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.476 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.476 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.476 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] 
glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.476 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.476 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.476 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.477 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.477 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.477 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.477 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.477 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.477 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.478 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.478 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.478 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.478 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.478 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.478 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.478 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.478 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.479 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.479 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.479 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.479 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.479 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.479 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.479 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.480 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.480 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.480 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.480 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.480 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.480 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.481 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.481 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.481 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.481 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.481 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] 
image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.481 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.482 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.482 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.482 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.482 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.482 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.482 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.482 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.482 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.483 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.483 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.483 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.483 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.483 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 
localhost nova_compute[187174]: 2025-12-06 09:39:05.483 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.483 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.484 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.484 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.484 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.484 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.484 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.484 187178 DEBUG 
oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.484 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.484 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.485 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.485 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.485 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.485 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.485 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.485 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.485 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.486 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.486 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.486 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.486 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.486 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.486 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.486 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.487 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.487 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.487 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.487 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.487 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.split_loggers = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.487 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.487 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.488 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.488 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.488 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.488 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.488 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican_service_user.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.488 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.488 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.488 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.489 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.489 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.489 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.489 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.approle_secret_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.489 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.489 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.489 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.490 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.490 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.490 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.490 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 
2025-12-06 09:39:05.490 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.490 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.490 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.491 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.491 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.491 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.491 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.491 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.491 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.491 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.492 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.492 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.492 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.492 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.492 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.keyfile = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.492 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.492 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.492 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.493 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.493 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.493 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.493 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.status_code_retries = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.493 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.493 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.494 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.494 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.494 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.494 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.494 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.494 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.494 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.495 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.495 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.495 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.495 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.495 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.495 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.495 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.496 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.496 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.496 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.496 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.496 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 
04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.496 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.images_rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.496 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.497 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.497 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.images_rbd_glance_store_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.497 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.images_rbd_pool = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.497 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.images_type = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.497 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost 
nova_compute[187174]: 2025-12-06 09:39:05.497 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.497 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.498 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.498 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.498 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.498 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.498 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.498 
187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.498 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.499 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.500 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.500 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.500 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.500 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 
2025-12-06 09:39:05.500 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.500 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.501 187178 WARNING oslo_config.cfg [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Dec 6 04:39:05 localhost nova_compute[187174]: live_migration_uri is deprecated for removal in favor of two other options that Dec 6 04:39:05 localhost nova_compute[187174]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Dec 6 04:39:05 localhost nova_compute[187174]: and ``live_migration_inbound_addr`` respectively. Dec 6 04:39:05 localhost nova_compute[187174]: ). 
Its value may be silently ignored in the future.#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.501 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_uri = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.501 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.501 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.501 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.501 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.502 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.502 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.502 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.502 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.502 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.502 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.502 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.503 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.503 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.503 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.503 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.503 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.503 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.503 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.rbd_secret_uuid = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.504 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.rbd_user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.504 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.504 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.504 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.504 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.504 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.504 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.505 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.505 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.smbfs_mount_options = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.505 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.505 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.505 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.505 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.505 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.506 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.506 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.swtpm_group = tss log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.506 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.506 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.506 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.506 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.506 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.507 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.507 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 
localhost nova_compute[187174]: 2025-12-06 09:39:05.507 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.507 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.507 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.507 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.507 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.508 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.508 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 
04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.508 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.508 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.508 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.508 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.508 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.509 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.509 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 
2025-12-06 09:39:05.509 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.509 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.509 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.509 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.509 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.510 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.510 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.510 187178 DEBUG 
oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.510 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.510 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.510 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.510 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.510 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.511 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.511 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] 
neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.511 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.511 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.511 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.511 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.511 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.512 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.512 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.512 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.512 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.512 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.512 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.512 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.513 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.513 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] notifications.versioned_notifications_topics = 
['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.513 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.513 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.513 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.513 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.513 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.514 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.auth_url = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.514 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.514 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.514 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.514 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.514 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.514 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.515 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.515 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.515 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.515 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.515 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.515 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.515 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.515 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.516 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 
localhost nova_compute[187174]: 2025-12-06 09:39:05.516 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.516 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.516 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.516 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.516 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.516 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.517 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 
09:39:05.517 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.517 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.517 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.517 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.517 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.517 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.518 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.518 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.518 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.518 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.518 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.518 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.518 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.518 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.519 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.driver = 
nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.519 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.519 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.519 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.519 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.519 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.519 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.520 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.ram = 51200 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.520 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.520 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.520 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.520 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.520 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.521 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.521 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.521 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.521 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.521 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.521 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.522 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.522 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.522 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] 
scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.522 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.522 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.522 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.522 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.523 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.523 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.523 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.523 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.523 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.524 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.524 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.524 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.524 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.524 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.524 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.524 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.525 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] 
filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.525 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.525 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.525 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.525 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.525 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.525 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.525 187178 DEBUG 
oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.526 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.526 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.526 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.526 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.526 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.526 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 
09:39:05.527 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.527 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.527 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.527 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.527 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.527 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.527 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 
09:39:05.528 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.528 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.528 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.528 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.528 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.528 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.528 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.529 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.529 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.529 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.529 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.529 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.529 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.529 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.530 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - 
- - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.530 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.530 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.530 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.530 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.530 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.530 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.531 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] upgrade_levels.baseapi = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.531 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.531 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.531 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.531 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.531 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.531 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.532 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.532 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.532 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.532 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.532 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.532 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.532 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.533 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.api_retry_count = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.533 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.533 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.533 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.533 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.533 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.533 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.534 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 
localhost nova_compute[187174]: 2025-12-06 09:39:05.534 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.534 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.534 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.534 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.534 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.534 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.535 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.535 187178 DEBUG 
oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.535 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.535 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.535 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.535 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.535 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.536 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.536 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.536 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.536 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.536 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.536 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.537 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vnc.novncproxy_base_url = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.537 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.537 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.537 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.537 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.537 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.537 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.538 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.538 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.538 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.538 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.538 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.538 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.538 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.539 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.539 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 
09:39:05.539 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.539 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.539 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.539 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.539 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.540 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.540 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.540 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.540 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.540 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.540 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.540 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.540 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.541 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.541 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.541 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.541 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.541 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.541 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.541 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.542 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.542 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.542 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.542 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.542 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.542 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.543 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.543 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - 
- - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.543 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.543 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.543 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.543 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.543 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.544 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.544 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 
- - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.544 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.544 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.544 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.544 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.544 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.545 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.545 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.545 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.545 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.545 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.545 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.546 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.546 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.546 187178 
DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.546 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.546 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.546 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.546 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.547 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.547 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.549 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.549 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.550 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.550 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.550 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.551 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.551 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - 
- - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.551 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.552 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.552 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.552 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.553 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.553 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.554 187178 DEBUG 
oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.554 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.554 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.555 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.555 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.555 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.556 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.556 187178 
DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.556 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.556 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.557 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.557 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.auth_url = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.557 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.558 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.558 
187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.558 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.559 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.559 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.559 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.559 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.560 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.560 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.560 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.561 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.561 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.561 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.561 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.562 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.562 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] 
oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.562 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.563 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.563 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.563 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.563 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.564 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.564 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.split_loggers = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.564 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.565 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.565 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.565 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.565 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.566 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.566 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.566 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.567 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.567 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.567 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.567 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.568 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.568 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] oslo_reports.log_dir = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.568 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.569 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.569 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.569 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.569 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.570 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.570 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.570 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.571 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.571 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.571 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.571 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.572 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.572 187178 DEBUG 
oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.572 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.573 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.573 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.573 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.574 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.574 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 
2025-12-06 09:39:05.574 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.574 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.575 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.575 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.575 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.576 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.576 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.576 187178 
DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.576 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.577 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.577 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.577 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.578 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.578 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.578 187178 DEBUG 
oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.578 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.579 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.579 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.579 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.580 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.580 187178 DEBUG oslo_service.service [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.580 187178 DEBUG oslo_service.service [None 
req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.582 187178 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.596 187178 INFO nova.virt.node [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Determined node identity db8b39ad-af52-43e3-99e2-f3c431f03241 from /var/lib/nova/compute_id#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.597 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.598 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.598 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.599 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.611 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Registering for lifecycle events 
_get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.619 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.621 187178 INFO nova.virt.libvirt.driver [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Connection event '1' reason 'None'#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.629 187178 INFO nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Libvirt host capabilities [multi-line libvirt host capabilities XML; markup was stripped in this capture, leaving only the text nodes interleaved with syslog prefixes. Recoverable values: host UUID 3134f11d-a070-482e-9899-7eb324eccfc9; arch x86_64; CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp and rdma; memory/page counts 16116612 and 4029153; security models selinux (base labels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (+107:+107); guest os type hvm for 32- and 64-bit via emulator /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (alias pc), pc-q35-rhel9.8.0 (alias q35), and pc-q35-rhel7.6.0 through pc-q35-rhel9.6.0 variants.]#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.634 187178 DEBUG nova.virt.libvirt.volume.mount [None
req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.642 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.650 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: [multi-line libvirt domain capabilities XML; markup was stripped in this capture, leaving only the text nodes interleaved with syslog prefixes. Recoverable values: emulator path /usr/libexec/qemu-kvm; domain type kvm; machine pc-q35-rhel9.8.0; arch i686; firmware /usr/share/OVMF/OVMF_CODE.secboot.fd with loader types rom and pflash, readonly yes/no, secure no; host CPU model EPYC-Rome, vendor AMD; usable CPU models include 486, 486-v1, Broadwell and its -IBRS, -noTSX, -noTSX-IBRS, and -v1 through -v4 variants, Cascadelake-Server and its -noTSX and -v1 through -v5 variants, Conroe, Conroe-v1, Cooperlake; dump continues.] Dec 6 04:39:05 localhost
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cooperlake-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cooperlake-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 
localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Denverton Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Denverton-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Denverton-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Denverton-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dhyana Dec 6 04:39:05 localhost nova_compute[187174]: Dhyana-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dhyana-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Genoa Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Genoa-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-IBPB Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Milan Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Milan-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Milan-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 
Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v4 Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v1 Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v2 Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: GraniteRapids Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 
04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: GraniteRapids-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: GraniteRapids-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Haswell Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-noTSX Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 
localhost nova_compute[187174]: Haswell-noTSX-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 
04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-noTSX Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 
localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 
Dec 6 04:39:05 localhost nova_compute[187174]: [libvirt domainCapabilities XML payload; markup lost in log capture — recoverable values condensed below, field labels per the libvirt domainCapabilities schema]
  CPU models: Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
  Memory backing source types: file, anonymous, memfd
  Disk: device types: disk, cdrom, floppy, lun; buses: fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
  Graphics types: vnc, egl-headless, dbus
  Hostdev: mode: subsystem; startupPolicy: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
  RNG: models: virtio, virtio-transitional, virtio-non-transitional; backend models: random, egd, builtin
  Filesystem driver types: path, handle, virtiofs
  TPM: models: tpm-tis, tpm-crb; backend models: emulator, external; backend version: 2.0
  Redirdev bus: usb; channel types: pty, unix
  Crypto: model: qemu; type: builtin
  Interface backends: default, passt
  Panic models: isa, hyperv
  Console/serial types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
  Hyper-V enlightenments: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset […truncated]
04:39:05 localhost nova_compute[187174]: vendor_id Dec 6 04:39:05 localhost nova_compute[187174]: frequencies Dec 6 04:39:05 localhost nova_compute[187174]: reenlightenment Dec 6 04:39:05 localhost nova_compute[187174]: tlbflush Dec 6 04:39:05 localhost nova_compute[187174]: ipi Dec 6 04:39:05 localhost nova_compute[187174]: avic Dec 6 04:39:05 localhost nova_compute[187174]: emsr_bitmap Dec 6 04:39:05 localhost nova_compute[187174]: xmm_input Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 4095 Dec 6 04:39:05 localhost nova_compute[187174]: on Dec 6 04:39:05 localhost nova_compute[187174]: off Dec 6 04:39:05 localhost nova_compute[187174]: off Dec 6 04:39:05 localhost nova_compute[187174]: Linux KVM Hv Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: tdx Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.661 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: /usr/libexec/qemu-kvm Dec 6 04:39:05 localhost nova_compute[187174]: kvm Dec 6 04:39:05 localhost nova_compute[187174]: pc-i440fx-rhel7.6.0 Dec 6 04:39:05 localhost nova_compute[187174]: i686 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 
6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: /usr/share/OVMF/OVMF_CODE.secboot.fd Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: rom Dec 6 04:39:05 localhost nova_compute[187174]: pflash Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: yes Dec 6 04:39:05 localhost nova_compute[187174]: no Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: no Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: on Dec 6 04:39:05 localhost nova_compute[187174]: off Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: on Dec 6 04:39:05 localhost nova_compute[187174]: off Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome Dec 6 04:39:05 localhost nova_compute[187174]: AMD Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 486 Dec 6 04:39:05 localhost nova_compute[187174]: 486-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-noTSX Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Broadwell-noTSX-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 
04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-noTSX Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 
04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v5 Dec 6 
04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Conroe Dec 6 04:39:05 localhost nova_compute[187174]: Conroe-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Cooperlake Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cooperlake-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cooperlake-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Denverton Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Denverton-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Denverton-v2 Dec 6 04:39:05 localhost nova_compute[187174]: 
Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Denverton-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dhyana Dec 6 04:39:05 localhost nova_compute[187174]: Dhyana-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dhyana-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Genoa Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 
localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Genoa-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-IBPB Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Milan Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 
04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Milan-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Milan-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v4 Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v1 Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v2 Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: GraniteRapids Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 
04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: GraniteRapids-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 
localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: GraniteRapids-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-noTSX
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-noTSX-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v3
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v4
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-noTSX
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v3
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v4
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v5
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v6
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v7
Dec 6 04:39:05 localhost nova_compute[187174]: IvyBridge
Dec 6 04:39:05 localhost nova_compute[187174]: IvyBridge-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: IvyBridge-v1
Dec 6 04:39:05 localhost nova_compute[187174]: IvyBridge-v2
Dec 6 04:39:05 localhost nova_compute[187174]: KnightsMill
Dec 6 04:39:05 localhost nova_compute[187174]: KnightsMill-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Nehalem
Dec 6 04:39:05 localhost nova_compute[187174]: Nehalem-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Nehalem-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Nehalem-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G1-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G2
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G2-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G3
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G3-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G4
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G4-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G5
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G5-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Penryn
Dec 6 04:39:05 localhost nova_compute[187174]: Penryn-v1
Dec 6 04:39:05 localhost nova_compute[187174]: SandyBridge
Dec 6 04:39:05 localhost nova_compute[187174]: SandyBridge-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: SandyBridge-v1
Dec 6 04:39:05 localhost nova_compute[187174]: SandyBridge-v2
Dec 6 04:39:05 localhost nova_compute[187174]: SapphireRapids
Dec 6 04:39:05 localhost nova_compute[187174]: SapphireRapids-v1
Dec 6 04:39:05 localhost nova_compute[187174]: SapphireRapids-v2
Dec 6 04:39:05 localhost nova_compute[187174]: SapphireRapids-v3
Dec 6 04:39:05 localhost nova_compute[187174]: SierraForest
Dec 6 04:39:05 localhost nova_compute[187174]: SierraForest-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-noTSX-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-v3
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-v4
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-noTSX-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v3
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v4
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v5
Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge
Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge-v3
Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge-v4
Dec 6 04:39:05 localhost nova_compute[187174]: Westmere
Dec 6 04:39:05 localhost nova_compute[187174]: Westmere-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Westmere-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Westmere-v2
Dec 6 04:39:05 localhost nova_compute[187174]: athlon
Dec 6 04:39:05 localhost nova_compute[187174]: athlon-v1
Dec 6 04:39:05 localhost nova_compute[187174]: core2duo
Dec 6 04:39:05 localhost nova_compute[187174]: core2duo-v1
Dec 6 04:39:05 localhost nova_compute[187174]: coreduo
Dec 6 04:39:05 localhost nova_compute[187174]: coreduo-v1
Dec 6 04:39:05 localhost nova_compute[187174]: kvm32
Dec 6 04:39:05 localhost nova_compute[187174]: kvm32-v1
Dec 6 04:39:05 localhost nova_compute[187174]: kvm64
Dec 6 04:39:05 localhost nova_compute[187174]: kvm64-v1
Dec 6 04:39:05 localhost nova_compute[187174]: n270
Dec 6 04:39:05 localhost nova_compute[187174]: n270-v1
Dec 6 04:39:05 localhost nova_compute[187174]: pentium
localhost nova_compute[187174]: pentium-v1 Dec 6 04:39:05 localhost nova_compute[187174]: pentium2 Dec 6 04:39:05 localhost nova_compute[187174]: pentium2-v1 Dec 6 04:39:05 localhost nova_compute[187174]: pentium3 Dec 6 04:39:05 localhost nova_compute[187174]: pentium3-v1 Dec 6 04:39:05 localhost nova_compute[187174]: phenom Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: phenom-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: qemu32 Dec 6 04:39:05 localhost nova_compute[187174]: qemu32-v1 Dec 6 04:39:05 localhost nova_compute[187174]: qemu64 Dec 6 04:39:05 localhost nova_compute[187174]: qemu64-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: file Dec 6 04:39:05 localhost nova_compute[187174]: anonymous Dec 6 04:39:05 localhost nova_compute[187174]: memfd Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: disk Dec 6 04:39:05 localhost nova_compute[187174]: cdrom Dec 6 04:39:05 localhost nova_compute[187174]: floppy Dec 6 04:39:05 localhost nova_compute[187174]: lun Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: ide Dec 6 04:39:05 localhost nova_compute[187174]: fdc Dec 6 04:39:05 localhost 
nova_compute[187174]: scsi Dec 6 04:39:05 localhost nova_compute[187174]: virtio Dec 6 04:39:05 localhost nova_compute[187174]: usb Dec 6 04:39:05 localhost nova_compute[187174]: sata Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: virtio Dec 6 04:39:05 localhost nova_compute[187174]: virtio-transitional Dec 6 04:39:05 localhost nova_compute[187174]: virtio-non-transitional Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: vnc Dec 6 04:39:05 localhost nova_compute[187174]: egl-headless Dec 6 04:39:05 localhost nova_compute[187174]: dbus Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: subsystem Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: default Dec 6 04:39:05 localhost nova_compute[187174]: mandatory Dec 6 04:39:05 localhost nova_compute[187174]: requisite Dec 6 04:39:05 localhost nova_compute[187174]: optional Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: usb Dec 6 04:39:05 localhost nova_compute[187174]: pci Dec 6 04:39:05 localhost nova_compute[187174]: scsi Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: virtio Dec 6 04:39:05 localhost nova_compute[187174]: virtio-transitional Dec 6 04:39:05 localhost nova_compute[187174]: virtio-non-transitional Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: random Dec 6 04:39:05 localhost nova_compute[187174]: egd Dec 6 04:39:05 localhost nova_compute[187174]: builtin Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: path Dec 6 04:39:05 localhost nova_compute[187174]: handle Dec 6 04:39:05 localhost nova_compute[187174]: virtiofs Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: tpm-tis Dec 6 04:39:05 localhost nova_compute[187174]: tpm-crb Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: emulator Dec 6 04:39:05 localhost nova_compute[187174]: external Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 2.0 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: usb Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: pty Dec 6 04:39:05 localhost nova_compute[187174]: unix Dec 6 04:39:05 localhost nova_compute[187174]: 
Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: qemu Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: builtin Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: default Dec 6 04:39:05 localhost nova_compute[187174]: passt Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: isa Dec 6 04:39:05 localhost nova_compute[187174]: hyperv Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: null Dec 6 04:39:05 localhost nova_compute[187174]: vc Dec 6 04:39:05 localhost nova_compute[187174]: pty Dec 6 04:39:05 localhost nova_compute[187174]: dev Dec 6 04:39:05 localhost nova_compute[187174]: file Dec 6 04:39:05 localhost nova_compute[187174]: pipe Dec 6 04:39:05 localhost nova_compute[187174]: stdio Dec 6 04:39:05 localhost nova_compute[187174]: udp Dec 6 04:39:05 localhost nova_compute[187174]: tcp Dec 6 04:39:05 localhost nova_compute[187174]: unix Dec 6 04:39:05 localhost nova_compute[187174]: qemu-vdagent Dec 6 04:39:05 localhost nova_compute[187174]: dbus Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 
localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: relaxed Dec 6 04:39:05 localhost nova_compute[187174]: vapic Dec 6 04:39:05 localhost nova_compute[187174]: spinlocks Dec 6 04:39:05 localhost nova_compute[187174]: vpindex Dec 6 04:39:05 localhost nova_compute[187174]: runtime Dec 6 04:39:05 localhost nova_compute[187174]: synic Dec 6 04:39:05 localhost nova_compute[187174]: stimer Dec 6 04:39:05 localhost nova_compute[187174]: reset Dec 6 04:39:05 localhost nova_compute[187174]: vendor_id Dec 6 04:39:05 localhost nova_compute[187174]: frequencies Dec 6 04:39:05 localhost nova_compute[187174]: reenlightenment Dec 6 04:39:05 localhost nova_compute[187174]: tlbflush Dec 6 04:39:05 localhost nova_compute[187174]: ipi Dec 6 04:39:05 localhost nova_compute[187174]: avic Dec 6 04:39:05 localhost nova_compute[187174]: emsr_bitmap Dec 6 04:39:05 localhost nova_compute[187174]: xmm_input Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 4095 Dec 6 04:39:05 localhost nova_compute[187174]: on Dec 6 04:39:05 localhost nova_compute[187174]: off Dec 6 04:39:05 localhost nova_compute[187174]: off Dec 6 04:39:05 localhost nova_compute[187174]: Linux KVM Hv Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: tdx Dec 6 
04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.694 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.742 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: /usr/libexec/qemu-kvm Dec 6 04:39:05 localhost nova_compute[187174]: kvm Dec 6 04:39:05 localhost nova_compute[187174]: pc-q35-rhel9.8.0 Dec 6 04:39:05 localhost nova_compute[187174]: x86_64 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: efi Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd Dec 6 04:39:05 localhost nova_compute[187174]: /usr/share/edk2/ovmf/OVMF_CODE.fd Dec 6 04:39:05 localhost nova_compute[187174]: /usr/share/edk2/ovmf/OVMF.amdsev.fd Dec 6 04:39:05 localhost nova_compute[187174]: /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: rom Dec 6 04:39:05 localhost 
nova_compute[187174]: pflash Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: yes Dec 6 04:39:05 localhost nova_compute[187174]: no Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: yes Dec 6 04:39:05 localhost nova_compute[187174]: no Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: on Dec 6 04:39:05 localhost nova_compute[187174]: off Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: on Dec 6 04:39:05 localhost nova_compute[187174]: off Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome Dec 6 04:39:05 localhost nova_compute[187174]: AMD Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 
Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 486 Dec 6 04:39:05 localhost nova_compute[187174]: 486-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-noTSX Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-noTSX-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 
Broadwell-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 
Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-noTSX Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 
6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v5 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 
04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Conroe Dec 6 04:39:05 localhost nova_compute[187174]: Conroe-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Cooperlake Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cooperlake-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Cooperlake-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Denverton Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Denverton-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Denverton-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Denverton-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dhyana Dec 6 04:39:05 localhost nova_compute[187174]: Dhyana-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dhyana-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Genoa Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Genoa-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-IBPB
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Milan
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Milan-v1
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Milan-v2
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v1
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v2
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v3
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v4
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v1
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v2
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v3
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v4
Dec 6 04:39:05 localhost nova_compute[187174]: GraniteRapids
Dec 6 04:39:05 localhost nova_compute[187174]: GraniteRapids-v1
Dec 6 04:39:05 localhost nova_compute[187174]: GraniteRapids-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-noTSX
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-noTSX-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v3
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v4
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-noTSX
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v3
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v4
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v5
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v6
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v7
Dec 6 04:39:05 localhost nova_compute[187174]: IvyBridge
Dec 6 04:39:05 localhost nova_compute[187174]: IvyBridge-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: IvyBridge-v1
Dec 6 04:39:05 localhost nova_compute[187174]: IvyBridge-v2
Dec 6 04:39:05 localhost nova_compute[187174]: KnightsMill
Dec 6 04:39:05 localhost nova_compute[187174]: KnightsMill-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Nehalem
Dec 6 04:39:05 localhost nova_compute[187174]: Nehalem-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Nehalem-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Nehalem-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G1-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G2
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G2-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G3
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G3-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G4
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G4-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G5
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G5-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Penryn
Dec 6 04:39:05 localhost nova_compute[187174]: Penryn-v1
Dec 6 04:39:05 localhost nova_compute[187174]: SandyBridge
Dec 6 04:39:05 localhost nova_compute[187174]: SandyBridge-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: SandyBridge-v1
Dec 6 04:39:05 localhost nova_compute[187174]: SandyBridge-v2
Dec 6 04:39:05 localhost nova_compute[187174]: SapphireRapids
Dec 6 04:39:05 localhost nova_compute[187174]: SapphireRapids-v1
Dec 6 04:39:05 localhost nova_compute[187174]: SapphireRapids-v2
Dec 6 04:39:05 localhost nova_compute[187174]: SapphireRapids-v3
Dec 6 04:39:05 localhost nova_compute[187174]: SierraForest
Dec 6 04:39:05 localhost nova_compute[187174]: SierraForest-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client
Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-IBRS
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-noTSX-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-noTSX-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v5 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 
04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Westmere Dec 6 04:39:05 localhost nova_compute[187174]: Westmere-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Westmere-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Westmere-v2 Dec 6 04:39:05 localhost nova_compute[187174]: athlon Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: athlon-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: core2duo Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: core2duo-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: coreduo Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: coreduo-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: kvm32 Dec 6 04:39:05 localhost nova_compute[187174]: kvm32-v1 Dec 6 04:39:05 localhost nova_compute[187174]: kvm64 Dec 6 04:39:05 localhost nova_compute[187174]: kvm64-v1 Dec 6 04:39:05 localhost nova_compute[187174]: n270 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: n270-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: pentium Dec 6 04:39:05 localhost nova_compute[187174]: pentium-v1 Dec 6 04:39:05 localhost nova_compute[187174]: pentium2 Dec 6 04:39:05 localhost nova_compute[187174]: pentium2-v1 Dec 6 04:39:05 localhost nova_compute[187174]: pentium3 Dec 6 04:39:05 localhost nova_compute[187174]: pentium3-v1 Dec 6 04:39:05 localhost 
nova_compute[187174]: phenom Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: phenom-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: qemu32 Dec 6 04:39:05 localhost nova_compute[187174]: qemu32-v1 Dec 6 04:39:05 localhost nova_compute[187174]: qemu64 Dec 6 04:39:05 localhost nova_compute[187174]: qemu64-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: file Dec 6 04:39:05 localhost nova_compute[187174]: anonymous Dec 6 04:39:05 localhost nova_compute[187174]: memfd Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: disk Dec 6 04:39:05 localhost nova_compute[187174]: cdrom Dec 6 04:39:05 localhost nova_compute[187174]: floppy Dec 6 04:39:05 localhost nova_compute[187174]: lun Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: fdc Dec 6 04:39:05 localhost nova_compute[187174]: scsi Dec 6 04:39:05 localhost nova_compute[187174]: virtio Dec 6 04:39:05 localhost nova_compute[187174]: usb Dec 6 04:39:05 localhost nova_compute[187174]: sata Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: virtio Dec 6 04:39:05 
localhost nova_compute[187174]: virtio-transitional Dec 6 04:39:05 localhost nova_compute[187174]: virtio-non-transitional Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: vnc Dec 6 04:39:05 localhost nova_compute[187174]: egl-headless Dec 6 04:39:05 localhost nova_compute[187174]: dbus Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: subsystem Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: default Dec 6 04:39:05 localhost nova_compute[187174]: mandatory Dec 6 04:39:05 localhost nova_compute[187174]: requisite Dec 6 04:39:05 localhost nova_compute[187174]: optional Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: usb Dec 6 04:39:05 localhost nova_compute[187174]: pci Dec 6 04:39:05 localhost nova_compute[187174]: scsi Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: virtio Dec 6 04:39:05 localhost nova_compute[187174]: virtio-transitional Dec 6 04:39:05 localhost nova_compute[187174]: virtio-non-transitional Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: random Dec 6 04:39:05 localhost 
nova_compute[187174]: egd Dec 6 04:39:05 localhost nova_compute[187174]: builtin Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: path Dec 6 04:39:05 localhost nova_compute[187174]: handle Dec 6 04:39:05 localhost nova_compute[187174]: virtiofs Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: tpm-tis Dec 6 04:39:05 localhost nova_compute[187174]: tpm-crb Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: emulator Dec 6 04:39:05 localhost nova_compute[187174]: external Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 2.0 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: usb Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: pty Dec 6 04:39:05 localhost nova_compute[187174]: unix Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: qemu Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 
04:39:05 localhost nova_compute[187174]: builtin Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: default Dec 6 04:39:05 localhost nova_compute[187174]: passt Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: isa Dec 6 04:39:05 localhost nova_compute[187174]: hyperv Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: null Dec 6 04:39:05 localhost nova_compute[187174]: vc Dec 6 04:39:05 localhost nova_compute[187174]: pty Dec 6 04:39:05 localhost nova_compute[187174]: dev Dec 6 04:39:05 localhost nova_compute[187174]: file Dec 6 04:39:05 localhost nova_compute[187174]: pipe Dec 6 04:39:05 localhost nova_compute[187174]: stdio Dec 6 04:39:05 localhost nova_compute[187174]: udp Dec 6 04:39:05 localhost nova_compute[187174]: tcp Dec 6 04:39:05 localhost nova_compute[187174]: unix Dec 6 04:39:05 localhost nova_compute[187174]: qemu-vdagent Dec 6 04:39:05 localhost nova_compute[187174]: dbus Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: relaxed Dec 6 04:39:05 localhost nova_compute[187174]: vapic Dec 6 04:39:05 localhost nova_compute[187174]: spinlocks Dec 6 04:39:05 localhost nova_compute[187174]: vpindex Dec 6 04:39:05 localhost nova_compute[187174]: runtime Dec 6 04:39:05 localhost nova_compute[187174]: synic Dec 6 04:39:05 localhost nova_compute[187174]: stimer Dec 6 04:39:05 localhost nova_compute[187174]: reset Dec 6 04:39:05 localhost nova_compute[187174]: vendor_id Dec 6 04:39:05 localhost nova_compute[187174]: frequencies Dec 6 04:39:05 localhost nova_compute[187174]: reenlightenment Dec 6 04:39:05 localhost nova_compute[187174]: tlbflush Dec 6 04:39:05 localhost nova_compute[187174]: ipi Dec 6 04:39:05 localhost nova_compute[187174]: avic Dec 6 04:39:05 localhost nova_compute[187174]: emsr_bitmap Dec 6 04:39:05 localhost nova_compute[187174]: xmm_input Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 4095 Dec 6 04:39:05 localhost nova_compute[187174]: on Dec 6 04:39:05 localhost nova_compute[187174]: off Dec 6 04:39:05 localhost nova_compute[187174]: off Dec 6 04:39:05 localhost nova_compute[187174]: Linux KVM Hv Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: tdx Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Dec 6 04:39:05 
localhost nova_compute[187174]: 2025-12-06 09:39:05.805 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: /usr/libexec/qemu-kvm Dec 6 04:39:05 localhost nova_compute[187174]: kvm Dec 6 04:39:05 localhost nova_compute[187174]: pc-i440fx-rhel7.6.0 Dec 6 04:39:05 localhost nova_compute[187174]: x86_64 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: /usr/share/OVMF/OVMF_CODE.secboot.fd Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: rom Dec 6 04:39:05 localhost nova_compute[187174]: pflash Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: yes Dec 6 04:39:05 localhost nova_compute[187174]: no Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: no Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: on Dec 6 04:39:05 localhost nova_compute[187174]: off Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: on Dec 6 04:39:05 localhost nova_compute[187174]: off Dec 
6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome Dec 6 04:39:05 localhost nova_compute[187174]: AMD Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 486 Dec 6 04:39:05 localhost nova_compute[187174]: 486-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-noTSX
Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-noTSX-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-v3
Dec 6 04:39:05 localhost nova_compute[187174]: Broadwell-v4
Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server
Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-noTSX
Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v3
Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v4
Dec 6 04:39:05 localhost nova_compute[187174]: Cascadelake-Server-v5
Dec 6 04:39:05 localhost nova_compute[187174]: Conroe
Dec 6 04:39:05 localhost nova_compute[187174]: Conroe-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Cooperlake
Dec 6 04:39:05 localhost nova_compute[187174]: Cooperlake-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Cooperlake-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Denverton
Dec 6 04:39:05 localhost nova_compute[187174]: Denverton-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Denverton-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Denverton-v3
Dec 6 04:39:05 localhost nova_compute[187174]: Dhyana
Dec 6 04:39:05 localhost nova_compute[187174]: Dhyana-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Dhyana-v2
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Genoa
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Genoa-v1
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-IBPB
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Milan
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Milan-v1
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Milan-v2
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v1
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v2
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v3
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-Rome-v4
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v1
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v2
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v3
Dec 6 04:39:05 localhost nova_compute[187174]: EPYC-v4
Dec 6 04:39:05 localhost nova_compute[187174]: GraniteRapids
Dec 6 04:39:05 localhost nova_compute[187174]: GraniteRapids-v1
Dec 6 04:39:05 localhost nova_compute[187174]: GraniteRapids-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-noTSX
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-noTSX-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v3
Dec 6 04:39:05 localhost nova_compute[187174]: Haswell-v4
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-noTSX
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v3
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v4
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v5
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v6
Dec 6 04:39:05 localhost nova_compute[187174]: Icelake-Server-v7
Dec 6 04:39:05 localhost nova_compute[187174]: IvyBridge
Dec 6 04:39:05 localhost nova_compute[187174]: IvyBridge-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: IvyBridge-v1
Dec 6 04:39:05 localhost nova_compute[187174]: IvyBridge-v2
Dec 6 04:39:05 localhost nova_compute[187174]: KnightsMill
Dec 6 04:39:05 localhost nova_compute[187174]: KnightsMill-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Nehalem
Dec 6 04:39:05 localhost nova_compute[187174]: Nehalem-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: Nehalem-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Nehalem-v2
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G1-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G2
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G2-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G3
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G3-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G4
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G4-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G5
Dec 6 04:39:05 localhost nova_compute[187174]: Opteron_G5-v1
Dec 6 04:39:05 localhost nova_compute[187174]: Penryn
Dec 6 04:39:05 localhost nova_compute[187174]: Penryn-v1
Dec 6 04:39:05 localhost nova_compute[187174]: SandyBridge
Dec 6 04:39:05 localhost nova_compute[187174]: SandyBridge-IBRS
Dec 6 04:39:05 localhost nova_compute[187174]: SandyBridge-v1
Dec 6 04:39:05 localhost nova_compute[187174]: SandyBridge-v2
Dec 6 04:39:05 localhost nova_compute[187174]: SapphireRapids
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: SapphireRapids-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: SapphireRapids-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: SapphireRapids-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: SierraForest Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: SierraForest-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost 
nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-noTSX-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 
Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Client-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 
Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-noTSX-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 
04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Skylake-Server-v5 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 
localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge-v2 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Snowridge-v3 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: 
Snowridge-v4 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Westmere Dec 6 04:39:05 localhost nova_compute[187174]: Westmere-IBRS Dec 6 04:39:05 localhost nova_compute[187174]: Westmere-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Westmere-v2 Dec 6 04:39:05 localhost nova_compute[187174]: athlon Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: athlon-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: core2duo Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: core2duo-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: coreduo Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: coreduo-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: kvm32 Dec 6 04:39:05 localhost 
nova_compute[187174]: kvm32-v1 Dec 6 04:39:05 localhost nova_compute[187174]: kvm64 Dec 6 04:39:05 localhost nova_compute[187174]: kvm64-v1 Dec 6 04:39:05 localhost nova_compute[187174]: n270 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: n270-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: pentium Dec 6 04:39:05 localhost nova_compute[187174]: pentium-v1 Dec 6 04:39:05 localhost nova_compute[187174]: pentium2 Dec 6 04:39:05 localhost nova_compute[187174]: pentium2-v1 Dec 6 04:39:05 localhost nova_compute[187174]: pentium3 Dec 6 04:39:05 localhost nova_compute[187174]: pentium3-v1 Dec 6 04:39:05 localhost nova_compute[187174]: phenom Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: phenom-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: qemu32 Dec 6 04:39:05 localhost nova_compute[187174]: qemu32-v1 Dec 6 04:39:05 localhost nova_compute[187174]: qemu64 Dec 6 04:39:05 localhost nova_compute[187174]: qemu64-v1 Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: file Dec 6 04:39:05 localhost nova_compute[187174]: anonymous Dec 6 04:39:05 localhost nova_compute[187174]: memfd Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 
localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: disk Dec 6 04:39:05 localhost nova_compute[187174]: cdrom Dec 6 04:39:05 localhost nova_compute[187174]: floppy Dec 6 04:39:05 localhost nova_compute[187174]: lun Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: ide Dec 6 04:39:05 localhost nova_compute[187174]: fdc Dec 6 04:39:05 localhost nova_compute[187174]: scsi Dec 6 04:39:05 localhost nova_compute[187174]: virtio Dec 6 04:39:05 localhost nova_compute[187174]: usb Dec 6 04:39:05 localhost nova_compute[187174]: sata Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: virtio Dec 6 04:39:05 localhost nova_compute[187174]: virtio-transitional Dec 6 04:39:05 localhost nova_compute[187174]: virtio-non-transitional Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: vnc Dec 6 04:39:05 localhost nova_compute[187174]: egl-headless Dec 6 04:39:05 localhost nova_compute[187174]: dbus Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: subsystem Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: Dec 6 04:39:05 localhost nova_compute[187174]: default Dec 6 04:39:05 localhost nova_compute[187174]: mandatory Dec 6 04:39:05 localhost nova_compute[187174]: requisite Dec 6 04:39:05 
localhost nova_compute[187174]: [libvirt domain capabilities reply; XML markup lost in log capture, leaving only text values. In order, the surviving values are: optional; usb; pci; scsi; virtio; virtio-transitional; virtio-non-transitional; random; egd; builtin; path; handle; virtiofs; tpm-tis; tpm-crb; emulator; external; 2.0; usb; pty; unix; qemu; builtin; default; passt; isa; hyperv; null; vc; pty; dev; file; pipe; stdio; udp; tcp; unix; qemu-vdagent; dbus; relaxed; vapic; spinlocks; vpindex; runtime; synic; stimer; reset; vendor_id; frequencies; reenlightenment; tlbflush; ipi; avic; emsr_bitmap; xmm_input; 4095; on; off; off; Linux KVM Hv; tdx] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.868 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.869 187178 INFO nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Secure Boot support detected#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.872 187178 INFO nova.virt.libvirt.driver [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.885 187178 DEBUG nova.virt.libvirt.driver [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.908 187178 INFO nova.virt.node [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Determined node identity db8b39ad-af52-43e3-99e2-f3c431f03241 from /var/lib/nova/compute_id#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.925 187178 DEBUG nova.compute.manager [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Verified node db8b39ad-af52-43e3-99e2-f3c431f03241 matches my host np0005548798.ooo.test _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.953 187178 DEBUG nova.compute.manager [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.958 187178 DEBUG nova.virt.libvirt.vif [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T08:38:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005548798.ooo.test',hostname='test',id=2,image_ref='c6562616-bf77-48e6-bb05-431e64af083a',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-06T08:38:42Z,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005548798.ooo.test',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='47835b89168945138751a4b216280589',ramdisk_id='',reservation_id='r-h8mij0z5',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-12-06T08:38:43Z,user_data=None,user_id='5220ceda9e4145d395f52fc9fd0365c0',uuid=a5070ada-6b60-4992-a1bf-9e83aaccac93,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": 
"47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.958 187178 DEBUG nova.network.os_vif_util [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Converting VIF {"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.959 187178 DEBUG nova.network.os_vif_util [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] 
Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.959 187178 DEBUG os_vif [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.993 187178 DEBUG ovsdbapp.backend.ovs_idl [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.993 187178 DEBUG ovsdbapp.backend.ovs_idl [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.994 187178 DEBUG ovsdbapp.backend.ovs_idl [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.994 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [None 
req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.995 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.995 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.995 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.996 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:05 localhost nova_compute[187174]: 2025-12-06 09:39:05.999 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.016 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.016 187178 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.017 187178 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.018 187178 INFO oslo.privsep.daemon [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmppo8ndpt8/privsep.sock']#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.653 187178 INFO oslo.privsep.daemon [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.540 187233 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.545 187233 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.549 187233 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.549 187233 INFO oslo.privsep.daemon [-] privsep daemon running as pid 187233#033[00m Dec 6 04:39:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:39:06.658 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:39:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:39:06.659 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:39:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:39:06.660 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.887 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.985 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.985 187178 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap227fe5b2-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.986 187178 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap227fe5b2-a5, col_values=(('external_ids', {'iface-id': '227fe5b2-a5a7-4043-b641-32b6e7c7a7c1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:02:64', 'vm-uuid': 'a5070ada-6b60-4992-a1bf-9e83aaccac93'}),)) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.987 187178 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.987 187178 INFO os_vif [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5')#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.988 187178 DEBUG nova.compute.manager [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.992 187178 DEBUG nova.compute.manager [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Current state is 1, state in DB is 1. 
_init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Dec 6 04:39:06 localhost nova_compute[187174]: 2025-12-06 09:39:06.992 187178 INFO nova.compute.manager [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.087 187178 DEBUG oslo_concurrency.lockutils [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.087 187178 DEBUG oslo_concurrency.lockutils [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.088 187178 DEBUG oslo_concurrency.lockutils [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.088 187178 DEBUG nova.compute.resource_tracker [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.149 187178 DEBUG oslo_concurrency.processutils [None 
req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.207 187178 DEBUG oslo_concurrency.processutils [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.208 187178 DEBUG oslo_concurrency.processutils [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:39:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21985 DF PROTO=TCP SPT=42970 DPT=9100 SEQ=1064973763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB5C3870000000001030307) Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.284 187178 DEBUG oslo_concurrency.processutils [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.285 187178 DEBUG oslo_concurrency.processutils [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.354 187178 DEBUG oslo_concurrency.processutils [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.355 187178 DEBUG oslo_concurrency.processutils [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.406 187178 DEBUG oslo_concurrency.processutils [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.629 187178 WARNING nova.virt.libvirt.driver [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.631 187178 DEBUG nova.compute.resource_tracker [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=13322MB free_disk=387.46550369262695GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.631 187178 DEBUG oslo_concurrency.lockutils [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.632 187178 DEBUG oslo_concurrency.lockutils [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.768 187178 DEBUG nova.compute.resource_tracker [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.769 187178 DEBUG nova.compute.resource_tracker [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.769 187178 DEBUG nova.compute.resource_tracker [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.841 187178 DEBUG nova.scheduler.client.report [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Refreshing inventories for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.869 187178 DEBUG nova.scheduler.client.report [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Updating ProviderTree inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 6 04:39:07 
localhost nova_compute[187174]: 2025-12-06 09:39:07.870 187178 DEBUG nova.compute.provider_tree [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.889 187178 DEBUG nova.scheduler.client.report [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Refreshing aggregate associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.910 187178 DEBUG nova.scheduler.client.report [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Refreshing trait associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, traits: 
COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.945 187178 DEBUG nova.virt.libvirt.host [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Dec 6 04:39:07 localhost nova_compute[187174]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.946 187178 INFO nova.virt.libvirt.host [None 
req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] kernel doesn't support AMD SEV#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.947 187178 DEBUG nova.compute.provider_tree [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.948 187178 DEBUG nova.virt.libvirt.driver [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.967 187178 DEBUG nova.scheduler.client.report [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.989 187178 DEBUG nova.compute.resource_tracker [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.990 187178 DEBUG oslo_concurrency.lockutils [None 
req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:39:07 localhost nova_compute[187174]: 2025-12-06 09:39:07.990 187178 DEBUG nova.service [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Dec 6 04:39:08 localhost nova_compute[187174]: 2025-12-06 09:39:08.036 187178 DEBUG nova.service [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Dec 6 04:39:08 localhost nova_compute[187174]: 2025-12-06 09:39:08.036 187178 DEBUG nova.servicegroup.drivers.db [None req-7c8b93bb-8b65-478d-bef7-9afcba2838e0 - - - - - -] DB_Driver: join new ServiceGroup member np0005548798.ooo.test to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Dec 6 04:39:09 localhost python3.9[187342]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None 
device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Dec 6 04:39:10 localhost systemd[1]: Started 
libpod-conmon-9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00.scope. Dec 6 04:39:10 localhost systemd[1]: Started libcrun container. Dec 6 04:39:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/921081be0d3584208f178cb345f2615cfcd1617609a73e0727b158d9f013eee4/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Dec 6 04:39:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/921081be0d3584208f178cb345f2615cfcd1617609a73e0727b158d9f013eee4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 6 04:39:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/921081be0d3584208f178cb345f2615cfcd1617609a73e0727b158d9f013eee4/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Dec 6 04:39:10 localhost podman[187367]: 2025-12-06 09:39:10.0704297 +0000 UTC m=+0.135622688 container init 9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init) Dec 6 04:39:10 localhost podman[187367]: 2025-12-06 09:39:10.083313062 +0000 UTC m=+0.148506110 container start 9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 04:39:10 localhost python3.9[187342]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Applying nova statedir ownership Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Dec 6 04:39:10 localhost nova_compute_init[187388]: 
INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/ Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93 already 42436:42436 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93 to system_u:object_r:container_file_t:s0 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.info Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/console.log Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/_base/3e070c3db7ba7309de3805d58aaf4369c4bd45c2 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-3e070c3db7ba7309de3805d58aaf4369c4bd45c2 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Setting selinux context of 
/var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/20273498b7380904530133bcb3f720bd45f4f00b810dc4597d81d23acd8f9673 Dec 6 04:39:10 localhost nova_compute_init[187388]: INFO:nova_statedir:Nova statedir ownership complete Dec 6 04:39:10 localhost systemd[1]: libpod-9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00.scope: Deactivated successfully. 
Dec 6 04:39:10 localhost podman[187389]: 2025-12-06 09:39:10.162549048 +0000 UTC m=+0.061157962 container died 9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 04:39:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34479 DF PROTO=TCP SPT=34788 DPT=9100 SEQ=121480631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB5CF080000000001030307) Dec 6 04:39:10 localhost podman[187400]: 2025-12-06 09:39:10.249908556 +0000 UTC m=+0.083283322 container cleanup 9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, managed_by=edpm_ansible, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:39:10 localhost systemd[1]: libpod-conmon-9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00.scope: Deactivated successfully. Dec 6 04:39:10 localhost systemd[1]: tmp-crun.eUmqAG.mount: Deactivated successfully. Dec 6 04:39:10 localhost systemd[1]: var-lib-containers-storage-overlay-921081be0d3584208f178cb345f2615cfcd1617609a73e0727b158d9f013eee4-merged.mount: Deactivated successfully. Dec 6 04:39:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00-userdata-shm.mount: Deactivated successfully. Dec 6 04:39:11 localhost nova_compute[187174]: 2025-12-06 09:39:10.999 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:11 localhost systemd-logind[760]: Session 39 logged out. Waiting for processes to exit. 
Dec 6 04:39:11 localhost systemd[1]: session-39.scope: Deactivated successfully. Dec 6 04:39:11 localhost systemd[1]: session-39.scope: Consumed 2min 13.483s CPU time. Dec 6 04:39:11 localhost systemd-logind[760]: Removed session 39. Dec 6 04:39:11 localhost nova_compute[187174]: 2025-12-06 09:39:11.937 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:39:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:39:13 localhost podman[187444]: 2025-12-06 09:39:13.548581072 +0000 UTC m=+0.085272685 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:39:13 localhost systemd[1]: tmp-crun.mAffak.mount: Deactivated successfully. Dec 6 04:39:13 localhost podman[187444]: 2025-12-06 09:39:13.641309548 +0000 UTC m=+0.178001111 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd) Dec 6 04:39:13 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:39:13 localhost podman[187445]: 2025-12-06 09:39:13.641808505 +0000 UTC m=+0.171767738 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 6 04:39:13 localhost podman[187445]: 2025-12-06 09:39:13.726259383 +0000 UTC m=+0.256218646 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:39:13 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:39:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44605 DF PROTO=TCP SPT=53438 DPT=9101 SEQ=2163712551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB5E3870000000001030307) Dec 6 04:39:16 localhost nova_compute[187174]: 2025-12-06 09:39:16.002 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25908 DF PROTO=TCP SPT=58314 DPT=9105 SEQ=481859652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB5E6650000000001030307) Dec 6 04:39:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 04:39:16 localhost podman[187487]: 2025-12-06 09:39:16.551977903 +0000 UTC m=+0.077288394 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 6 04:39:16 localhost podman[187487]: 2025-12-06 09:39:16.591171368 +0000 UTC m=+0.116481849 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:39:16 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:39:16 localhost nova_compute[187174]: 2025-12-06 09:39:16.969 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:18 localhost sshd[187503]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:39:19 localhost systemd-logind[760]: New session 41 of user zuul. Dec 6 04:39:19 localhost systemd[1]: Started Session 41 of User zuul. 
Dec 6 04:39:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25910 DF PROTO=TCP SPT=58314 DPT=9105 SEQ=481859652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB5F2870000000001030307) Dec 6 04:39:20 localhost python3.9[187614]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 6 04:39:21 localhost nova_compute[187174]: 2025-12-06 09:39:21.004 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:21 localhost python3.9[187728]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:39:21 localhost systemd[1]: Reloading. Dec 6 04:39:21 localhost systemd-sysv-generator[187760]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:39:21 localhost systemd-rc-local-generator[187755]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 04:39:21 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:21 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:21 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:21 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:39:21 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:21 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:21 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:21 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:22 localhost nova_compute[187174]: 2025-12-06 09:39:22.008 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:22 localhost python3.9[187873]: ansible-ansible.builtin.service_facts Invoked Dec 6 04:39:22 localhost network[187890]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 6 04:39:22 localhost network[187891]: 'network-scripts' will be removed from distribution in near future. Dec 6 04:39:22 localhost network[187892]: It is advised to switch to 'NetworkManager' instead for network management. 
Dec 6 04:39:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39896 DF PROTO=TCP SPT=60040 DPT=9882 SEQ=3068690939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB6022F0000000001030307) Dec 6 04:39:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:39:25 localhost nova_compute[187174]: 2025-12-06 09:39:25.039 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:39:25 localhost nova_compute[187174]: 2025-12-06 09:39:25.074 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Triggering sync for uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Dec 6 04:39:25 localhost nova_compute[187174]: 2025-12-06 09:39:25.075 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:39:25 localhost nova_compute[187174]: 2025-12-06 09:39:25.075 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 
6 04:39:25 localhost nova_compute[187174]: 2025-12-06 09:39:25.075 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:39:25 localhost nova_compute[187174]: 2025-12-06 09:39:25.179 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:39:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12273 DF PROTO=TCP SPT=48368 DPT=9102 SEQ=337986673 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB60B870000000001030307) Dec 6 04:39:26 localhost nova_compute[187174]: 2025-12-06 09:39:26.007 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:27 localhost nova_compute[187174]: 2025-12-06 09:39:27.063 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:27 localhost python3.9[188128]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:39:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7292 DF PROTO=TCP SPT=52832 
DPT=9102 SEQ=1463454176 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB615880000000001030307) Dec 6 04:39:28 localhost python3.9[188239]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:28 localhost systemd-journald[38691]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 76.6 (255 of 333 items), suggesting rotation. Dec 6 04:39:28 localhost systemd-journald[38691]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 6 04:39:28 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 04:39:28 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 04:39:29 localhost python3.9[188350]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:30 localhost python3.9[188460]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:39:31 localhost nova_compute[187174]: 2025-12-06 09:39:31.009 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:31 localhost python3.9[188570]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Dec 6 04:39:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53682 DF PROTO=TCP SPT=51244 DPT=9101 SEQ=2553058549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB621470000000001030307) Dec 6 04:39:32 localhost nova_compute[187174]: 2025-12-06 
09:39:32.113 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:32 localhost python3.9[188680]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:39:32 localhost systemd[1]: Reloading. Dec 6 04:39:32 localhost systemd-rc-local-generator[188701]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:39:32 localhost systemd-sysv-generator[188706]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:39:32 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:32 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:32 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:32 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:39:32 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:32 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:32 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:32 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:39:33 localhost python3.9[188825]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:39:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55718 DF PROTO=TCP SPT=37250 DPT=9101 SEQ=608649544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB62D870000000001030307) Dec 6 04:39:34 localhost python3.9[188936]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:39:35 localhost python3.9[189044]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:39:36 localhost 
nova_compute[187174]: 2025-12-06 09:39:36.011 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:36 localhost python3.9[189154]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:36 localhost python3.9[189240]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013975.8153586-359-237242677665189/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=46d559ffa6cfc47f81073d8aa5d41ff97b90af2f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:39:37 localhost nova_compute[187174]: 2025-12-06 09:39:37.150 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53684 DF PROTO=TCP SPT=51244 DPT=9101 SEQ=2553058549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB639070000000001030307) Dec 6 04:39:37 localhost python3.9[189350]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None Dec 6 04:39:38 localhost python3.9[189460]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None Dec 6 
04:39:39 localhost python3.9[189571]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Dec 6 04:39:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18061 DF PROTO=TCP SPT=40084 DPT=9100 SEQ=2492014903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB644470000000001030307) Dec 6 04:39:41 localhost nova_compute[187174]: 2025-12-06 09:39:41.015 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:41 localhost python3.9[189687]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548798.ooo.test update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Dec 6 04:39:42 localhost nova_compute[187174]: 2025-12-06 09:39:42.197 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:43 localhost python3.9[189803]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False 
checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:43 localhost python3.9[189889]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765013982.2298281-563-73312739040381/.source.conf _original_basename=ceilometer.conf follow=False checksum=241428cffd9832ac5e5e746a2bee59180f855823 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:44 localhost python3.9[189997]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:39:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:39:44 localhost podman[189998]: 2025-12-06 09:39:44.565831147 +0000 UTC m=+0.096436573 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 04:39:44 localhost podman[189998]: 2025-12-06 09:39:44.580511366 +0000 UTC m=+0.111116812 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=multipathd) Dec 6 04:39:44 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:39:44 localhost podman[190000]: 2025-12-06 09:39:44.666324316 +0000 UTC m=+0.196993064 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:39:44 localhost podman[190000]: 2025-12-06 09:39:44.754308025 +0000 UTC m=+0.284976773 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS) Dec 6 04:39:44 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:39:44 localhost python3.9[190126]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765013984.0211046-563-270186575156019/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:45 localhost python3.9[190234]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53685 DF PROTO=TCP SPT=51244 DPT=9101 SEQ=2553058549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB659870000000001030307) Dec 6 04:39:45 localhost python3.9[190320]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765013985.05884-563-151905494271553/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:46 localhost nova_compute[187174]: 2025-12-06 09:39:46.018 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:46 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60591 DF PROTO=TCP SPT=57258 DPT=9105 SEQ=3571286738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB65B960000000001030307) Dec 6 04:39:46 localhost python3.9[190428]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:39:47 localhost python3.9[190538]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:39:47 localhost nova_compute[187174]: 2025-12-06 09:39:47.240 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 04:39:47 localhost podman[190597]: 2025-12-06 09:39:47.548005816 +0000 UTC m=+0.081416894 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent) Dec 6 04:39:47 localhost podman[190597]: 2025-12-06 09:39:47.583191255 +0000 UTC m=+0.116602363 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:39:47 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:39:47 localhost python3.9[190664]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:48 localhost python3.9[190750]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013987.3636062-740-25939814279538/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:48 localhost python3.9[190858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:49 localhost python3.9[190913]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf 
_original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60593 DF PROTO=TCP SPT=57258 DPT=9105 SEQ=3571286738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB667880000000001030307) Dec 6 04:39:49 localhost python3.9[191021]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:50 localhost python3.9[191107]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013989.3672328-740-137410179095109/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:50 localhost python3.9[191215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:51 localhost nova_compute[187174]: 2025-12-06 09:39:51.020 
187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:51 localhost python3.9[191301]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013990.4644742-740-262785378984550/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:52 localhost nova_compute[187174]: 2025-12-06 09:39:52.242 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:52 localhost python3.9[191409]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:52 localhost python3.9[191495]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013992.0792289-740-182830888204477/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60594 DF 
PROTO=TCP SPT=57258 DPT=9105 SEQ=3571286738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB677470000000001030307) Dec 6 04:39:53 localhost python3.9[191603]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:54 localhost python3.9[191689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013993.0876465-740-227718284069355/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=3820eb6e48c35431ebf53228213a5d51b7591223 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:55 localhost python3.9[191797]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:55 localhost python3.9[191883]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013994.69184-740-22068671656888/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 
LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63803 DF PROTO=TCP SPT=48090 DPT=9102 SEQ=3664988454 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB681880000000001030307) Dec 6 04:39:56 localhost nova_compute[187174]: 2025-12-06 09:39:56.064 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:56 localhost python3.9[191991]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:56 localhost python3.9[192077]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013995.688477-740-183768085685939/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=33df3bf08923ad9105770f5abb51d4cde791931a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:57 localhost python3.9[192185]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:57 localhost nova_compute[187174]: 2025-12-06 09:39:57.277 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:39:57 localhost python3.9[192271]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 
src=/home/zuul/.ansible/tmp/ansible-tmp-1765013996.6799498-740-132334887483786/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:58 localhost python3.9[192379]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:58 localhost python3.9[192465]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013997.7607436-740-19678047324673/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=8bed8129af2c9145e8d37569bb493c0de1895d6f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:39:59 localhost python3.9[192573]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:39:59 localhost python3.9[192659]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013998.847266-740-46139731734969/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:40:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26952 DF PROTO=TCP SPT=40440 DPT=9101 SEQ=96384228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB692670000000001030307) Dec 6 04:40:00 localhost python3.9[192767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:40:00 localhost python3.9[192822]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:40:01 localhost nova_compute[187174]: 2025-12-06 09:40:01.099 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:40:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26953 DF PROTO=TCP SPT=40440 DPT=9101 SEQ=96384228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB696870000000001030307) Dec 6 04:40:02 localhost nova_compute[187174]: 2025-12-06 09:40:02.315 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:40:02 localhost python3.9[192930]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:40:02 localhost python3.9[192985]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:40:04 localhost python3.9[193093]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:40:04 localhost python3.9[193148]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:40:04 localhost nova_compute[187174]: 2025-12-06 09:40:04.912 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task 
ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:40:04 localhost nova_compute[187174]: 2025-12-06 09:40:04.913 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:40:04 localhost nova_compute[187174]: 2025-12-06 09:40:04.913 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:40:04 localhost nova_compute[187174]: 2025-12-06 09:40:04.913 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:40:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18063 DF PROTO=TCP SPT=40084 DPT=9100 SEQ=2492014903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB6A5870000000001030307) Dec 6 04:40:06 localhost python3.9[193258]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:40:06 localhost nova_compute[187174]: 2025-12-06 09:40:06.154 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:40:06 localhost nova_compute[187174]: 2025-12-06 09:40:06.517 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:40:06 localhost nova_compute[187174]: 2025-12-06 09:40:06.517 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:40:06 localhost nova_compute[187174]: 2025-12-06 09:40:06.518 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:40:06 localhost nova_compute[187174]: 2025-12-06 09:40:06.518 187178 DEBUG nova.objects.instance [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:40:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:40:06.659 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:40:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:40:06.660 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s 
inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:40:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:40:06.661 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:40:06 localhost python3.9[193368]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:40:07 localhost nova_compute[187174]: 2025-12-06 09:40:07.352 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:40:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26955 DF PROTO=TCP SPT=40440 DPT=9101 SEQ=96384228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB6AE470000000001030307) Dec 6 04:40:07 localhost python3.9[193478]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:40:08 localhost python3.9[193588]: ansible-ansible.builtin.systemd_service 
Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:40:08 localhost systemd[1]: Reloading. Dec 6 04:40:08 localhost systemd-rc-local-generator[193612]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:40:08 localhost systemd-sysv-generator[193615]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:40:08 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:08 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:08 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:08 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:40:08 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:08 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:08 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:08 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:08 localhost systemd[1]: Listening on Podman API Socket. 
Dec 6 04:40:09 localhost python3.9[193738]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.837 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.858 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.858 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.859 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.860 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.860 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.861 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.861 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.862 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.862 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.863 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.879 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.880 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.880 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.881 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:40:09 localhost nova_compute[187174]: 2025-12-06 09:40:09.940 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:40:09 localhost python3.9[193826]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014008.9894469-1406-252042144244458/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.025 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" 
returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.026 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.084 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.085 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.164 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.078s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.165 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.228 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:40:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9621 DF PROTO=TCP SPT=50440 DPT=9100 SEQ=2491269397 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB6B9870000000001030307) Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.435 187178 WARNING nova.virt.libvirt.driver [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:40:10 localhost python3.9[193891]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.438 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=13288MB free_disk=387.46475982666016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.439 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.440 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.503 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.503 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.504 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.540 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.572 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:40:10 localhost 
nova_compute[187174]: 2025-12-06 09:40:10.575 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:40:10 localhost nova_compute[187174]: 2025-12-06 09:40:10.575 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:40:10 localhost python3.9[193981]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014008.9894469-1406-252042144244458/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 6 04:40:11 localhost nova_compute[187174]: 2025-12-06 09:40:11.177 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:40:12 localhost python3.9[194091]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False Dec 6 04:40:12 localhost nova_compute[187174]: 2025-12-06 09:40:12.388 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:40:13 localhost python3.9[194201]: 
ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Dec 6 04:40:14 localhost python3[194311]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False Dec 6 04:40:14 localhost python3[194311]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "343ba269c9fe0a56d7572c8ca328dbce002017c4dd4986f43667971dd03085c2",#012 "Digest": "sha256:667029e1ec7e63fffa1a096f432f6160b441ba36df1bddc9066cbd1129b82009",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:667029e1ec7e63fffa1a096f432f6160b441ba36df1bddc9066cbd1129b82009"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-01T06:21:53.58682213Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 505175293,#012 "VirtualSize": 505175293,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",#012 "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",#012 "sha256:a47016624274f5ebad76019f5a2e465c1737f96caa539b36f90ab8e33592f415",#012 "sha256:38a03f5e96658211fb28e2f87c11ffad531281d1797368f48e6cd4af7ac97c0e"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": 
"/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-01T06:09:28.025707917Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025744608Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025767729Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025791379Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.02581523Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025867611Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.469442331Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:10:02.029095017Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 
'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 Dec 6 04:40:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:40:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:40:15 localhost podman[194362]: 2025-12-06 09:40:15.065967946 +0000 UTC m=+0.110880421 container remove b5fdf2732cc41661ecf0a85097225437e7e9f0ff2dc12e4b2c3180420c30ab2d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52b849225b4338d04445dda705a9a8bc'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/ipa/ca.crt:/etc/ipa/ca.crt:ro', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, container_name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team) Dec 6 04:40:15 localhost python3[194311]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute Dec 6 04:40:15 localhost podman[194375]: 2025-12-06 09:40:15.129220205 +0000 UTC m=+0.089217868 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true) Dec 6 04:40:15 localhost podman[194375]: 2025-12-06 09:40:15.172219278 +0000 UTC m=+0.132217001 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 04:40:15 localhost systemd[1]: tmp-crun.3F5w7W.mount: Deactivated successfully. Dec 6 04:40:15 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:40:15 localhost podman[194374]: 2025-12-06 09:40:15.187752514 +0000 UTC m=+0.148423659 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:40:15 localhost podman[194374]: 2025-12-06 09:40:15.197780494 +0000 UTC m=+0.158451669 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:40:15 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:40:15 localhost podman[194398]: Dec 6 04:40:15 localhost podman[194398]: 2025-12-06 09:40:15.236814151 +0000 UTC m=+0.148151661 container create e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 04:40:15 localhost podman[194398]: 2025-12-06 09:40:15.192291139 +0000 UTC m=+0.103628689 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified Dec 6 04:40:15 localhost python3[194311]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start Dec 6 04:40:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26956 DF PROTO=TCP SPT=40440 DPT=9101 SEQ=96384228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB6CF870000000001030307) Dec 6 04:40:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53764 DF PROTO=TCP SPT=33636 DPT=9105 SEQ=1496433788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB6D0C50000000001030307) Dec 6 04:40:16 localhost nova_compute[187174]: 2025-12-06 09:40:16.206 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:40:16 localhost python3.9[194565]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:40:17 localhost nova_compute[187174]: 2025-12-06 09:40:17.438 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:40:17 localhost python3.9[194677]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:40:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 04:40:18 localhost podman[194787]: 2025-12-06 09:40:18.273809184 +0000 UTC m=+0.097320278 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent) Dec 6 04:40:18 localhost podman[194787]: 2025-12-06 09:40:18.283816613 +0000 UTC m=+0.107327677 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:40:18 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:40:18 localhost python3.9[194786]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014017.5389152-1598-246259326453486/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:40:19 localhost python3.9[194860]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:40:19 localhost systemd[1]: Reloading. Dec 6 04:40:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53766 DF PROTO=TCP SPT=33636 DPT=9105 SEQ=1496433788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB6DCC80000000001030307) Dec 6 04:40:19 localhost systemd-rc-local-generator[194882]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:40:19 localhost systemd-sysv-generator[194889]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:40:19 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:19 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:19 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:19 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:40:19 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:19 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:19 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:19 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:20 localhost python3.9[194950]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:40:20 localhost systemd[1]: Reloading. Dec 6 04:40:20 localhost systemd-rc-local-generator[194981]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:40:20 localhost systemd-sysv-generator[194985]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:40:20 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:20 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:20 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:20 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:40:20 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:20 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:20 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:20 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:20 localhost systemd[1]: Starting ceilometer_agent_compute container... Dec 6 04:40:20 localhost systemd[1]: tmp-crun.wSSTlM.mount: Deactivated successfully. Dec 6 04:40:20 localhost systemd[1]: Started libcrun container. 
Dec 6 04:40:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aaba909806d03cf7de7e4e079b2d587536c445d5dafabe2cc4dd867f057d6cfe/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff) Dec 6 04:40:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aaba909806d03cf7de7e4e079b2d587536c445d5dafabe2cc4dd867f057d6cfe/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff) Dec 6 04:40:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aaba909806d03cf7de7e4e079b2d587536c445d5dafabe2cc4dd867f057d6cfe/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff) Dec 6 04:40:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aaba909806d03cf7de7e4e079b2d587536c445d5dafabe2cc4dd867f057d6cfe/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff) Dec 6 04:40:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:40:20 localhost podman[194992]: 2025-12-06 09:40:20.663712399 +0000 UTC m=+0.140193926 container init e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ceilometer_agent_compute) Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: + sudo -E kolla_set_configs Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: sudo: unable to send audit message: Operation not permitted Dec 6 04:40:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:40:20 localhost podman[194992]: 2025-12-06 09:40:20.698413866 +0000 UTC m=+0.174895353 container start e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true) Dec 6 04:40:20 localhost podman[194992]: ceilometer_agent_compute Dec 6 04:40:20 localhost systemd[1]: Started ceilometer_agent_compute container. Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Validating config file Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Copying service configuration files Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to 
/etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: INFO:__main__:Writing out command to execute Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: ++ cat /run_command Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: + ARGS= Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: + sudo kolla_copy_cacerts Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: sudo: unable to send audit message: Operation not permitted Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: + [[ ! -n '' ]] Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: + . 
kolla_extend_start Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\''' Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: + umask 0022 Dec 6 04:40:20 localhost ceilometer_agent_compute[195008]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout Dec 6 04:40:20 localhost podman[195016]: 2025-12-06 09:40:20.799120441 +0000 UTC m=+0.094134045 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:40:20 localhost podman[195016]: 2025-12-06 09:40:20.830390689 +0000 UTC m=+0.125404333 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251125) Dec 6 04:40:20 localhost podman[195016]: unhealthy Dec 6 04:40:20 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:40:20 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Failed with result 'exit-code'. Dec 6 04:40:21 localhost nova_compute[187174]: 2025-12-06 09:40:21.245 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:40:21 localhost python3.9[195148]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 6 04:40:21 localhost systemd[1]: Stopping ceilometer_agent_compute container... Dec 6 04:40:21 localhost systemd[1]: libpod-e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.scope: Deactivated successfully. 
Dec 6 04:40:21 localhost podman[195152]: 2025-12-06 09:40:21.56370489 +0000 UTC m=+0.077599888 container died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:40:21 localhost systemd[1]: tmp-crun.CiD3gK.mount: Deactivated successfully. Dec 6 04:40:21 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.timer: Deactivated successfully. Dec 6 04:40:21 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:40:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509-userdata-shm.mount: Deactivated successfully. Dec 6 04:40:21 localhost systemd[1]: var-lib-containers-storage-overlay-aaba909806d03cf7de7e4e079b2d587536c445d5dafabe2cc4dd867f057d6cfe-merged.mount: Deactivated successfully. Dec 6 04:40:21 localhost podman[195152]: 2025-12-06 09:40:21.677771972 +0000 UTC m=+0.191666910 container cleanup e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 04:40:21 localhost podman[195152]: ceilometer_agent_compute Dec 6 04:40:21 localhost podman[195178]: 2025-12-06 09:40:21.780947066 +0000 UTC m=+0.069454269 container cleanup e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 04:40:21 localhost podman[195178]: ceilometer_agent_compute Dec 6 04:40:21 localhost systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully. Dec 6 04:40:21 localhost systemd[1]: Stopped ceilometer_agent_compute container. Dec 6 04:40:21 localhost systemd[1]: Starting ceilometer_agent_compute container... Dec 6 04:40:21 localhost systemd[1]: Started libcrun container. 
Dec 6 04:40:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aaba909806d03cf7de7e4e079b2d587536c445d5dafabe2cc4dd867f057d6cfe/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff) Dec 6 04:40:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aaba909806d03cf7de7e4e079b2d587536c445d5dafabe2cc4dd867f057d6cfe/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff) Dec 6 04:40:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aaba909806d03cf7de7e4e079b2d587536c445d5dafabe2cc4dd867f057d6cfe/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff) Dec 6 04:40:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aaba909806d03cf7de7e4e079b2d587536c445d5dafabe2cc4dd867f057d6cfe/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff) Dec 6 04:40:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:40:21 localhost podman[195191]: 2025-12-06 09:40:21.943769963 +0000 UTC m=+0.127125189 container init e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2) Dec 6 04:40:21 localhost ceilometer_agent_compute[195206]: + sudo -E kolla_set_configs Dec 6 04:40:21 localhost ceilometer_agent_compute[195206]: sudo: unable to send audit message: Operation not permitted Dec 6 04:40:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:40:21 localhost podman[195191]: 2025-12-06 09:40:21.978957036 +0000 UTC m=+0.162312232 container start e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:40:21 localhost podman[195191]: ceilometer_agent_compute Dec 6 04:40:21 localhost systemd[1]: Started ceilometer_agent_compute container. Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Validating config file Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Copying service configuration files Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 
INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: INFO:__main__:Writing out command to execute Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: ++ cat /run_command Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: + ARGS= Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: + sudo kolla_copy_cacerts Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: sudo: unable to send audit message: Operation not permitted Dec 6 04:40:22 localhost podman[195214]: 2025-12-06 09:40:22.070504589 +0000 UTC m=+0.095089337 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 
'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: + [[ ! -n '' ]] Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: + . 
kolla_extend_start Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\''' Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: + umask 0022 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout Dec 6 04:40:22 localhost podman[195214]: 2025-12-06 09:40:22.10311458 +0000 UTC m=+0.127699278 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Dec 6 04:40:22 localhost podman[195214]: unhealthy Dec 6 04:40:22 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:40:22 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Failed with result 'exit-code'. Dec 6 04:40:22 localhost nova_compute[187174]: 2025-12-06 09:40:22.478 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.765 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.765 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.765 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:40:22.765 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.766 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.766 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.766 2 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.766 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.766 2 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.766 2 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.766 2 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.766 2 DEBUG cotyledon.oslo_config_glue [-] debug = True 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.766 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.766 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.766 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.767 2 DEBUG cotyledon.oslo_config_glue [-] host = np0005548798.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.767 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.767 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.767 2 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.767 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.767 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.767 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.767 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.767 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.767 2 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.767 2 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.767 2 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] 
logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.768 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.769 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.769 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.769 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 
Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.769 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.769 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.769 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.769 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.769 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.769 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.769 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.769 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.769 2 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.770 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.770 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.770 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.770 2 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.770 2 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.770 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.770 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.770 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.770 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.770 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.770 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.771 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.771 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.771 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.771 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.771 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.771 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.771 2 DEBUG 
cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.771 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.771 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.772 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.772 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.772 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.772 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.772 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.772 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.772 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.772 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.772 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.773 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.773 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.773 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.773 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.773 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.773 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.773 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.773 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.773 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.774 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.774 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.774 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.774 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.774 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.774 2 DEBUG 
cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.774 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.774 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.774 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.774 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.774 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.775 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.775 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.775 2 DEBUG 
cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.775 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.775 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.775 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.775 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.775 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.776 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.776 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.776 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.776 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.776 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.776 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.776 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.776 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.776 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.776 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.777 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.777 2 DEBUG 
cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.777 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.777 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.777 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.777 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.777 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.777 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.777 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.778 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.778 2 DEBUG 
cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.778 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.778 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.778 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.778 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.778 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.778 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.778 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.778 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 
04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.778 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.779 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.779 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.779 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.779 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.779 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.779 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.779 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.779 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.779 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.779 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.780 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.780 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.780 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.780 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.780 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.780 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.780 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.780 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.780 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.780 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.780 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.781 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.781 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.781 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.781 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.781 2 DEBUG cotyledon.oslo_config_glue [-] 
******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.796 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']]. Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.797 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d]. Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.798 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']]. Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.875 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Dec 6 04:40:22 localhost python3.9[195346]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.946 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.946 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.946 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.946 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.946 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.946 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.946 12 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.947 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.947 12 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.947 12 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.947 12 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 
04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.947 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.947 12 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.947 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.947 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.947 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.948 12 DEBUG cotyledon.oslo_config_glue [-] host = np0005548798.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.948 12 DEBUG 
cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.948 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.948 12 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.948 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.948 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.948 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.948 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.948 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.948 12 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:40:22.949 12 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.949 12 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.949 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.949 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.949 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.949 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.949 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.949 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.949 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.949 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.949 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.950 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.950 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.950 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.950 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.950 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.950 
12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.950 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.950 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.950 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.950 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.951 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.951 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.951 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.951 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.951 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.951 12 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.951 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.951 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.951 12 DEBUG cotyledon.oslo_config_glue [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.951 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.951 12 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.952 12 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.952 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.952 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.952 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.952 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.952 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.952 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.952 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.952 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.952 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.952 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.953 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.953 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.953 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.953 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.953 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.953 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.953 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.953 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.953 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.953 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.953 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.954 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.954 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.954 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.954 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.954 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.954 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 
04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.954 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.955 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.955 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.955 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.955 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.956 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.956 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.956 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.956 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.956 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.956 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.956 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.956 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.956 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.956 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.956 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.957 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.957 12 DEBUG cotyledon.oslo_config_glue [-] 
notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.957 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.957 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.957 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.957 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.957 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.957 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.957 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.958 12 DEBUG 
cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.958 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.958 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.958 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.958 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.958 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.958 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.958 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.958 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.959 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.959 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.959 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.959 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.959 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.959 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.959 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.959 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.959 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.959 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.959 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.960 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.960 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.960 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.960 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.960 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.960 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:40:22.960 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.960 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.961 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.961 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.961 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.961 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.961 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.961 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.961 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.961 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.961 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.961 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.961 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.962 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.962 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.962 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.962 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.962 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.962 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.962 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.962 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.962 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.962 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.963 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.963 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.963 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.963 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.963 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.963 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.963 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.963 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.963 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.963 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.964 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.964 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.964 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.964 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.964 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.964 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.964 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.964 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.964 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.965 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.965 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.965 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.965 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.965 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.965 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.965 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.965 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.965 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.965 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.966 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.966 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.966 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.966 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.966 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.966 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.966 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.966 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:22.966 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.967 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.967 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.967 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.967 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.967 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.967 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.967 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:40:22.967 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.967 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.967 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.968 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.968 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.968 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.968 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.968 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.968 12 DEBUG cotyledon.oslo_config_glue [-] 
oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.968 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.968 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.968 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.969 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.969 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.971 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64 Dec 6 04:40:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:22.978 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Dec 6 04:40:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da 
MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53767 DF PROTO=TCP SPT=33636 DPT=9105 SEQ=1496433788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB6EC870000000001030307) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.421 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}6311fe04d353fd4565b23bf2c9792dd7b1cd852e9e5963f6a5b1381a1450bded" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Dec 6 04:40:23 localhost python3.9[195440]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014022.4329202-1694-256912859055811/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.597 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 329 Content-Type: application/json Date: Sat, 06 Dec 2025 09:40:23 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-a1cd597c-df27-470f-befe-440ca3a8b542 x-openstack-request-id: req-a1cd597c-df27-470f-befe-440ca3a8b542 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:40:23.597 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "7a18e612-6562-4812-b07b-d906254f72f4", "name": "m1.small", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/7a18e612-6562-4812-b07b-d906254f72f4"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/7a18e612-6562-4812-b07b-d906254f72f4"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.597 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-a1cd597c-df27-470f-befe-440ca3a8b542 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.598 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/7a18e612-6562-4812-b07b-d906254f72f4 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}6311fe04d353fd4565b23bf2c9792dd7b1cd852e9e5963f6a5b1381a1450bded" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.729 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 496 Content-Type: application/json Date: Sat, 06 Dec 2025 09:40:23 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d9185766-c6f7-4b72-ab1e-22cef8043c0e x-openstack-request-id: req-d9185766-c6f7-4b72-ab1e-22cef8043c0e _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:23.729 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "7a18e612-6562-4812-b07b-d906254f72f4", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/7a18e612-6562-4812-b07b-d906254f72f4"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/7a18e612-6562-4812-b07b-d906254f72f4"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.729 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/7a18e612-6562-4812-b07b-d906254f72f4 used request id req-d9185766-c6f7-4b72-ab1e-22cef8043c0e request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.730 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.731 12 INFO 
ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.781 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 301237008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.782 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 37411248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6082df61-dbb0-4ac0-90d1-fa339e9b5339', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 301237008, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:40:23.731497', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '969db178-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10410.94172499, 'message_signature': '6a2de1357d23d7872073542cc06d8c319234ca86eaf174e45248081798894615'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 37411248, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:40:23.731497', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '969dd072-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10410.94172499, 'message_signature': '3104c834d5ef4ed100c2c774fc6b7da2ec0393f347a4281aff0985e9e2112a0f'}]}, 'timestamp': '2025-12-06 09:40:23.783169', '_unique_id': '5fc771369016492aaa91ff251dc02d07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent 
call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.790 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:23.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.814 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31064064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.815 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b3c6a29-1a20-4055-8aad-17c3bd0568ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31064064, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:40:23.794909', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '96a2aea8-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.005197366, 'message_signature': 'b50ca322943d2126fce7ec912336229593826e06bc92eab71f4b7abdc121cf67'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:40:23.794909', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '96a2c898-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.005197366, 'message_signature': 'e11413ba92e0b659759ddeffa51e0184c94b9a5ad5d900a3182ef99afa4b1046'}]}, 'timestamp': '2025-12-06 09:40:23.815621', '_unique_id': '8c2aaba59f21472ca20dd7cf5649ce37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): 
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     yield
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.817 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.818 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.818 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.819 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e62a722-54e8-492d-bc23-1a9aa682a17e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:40:23.818499', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '96a34bb0-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.005197366, 'message_signature': '07c8b89088e4b1150fb38d9df778b521cac922aed30b52e273f1de2388d97c1c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:40:23.818499', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '96a36064-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.005197366, 'message_signature': '7394cca51693df674ec7abb165712ece6230775dba05d6a60829231253b38508'}]}, 'timestamp': '2025-12-06 09:40:23.819499', '_unique_id': '1508d1bb9f334bb080bfdeebf50c2e2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     yield
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.820 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.821 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.826 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a5070ada-6b60-4992-a1bf-9e83aaccac93 / tap227fe5b2-a5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.826 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0e4d0bf-6b2c-4af7-a069-156c263248d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:40:23.822156', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '96a49182-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.032464777, 'message_signature': 'fd854be720a1cb6447f6d4ee23638432e8cd7080c413aff4c9151842347e32f7'}]}, 'timestamp': '2025-12-06 09:40:23.827453', '_unique_id': '4a218eb3ac5a4955a77a7f55c7cff756'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     yield
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.828 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.830 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.830 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.830 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: []
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.831 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.831 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6d6180d-0d43-4db4-a876-d2258425dfb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:40:23.831241', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '96a53dc6-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.032464777, 'message_signature': 'a0ad0258d5d6010e8996557142b2362d7b6f32c295515a4cd55dc35f6822bacc'}]}, 'timestamp': '2025-12-06 09:40:23.831757', '_unique_id': 'e4e360f5c6e3406a9f2697568c48ac8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging     yield
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06
09:40:23.832 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging The above 
exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] 
Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.832 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.833 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.834 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 10762 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97476b78-bf35-411d-a13f-58f7d1e12a51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10762, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:40:23.834094', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '96a5acde-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.032464777, 'message_signature': '96ebb3b4bdf9ac732b55b1d04ba38317cd69f86823135234b0be213bbf589721'}]}, 'timestamp': '2025-12-06 09:40:23.834591', '_unique_id': '81420d42d52e499fa8b5465bdfc0f91d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging 
self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.835 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.836 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.836 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 96 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '58e6e8f8-02ba-4bcb-936f-927f6237796b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 96, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:40:23.836872', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '96a61994-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.032464777, 'message_signature': '3178dad1816b036a5e3b0b99e3940e31b7763091290aabceafbc36cc2d6ebf48'}]}, 'timestamp': '2025-12-06 09:40:23.837375', '_unique_id': '703a13f76fd9444cb717b332533ad855'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.838 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:23.839 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.839 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.840 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ec2cf3c-1dd9-4c15-a8b2-4b4ee0a6c60a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:40:23.839596', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '96a6837a-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10410.94172499, 'message_signature': '036ad1c4f81ef99e9ec8bce5e747ffd5a44cdc8674337529fe8f8cd7d1251d88'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:40:23.839596', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '96a69676-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10410.94172499, 'message_signature': 'd0a469c8d3ad4f435d943f16ac8b9acc2bf8e196b0d711bd319adbf5abdc6c5c'}]}, 'timestamp': '2025-12-06 09:40:23.840537', '_unique_id': '8bcc268b30154bbfbbbcc0eb1a0d72c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.841 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:23.842 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.842 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.843 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81d32ca5-8829-4664-ba2c-e78fb5573351', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:40:23.842871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '96a703ae-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10410.94172499, 'message_signature': '04822d607cc7c760d6525ec4a8da3a8d37f7308ec081a2ebb1e15747a7376ef6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:40:23.842871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '96a714ca-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10410.94172499, 'message_signature': 'e356adb2f4554e4e9345c75fa76ee9681ad307888546ee94f11184ed562767a5'}]}, 'timestamp': '2025-12-06 09:40:23.843762', '_unique_id': '94f2bdf9df5846d49136725f2d5495b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.845 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:23.846 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.846 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15151381-282c-429c-b80b-69f8b0f132a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 123, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:40:23.846399', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '96a78d6a-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.032464777, 'message_signature': 'c08654363b3cc9747cb31550ed129edec5fcbdcabffcc1f9a5a0d2b87ccd51c4'}]}, 'timestamp': '2025-12-06 09:40:23.846915', '_unique_id': '25a36ed95b4a447b8cca2487ccd08590'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.848 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.849 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.877 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 50440000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ff13ad8f-d12c-48a4-9659-16b1fbee2209', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50440000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:40:23.849661', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '96ac4d1e-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.087227155, 'message_signature': 'dcebd7476612f912cd63338f7cacc7423cc75c53badd7cddc3a22b175420d11a'}]}, 'timestamp': '2025-12-06 09:40:23.878033', '_unique_id': 'edbc8bbac48545ca98372654fc7de8ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.879 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.880 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:40:23.880 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.880 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.880 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.881 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.881 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '888cfbcf-22ee-4988-b10f-9f5ed3cc60cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:40:23.881017', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '96acd572-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10410.94172499, 'message_signature': '3f4912eabfb64ba1af62b79373a80bbb0ddc59a29861cd24ecbfc76201a0b131'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:40:23.881017', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '96ace5ee-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10410.94172499, 'message_signature': '68f8aca7fe5d8d43bb93cc073960a09002a886e7ece0a83a8f9a077cd493a41f'}]}, 'timestamp': '2025-12-06 09:40:23.881906', '_unique_id': '30787c96a0384389a833763ed80abd2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.882 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.883 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.884 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.884 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '27cbae77-968f-4f4b-8398-1ea6a6333250', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:40:23.884116', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '96ad4e4e-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.005197366, 'message_signature': '77528352798e0b678614b0f62f2d9c0b675898c5f919dda56b2c32d89ed31645'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:40:23.884116', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '96ad635c-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.005197366, 'message_signature': '22e689606d82debd6a337195d664021a350b0b96ea9756251a42f5a56c024cae'}]}, 'timestamp': '2025-12-06 09:40:23.885097', '_unique_id': 'f258b58e6f4f45e79d39634f0b718097'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.886 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.887 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 9699 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'be1621bd-8803-480f-b6b5-720ee6e9854f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9699, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:40:23.887284', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '96adca18-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.032464777, 'message_signature': '71613e27e6ecdb0a6b42edf2a11d8cf2e2ff518b431e5aabd44b7933a5b581a8'}]}, 'timestamp': '2025-12-06 09:40:23.887749', '_unique_id': '279f16485c904a5a9308b3193035d440'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.888 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:23.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.889 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.890 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.890 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.890 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7c7d0674-4393-4fd2-a5a2-fb6033487a6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:40:23.890501', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '96ae47fe-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.032464777, 'message_signature': '9e1032577e5021fca32d149a16b0b81b933726f88b46528f859de6dae57bdca8'}]}, 'timestamp': '2025-12-06 09:40:23.891003', '_unique_id': '899ed02592c54421b8f7aec59a459122'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.891 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:23.893 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.893 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87704f9c-2237-400a-8d65-40430fdd66db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:40:23.893163', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': 
{'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '96aeb018-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.032464777, 'message_signature': '137b0adec4b0845186b641003cf8d2b15c7d58e5c86d4778efa2b3f1e9bfc503'}]}, 'timestamp': '2025-12-06 09:40:23.893638', '_unique_id': 'b6ec8412641e4a2fab16351556d8d129'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:40:23.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.894 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.895 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.895 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7d8b8c41-9ac9-46e2-8edf-71c49f196760', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:40:23.895757', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '96af1620-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.032464777, 'message_signature': '4d68aac787db7f241e3c6c92fd4d76b7776c313b89ca3b6ee92d9e3d4b62f30f'}]}, 'timestamp': '2025-12-06 09:40:23.896249', '_unique_id': 'c4e3d8397d1e463ab4d0754b23f7f965'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.897 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.898 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.898 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6e36c18-3739-4a7d-884f-dfd61e41f6d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:40:23.898406', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '96af7cd2-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.032464777, 'message_signature': '4a730579bf34541d9f6b80609af1f7388b7126d9ede9c8d470588d5af250fc7b'}]}, 'timestamp': '2025-12-06 09:40:23.898917', '_unique_id': 'dae68adafb5d4b89a6cd4da7d3b3f0ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.899 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.901 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.901 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: []
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.901 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.901 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 73904128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.902 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd672e0e-2634-4228-834e-8674598c309b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73904128, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:40:23.901721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '96afffcc-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10410.94172499, 'message_signature': '542b1fa9b2dd3091cd2c74f75613f6ba8a05f7fe628ccae5639004a93d1cbfb6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:40:23.901721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '96b0108e-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10410.94172499, 'message_signature': '4fef6155a17d2ceb58e57187ba4eed3aa8fd63a16281ef35cd080b165f3f5499'}]}, 'timestamp': '2025-12-06 09:40:23.902632', '_unique_id': 'dd542505082a46189320e8051cdd2e56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.903 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.904 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.904 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21f07b2e-ece5-44b0-9021-c396ae87314d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:40:23.904890', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '96b079b6-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10411.087227155, 'message_signature': '193538a33dccc7276f084e385d71343beb6cae017d49d2202d4cd6619975cdd8'}]}, 'timestamp': '2025-12-06 09:40:23.905335', '_unique_id': 'c446ddfa408140e5ad506dd2655c972e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.906 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.907 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:40:23.907 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 947163713 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.907 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 9516486 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9384a784-21ba-43c2-ba75-10aefec3d32c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 947163713, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:40:23.907479', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 
'disk_name': 'vda'}, 'message_id': '96b0de88-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10410.94172499, 'message_signature': '8cb8f836098d1fcfd3c2666fa328432e189d5c0fdf4a3a3be76be0e475c194a2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9516486, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:40:23.907479', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '96b0f030-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10410.94172499, 'message_signature': '2ce33dbc821d671f97d040c451d735f7ee19c43a54fd9eb29588c4101bbc9cc2'}]}, 'timestamp': '2025-12-06 09:40:23.908386', '_unique_id': '243baaf3eb4a41e8a1c1a03463f4422f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:40:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:40:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging return 
rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( 
Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:40:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:40:23.909 12 ERROR oslo_messaging.notify.messaging Dec 6 04:40:24 localhost python3.9[195550]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json 
debug=False Dec 6 04:40:25 localhost python3.9[195660]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Dec 6 04:40:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54454 DF PROTO=TCP SPT=42944 DPT=9102 SEQ=2642433677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB6F5870000000001030307) Dec 6 04:40:26 localhost python3[195770]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False Dec 6 04:40:26 localhost nova_compute[187174]: 2025-12-06 09:40:26.277 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:40:26 localhost podman[195805]: Dec 6 04:40:26 localhost podman[195805]: 2025-12-06 09:40:26.462559841 +0000 UTC m=+0.071898097 container create 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', 
'--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors , config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible) Dec 6 04:40:26 localhost podman[195805]: 2025-12-06 09:40:26.424895009 +0000 UTC m=+0.034233335 image pull quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c Dec 6 04:40:26 localhost python3[195770]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', 
'--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl Dec 6 04:40:27 localhost python3.9[195951]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:40:27 localhost nova_compute[187174]: 2025-12-06 09:40:27.520 187178 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:40:28 localhost python3.9[196063]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:40:28 localhost python3.9[196172]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014028.1053488-1853-48753946025156/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:40:29 localhost python3.9[196227]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:40:29 localhost systemd[1]: Reloading. Dec 6 04:40:29 localhost systemd-rc-local-generator[196249]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:40:29 localhost systemd-sysv-generator[196255]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 04:40:29 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:29 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:29 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:29 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:40:29 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:29 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:29 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:29 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:30 localhost python3.9[196318]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:40:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27615 DF PROTO=TCP SPT=53748 DPT=9101 SEQ=665186327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB707970000000001030307) Dec 6 04:40:30 localhost systemd[1]: Reloading. 
Dec 6 04:40:30 localhost systemd-rc-local-generator[196342]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:40:30 localhost systemd-sysv-generator[196349]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:40:30 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:30 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:30 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:30 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:40:30 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:30 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:30 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:30 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:40:30 localhost systemd[1]: Starting node_exporter container... Dec 6 04:40:30 localhost systemd[1]: Started libcrun container. 
Dec 6 04:40:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98f179d3d6d4fea2f6e4b4d2224d1ffbf1b8064c0989fa2f62cd74b073565823/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff) Dec 6 04:40:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98f179d3d6d4fea2f6e4b4d2224d1ffbf1b8064c0989fa2f62cd74b073565823/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff) Dec 6 04:40:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:40:30 localhost podman[196359]: 2025-12-06 09:40:30.700769321 +0000 UTC m=+0.151042293 container init 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 04:40:30 localhost node_exporter[196372]: ts=2025-12-06T09:40:30.716Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)" Dec 6 04:40:30 localhost node_exporter[196372]: ts=2025-12-06T09:40:30.716Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)" Dec 6 04:40:30 localhost node_exporter[196372]: ts=2025-12-06T09:40:30.716Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required." Dec 6 04:40:30 localhost node_exporter[196372]: ts=2025-12-06T09:40:30.717Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$ Dec 6 04:42:07 localhost systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully. Dec 6 04:42:07 localhost systemd[1]: var-lib-containers-storage-overlay-4fbb5bd577b683508ca5a4bf3a3d7e7267a27d2ecf2e71900776c8f2f269256e-merged.mount: Deactivated successfully. Dec 6 04:42:07 localhost systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'. Dec 6 04:42:07 localhost systemd[1]: Stopped openstack_network_exporter container. Dec 6 04:42:07 localhost systemd[1]: Starting openstack_network_exporter container... 
Dec 6 04:42:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1508 DF PROTO=TCP SPT=48478 DPT=9101 SEQ=4008022964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB883070000000001030307) Dec 6 04:42:07 localhost rsyslogd[759]: imjournal: 825 messages lost due to rate-limiting (20000 allowed within 600 seconds) Dec 6 04:42:07 localhost systemd[1]: Started libcrun container. Dec 6 04:42:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dc5f1dbdee87b2adad7fc0eed4a1ba6cf7b3a27fe7a7bcddaf1fc637f8d71b0/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Dec 6 04:42:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dc5f1dbdee87b2adad7fc0eed4a1ba6cf7b3a27fe7a7bcddaf1fc637f8d71b0/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff) Dec 6 04:42:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dc5f1dbdee87b2adad7fc0eed4a1ba6cf7b3a27fe7a7bcddaf1fc637f8d71b0/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff) Dec 6 04:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:42:07 localhost podman[199737]: 2025-12-06 09:42:07.800281996 +0000 UTC m=+0.552780424 container init 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:42:07 localhost systemd[1]: var-lib-containers-storage-overlay-4fbb5bd577b683508ca5a4bf3a3d7e7267a27d2ecf2e71900776c8f2f269256e-merged.mount: Deactivated successfully. 
Dec 6 04:42:07 localhost openstack_network_exporter[199751]: INFO 09:42:07 main.go:48: registering *bridge.Collector Dec 6 04:42:07 localhost openstack_network_exporter[199751]: INFO 09:42:07 main.go:48: registering *coverage.Collector Dec 6 04:42:07 localhost openstack_network_exporter[199751]: INFO 09:42:07 main.go:48: registering *datapath.Collector Dec 6 04:42:07 localhost openstack_network_exporter[199751]: INFO 09:42:07 main.go:48: registering *iface.Collector Dec 6 04:42:07 localhost openstack_network_exporter[199751]: INFO 09:42:07 main.go:48: registering *memory.Collector Dec 6 04:42:07 localhost openstack_network_exporter[199751]: INFO 09:42:07 main.go:48: registering *ovnnorthd.Collector Dec 6 04:42:07 localhost openstack_network_exporter[199751]: INFO 09:42:07 main.go:48: registering *ovn.Collector Dec 6 04:42:07 localhost openstack_network_exporter[199751]: INFO 09:42:07 main.go:48: registering *ovsdbserver.Collector Dec 6 04:42:07 localhost openstack_network_exporter[199751]: INFO 09:42:07 main.go:48: registering *pmd_perf.Collector Dec 6 04:42:07 localhost openstack_network_exporter[199751]: INFO 09:42:07 main.go:48: registering *pmd_rxq.Collector Dec 6 04:42:07 localhost openstack_network_exporter[199751]: INFO 09:42:07 main.go:48: registering *vswitch.Collector Dec 6 04:42:07 localhost openstack_network_exporter[199751]: NOTICE 09:42:07 main.go:76: listening on https://:9105/metrics Dec 6 04:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:42:07 localhost podman[199737]: 2025-12-06 09:42:07.845922955 +0000 UTC m=+0.598421323 container start 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64) Dec 6 04:42:07 localhost podman[199737]: openstack_network_exporter Dec 6 04:42:07 localhost systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully. 
Dec 6 04:42:08 localhost systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully. Dec 6 04:42:08 localhost systemd[1]: Started openstack_network_exporter container. Dec 6 04:42:08 localhost podman[199761]: 2025-12-06 09:42:08.137019479 +0000 UTC m=+0.282797267 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.) 
Dec 6 04:42:08 localhost podman[199761]: 2025-12-06 09:42:08.154164973 +0000 UTC m=+0.299942801 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 6 04:42:08 localhost nova_compute[187174]: 2025-12-06 09:42:08.210 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:10 localhost systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully. Dec 6 04:42:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:42:10 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 04:42:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62920 DF PROTO=TCP SPT=43156 DPT=9100 SEQ=541471705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB88E470000000001030307) Dec 6 04:42:10 localhost podman[199798]: 2025-12-06 09:42:10.246574484 +0000 UTC m=+0.202855830 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, 
container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:42:10 localhost podman[199798]: 2025-12-06 09:42:10.281285524 +0000 UTC m=+0.237566900 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 04:42:11 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:42:11 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully. Dec 6 04:42:11 localhost nova_compute[187174]: 2025-12-06 09:42:11.791 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:12 localhost systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully. Dec 6 04:42:12 localhost systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully. Dec 6 04:42:12 localhost systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully. Dec 6 04:42:12 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:42:13 localhost nova_compute[187174]: 2025-12-06 09:42:13.215 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:13 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:13 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:13 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:14 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. 
Dec 6 04:42:14 localhost systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully. Dec 6 04:42:14 localhost systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully. Dec 6 04:42:14 localhost python3.9[199913]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Dec 6 04:42:15 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully. Dec 6 04:42:15 localhost systemd[1]: var-lib-containers-storage-overlay-e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897-merged.mount: Deactivated successfully. Dec 6 04:42:15 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:15 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully. Dec 6 04:42:15 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Dec 6 04:42:15 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. 
Dec 6 04:42:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1509 DF PROTO=TCP SPT=48478 DPT=9101 SEQ=4008022964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB8A3870000000001030307) Dec 6 04:42:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31045 DF PROTO=TCP SPT=54314 DPT=9105 SEQ=2626082588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB8A5850000000001030307) Dec 6 04:42:16 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:16 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:16 localhost systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully. 
Dec 6 04:42:16 localhost nova_compute[187174]: 2025-12-06 09:42:16.794 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:16 localhost nova_compute[187174]: 2025-12-06 09:42:16.806 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:42:16 localhost nova_compute[187174]: 2025-12-06 09:42:16.807 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:42:16 localhost nova_compute[187174]: 2025-12-06 09:42:16.807 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:42:16 localhost nova_compute[187174]: 2025-12-06 09:42:16.807 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:42:17 localhost nova_compute[187174]: 2025-12-06 09:42:17.095 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:42:17 localhost nova_compute[187174]: 2025-12-06 09:42:17.095 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquired lock 
"refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:42:17 localhost nova_compute[187174]: 2025-12-06 09:42:17.096 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:42:17 localhost nova_compute[187174]: 2025-12-06 09:42:17.096 187178 DEBUG nova.objects.instance [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:42:17 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:17 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully. Dec 6 04:42:17 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Dec 6 04:42:17 localhost systemd[1]: var-lib-containers-storage-overlay-bdaa822da5273b8b58f454c368ddec1badb6f06929e8c5917413151ec2935f51-merged.mount: Deactivated successfully. Dec 6 04:42:17 localhost systemd[1]: var-lib-containers-storage-overlay-bdaa822da5273b8b58f454c368ddec1badb6f06929e8c5917413151ec2935f51-merged.mount: Deactivated successfully. 
Dec 6 04:42:18 localhost nova_compute[187174]: 2025-12-06 09:42:18.257 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:18 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:18 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. Dec 6 04:42:18 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. Dec 6 04:42:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:19 localhost systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully. 
Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.235 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.249 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.249 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for 
instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.249 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.249 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.249 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.250 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.250 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.250 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.250 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.250 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:42:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31047 DF PROTO=TCP SPT=54314 DPT=9105 SEQ=2626082588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB8B1880000000001030307) Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.267 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.267 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.267 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.267 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.315 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.353 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.354 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.443 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.444 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:42:19 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.511 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.511 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.563 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.716 187178 WARNING nova.virt.libvirt.driver [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.718 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12904MB free_disk=387.2917594909668GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.719 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.720 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:42:19 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.822 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.823 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.824 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:42:19 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.889 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:42:19 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.905 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.908 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:42:19 localhost nova_compute[187174]: 2025-12-06 09:42:19.908 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:42:19 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:19 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:19 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:42:19 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:19 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:20 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully. Dec 6 04:42:20 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:20 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:20 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:20 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:20 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:20 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:42:21 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. 
Dec 6 04:42:21 localhost systemd[1]: var-lib-containers-storage-overlay-dae190d40250f7df03793838b96be5e7fd6c282a4757e117b727e7855041c6b0-merged.mount: Deactivated successfully. Dec 6 04:42:21 localhost podman[199943]: 2025-12-06 09:42:21.317092734 +0000 UTC m=+0.094026530 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true) Dec 6 04:42:21 localhost podman[199943]: 2025-12-06 09:42:21.349574472 +0000 UTC m=+0.126508268 
container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3) Dec 6 04:42:21 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully. 
Dec 6 04:42:21 localhost nova_compute[187174]: 2025-12-06 09:42:21.796 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:21 localhost systemd[1]: var-lib-containers-storage-overlay-e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897-merged.mount: Deactivated successfully. Dec 6 04:42:22 localhost systemd[1]: var-lib-containers-storage-overlay-e5151fde04c58d5066164edee33bce9292a01af23ce8d979eb198fe927d25897-merged.mount: Deactivated successfully. Dec 6 04:42:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:22.982 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 04:42:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:22.983 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 04:42:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:22.997 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31064064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:42:22.998 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '311fe8c9-c4c6-4bda-a406-eda56ac01583', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31064064, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:42:22.983795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ddac9bba-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.193951912, 'message_signature': '41456dac8d6434453ac09abdc6b6d77c3b74d0f2ec9759628e770a37e39b1df2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 
393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:42:22.983795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ddacaed4-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.193951912, 'message_signature': 'edaad1a88f9d40f4ec8e47513633eb8e6c631be00d488e971321266ceb68f3d4'}]}, 'timestamp': '2025-12-06 09:42:22.998722', '_unique_id': 'c55a18002d984a92960dae3f75d2270f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:42:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 
12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging 
self._ensure_connection(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.000 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.001 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.005 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '919ab880-1b21-4d03-a521-457e6de76936', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:42:23.001662', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'ddadc044-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.21179964, 'message_signature': '74adbdac1cff7c46dc0d70916a4ac5fad045f16e7748c02c0b50b8b41fc6d697'}]}, 'timestamp': '2025-12-06 09:42:23.005746', '_unique_id': '57cef487e1044427b3dfe14b882783c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:42:23.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:42:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:42:23.006 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.006 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.007 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.043 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.044 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b64f1923-6bf5-4d11-a34a-3ed3d55f84e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:42:23.008072', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ddb3acde-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.218181366, 'message_signature': '44f92e93861c8febc34c543d14904bece5066de0953712d844674058ab9cd0d4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:42:23.008072', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ddb3c048-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.218181366, 'message_signature': '78720e61120763f84acdc35855d71ac9be6a39607dfaa638e47f2a30ce99b3fd'}]}, 'timestamp': '2025-12-06 09:42:23.045085', '_unique_id': '43f89326cf694b9590a312ac977f6e59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:42:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.046 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.047 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.048 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 9699 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '306d1011-4577-44e6-861a-11e1acdd4006', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9699, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:42:23.048031', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'ddb4475c-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.21179964, 'message_signature': '41efce1b086c2bd79aa8e7ff135947a1a9435df0c432a664f1f31de1c585b8b0'}]}, 'timestamp': '2025-12-06 09:42:23.048518', '_unique_id': 'c89754520b444596a1bb655ecfdf806b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.049 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.050 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.073 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7db33e9-88f3-4411-878e-3729c946262a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:42:23.050731', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'ddb83ae2-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.283859614, 'message_signature': '60842fb1066a3c6d924fd9035edc901e22f611b6f812ba39aaf9c88a6cf70a34'}]}, 'timestamp': '2025-12-06 09:42:23.074410', '_unique_id': 'cac9d1e71c3c4ad8844948b7593dcf80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.075 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.076 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.077 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c050b37-f76f-44a2-9064-3a7178e2d143', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:42:23.077065', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'ddb8b5b2-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.21179964, 'message_signature': '4c8ba570f93c7ccfaf6da0421b2f421801656e1c7618c3930effbe7363a3da34'}]}, 'timestamp': '2025-12-06 09:42:23.077717', '_unique_id': '650a58430a1c43a7af8a3ee3256dbf07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.078 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.079 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.080 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.080 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '00c7ab9b-807a-44ff-bba2-75df797db353', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:42:23.080011', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ddb92808-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.218181366, 'message_signature': 'ab450877655edd977ad02e68cac3de0de5601a1358f145524453c549707c432a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:42:23.080011', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ddb9388e-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.218181366, 'message_signature': '94354962f10b5eb2c9cc27b49d1b2bf3f16503666fbb3f3920cd35716c035f67'}]}, 'timestamp': '2025-12-06 09:42:23.080904', '_unique_id': 'eaac2fc5c8294dc68334b8c0fcec1069'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:42:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.083 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.083 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.083 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '95e8214d-ad97-4124-b36b-d5f7f4d1d9a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:42:23.083169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ddb9a47c-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.193951912, 'message_signature': '90fa576e0d4338abe17f67bad744bf831805a6c5d937911973671b555c5e475b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:42:23.083169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ddb9b5f2-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.193951912, 'message_signature': 'a0c12da6662ebd8e018318e467a3bb1e5806a1a1a0813009a55496603c44ed55'}]}, 'timestamp': '2025-12-06 09:42:23.084085', '_unique_id': 'c98a038cf0df482284e744de71765bd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:42:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.085 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.086 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f39851a-24b2-483c-855a-f2d2f42c3cfa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:42:23.086338', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'ddba1f42-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.21179964, 'message_signature': 'cc3e924720abfeb1997bfd0287b52df6c54b70d1c7ec4a909d0ad24c5399c4e3'}]}, 'timestamp': '2025-12-06 09:42:23.086807', '_unique_id': 'e8affdd7e6ba440c8aa7c49e2eeab3b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.087 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.089 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 947163713 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.089 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 9516486 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8169bd2e-add2-4db1-b3ea-78aba11fd41f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 947163713, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:42:23.089000', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ddba8702-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.218181366, 'message_signature': 'bc6e9dee0373874fe95ab73bac0eff251e107e7c4a5aa9f95cb94c0c2fa694ab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9516486, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:42:23.089000', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ddba976a-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.218181366, 'message_signature': '8abc127d52e17819d445238421e52744abd55f85d7d3bef6d6554f2ae9765917'}]}, 'timestamp': '2025-12-06 09:42:23.089887', '_unique_id': '732cd29db2a94cc5865902a3ff56ea22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.090 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.092 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '390f238f-dbd3-46bd-b8ad-1f3cde33835f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:42:23.092098', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'ddbb0056-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.21179964, 'message_signature': '899d369b27f04844a5366d318beef9aeb98b936b98604e61ccd2e4d42a65792b'}]}, 'timestamp': '2025-12-06 09:42:23.092576', '_unique_id': '2e3a134be5834fce9bfad3fa9c70d3f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:42:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.093 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.094 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.094 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.095 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b993994d-f118-4d9d-965f-e3975aa0067d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:42:23.094948', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ddbb6faa-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.193951912, 'message_signature': '87a77645c24b8c24d7343cb7764272b006132834504e492a89f05c02b4586acb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:42:23.094948', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ddbb80b2-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.193951912, 'message_signature': 'eb5d49d529e492127facc9af987ff38d2622df2e8d9ca7692b6c49e76a7cf9f1'}]}, 'timestamp': '2025-12-06 09:42:23.095825', '_unique_id': '62dc796ff15849ea851d6db39376cf18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:42:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.096 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.098 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '77c79216-e79b-49c9-b8b7-d5cb813dd952', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:42:23.098030', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'ddbbe85e-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.21179964, 'message_signature': '6f198a096ad240bda0ed28560c835ebfa825442d305b356dbbbf0fd6dd632236'}]}, 'timestamp': '2025-12-06 09:42:23.098511', '_unique_id': '47aaa90b5edc496b8e9c96f2522f8568'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.099 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f32895e-d80d-42ab-9b76-6012e1441e23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 123, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:42:23.100684', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'ddbc51ea-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.21179964, 'message_signature': '5d0fd3feb8ab0e90170f84eeeaddd6d1efaf449cd28e89c3e902f5182701bd0e'}]}, 'timestamp': '2025-12-06 09:42:23.101230', '_unique_id': 'c82c88cd49de487e97bd6e7e6cb16f31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.102 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.103 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 73904128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.104 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45219974-01d9-4c20-97dd-b7278db44383', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73904128, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:42:23.103944', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ddbccf76-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.218181366, 'message_signature': 'f5997a68b85f30e39e3dad682d28102907461cb29c0f2b7c1423aff89413f53c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:42:23.103944', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ddbce0a6-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.218181366, 'message_signature': '93aea652e837bb8cfe9d48302ac240d6530f6c86c74795535eb12a4af1b89de7'}]}, 'timestamp': '2025-12-06 09:42:23.104882', '_unique_id': '36a7701f7a70491289345b52b555f8b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.105 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.107 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e58bc1d-2720-498e-9bac-14f4263cd992', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:42:23.107230', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'ddbd50f4-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.21179964, 'message_signature': '6ea751d3c87ead50720e069a99e349976ae3f9b101659fec60f4a12b1fa8d38c'}]}, 'timestamp': '2025-12-06 09:42:23.107758', '_unique_id': '2072f3d6126c488c9488775d8b449349'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.108 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:42:23.109 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.110 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.110 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c13d7c86-8e69-47ea-bd8d-f27e3e8c179a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:42:23.110037', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ddbdbd82-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.218181366, 'message_signature': 'b51b3ea753c2e59969fa12444095a599275607b04fbd7b16ae7637868d2ce31d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:42:23.110037', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ddbdd11e-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.218181366, 'message_signature': '5ef797742bb6c7916ccde3a2292a7232d84bb8742b512974d961e9abe80e56f4'}]}, 'timestamp': '2025-12-06 09:42:23.111160', '_unique_id': 'ab606fe793214b5e96501bb9ee8ce1fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:42:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:42:23.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.113 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 96 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6f369ce-b2d8-40a1-b456-0d78e8a6d886', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 96, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:42:23.113540', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'ddbe44e6-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.21179964, 'message_signature': '02db77b93367dd9cae22beea67c43483d8084ad4698c7cade7b461da796e3e0e'}]}, 'timestamp': '2025-12-06 09:42:23.113976', '_unique_id': 'de893b7fe18a45cf83c3ea07b6f5927b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.114 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.115 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 301237008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.115 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 37411248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57ccaaab-7ce3-4037-916f-474b39cff2bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 301237008, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:42:23.115445', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ddbe8ce4-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.218181366, 'message_signature': '883d2e5ef920a003e3e02d17cc4170ab5ddebd414905cb9ec390241e9f37e254'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 37411248, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:42:23.115445', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ddbe97de-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.218181366, 'message_signature': '5990815c8a2b3848ba926d00bdfbc4708a9540cb8acc49e8f3725dea5ccf3a1c'}]}, 'timestamp': '2025-12-06 09:42:23.116004', '_unique_id': 'd4c4ecb782934f63b6f6ed0fa5213d74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.116 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.117 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 51460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13cd42dc-b602-486f-ae9a-2447d83e0f44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 51460000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:42:23.117341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'ddbed6c2-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.283859614, 'message_signature': 'e3f079ad292b1ab8e1770683ce03538da6eb7f09db49837a5a764da6a170b6bb'}]}, 'timestamp': '2025-12-06 09:42:23.117621', '_unique_id': '1a1c384b423b43d0a6a023e020544a2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.118 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 10762 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c08e6b84-63bb-46d7-bcfe-32443111c8eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10762, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:42:23.118960', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'ddbf15f6-d287-11f0-8fed-fa163edf398d', 'monotonic_time': 10530.21179964, 'message_signature': '3ffc2fd52e7426afe9b444bb2fd79bbf264f5f9f223c9393c16cb52b47b2bd3f'}]}, 'timestamp': '2025-12-06 09:42:23.119250', '_unique_id': '2f10ca9637694decb61837b5cff49a06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc))
from exc Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:42:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:42:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 04:42:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31048 DF PROTO=TCP SPT=54314 DPT=9105 SEQ=2626082588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB8C1470000000001030307) Dec 6 04:42:23 localhost nova_compute[187174]: 2025-12-06 09:42:23.296 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:23 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:42:23 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:42:23 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:42:23 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:42:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:42:24 localhost podman[199968]: 2025-12-06 09:42:24.314023787 +0000 UTC m=+0.094922067 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS) Dec 6 04:42:24 localhost podman[199968]: 2025-12-06 09:42:24.324474558 +0000 UTC m=+0.105372818 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 04:42:24 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:42:24 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Dec 6 04:42:24 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Dec 6 04:42:25 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:42:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:42:25 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:42:25 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:42:25 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:42:25 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:42:25 localhost podman[199988]: 2025-12-06 09:42:25.808263923 +0000 UTC m=+0.408059227 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 04:42:25 localhost podman[199988]: 2025-12-06 09:42:25.816120855 +0000 UTC m=+0.415916239 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 04:42:25 localhost podman[199988]: unhealthy Dec 6 04:42:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39840 DF PROTO=TCP SPT=52934 DPT=9102 SEQ=868325356 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB8CB880000000001030307) Dec 6 04:42:26 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:26 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:42:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:42:26 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Dec 6 04:42:26 localhost nova_compute[187174]: 2025-12-06 09:42:26.800 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:26 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:42:26 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:42:26 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Failed with result 'exit-code'. Dec 6 04:42:26 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:26 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:42:26 localhost podman[200008]: 2025-12-06 09:42:26.920305508 +0000 UTC m=+0.216824132 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Dec 6 04:42:26 localhost podman[200009]: 2025-12-06 09:42:26.987005887 +0000 UTC m=+0.277219228 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true) Dec 6 04:42:27 localhost podman[200008]: 2025-12-06 09:42:27.003774212 +0000 UTC m=+0.300292866 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:42:27 localhost podman[200009]: 2025-12-06 09:42:27.019513876 +0000 UTC m=+0.309727237 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:42:27 localhost podman[200009]: unhealthy Dec 6 04:42:27 localhost podman[197801]: time="2025-12-06T09:42:27Z" level=error msg="Getting root fs size for \"646e969fff8ba85a8249066976244d842392d9cd17bd1985b1a02ecb100e1d5e\": unmounting layer c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6: replacing mount point \"/var/lib/containers/storage/overlay/c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6/merged\": device or resource busy" Dec 6 04:42:27 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:27 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:27 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:27 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 04:42:27 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:42:27 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Failed with result 'exit-code'. Dec 6 04:42:28 localhost nova_compute[187174]: 2025-12-06 09:42:28.329 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:29 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:29 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. Dec 6 04:42:29 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. Dec 6 04:42:29 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:29 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:30 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:42:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13641 DF PROTO=TCP SPT=45846 DPT=9101 SEQ=1051330753 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB8DC570000000001030307) Dec 6 04:42:30 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:42:30 localhost systemd[1]: var-lib-containers-storage-overlay-03eafd0c985fd00ac0465379e7194dd52dd90ec7c0ceb023e7be1171171dc496-merged.mount: Deactivated successfully. Dec 6 04:42:30 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13642 DF PROTO=TCP SPT=45846 DPT=9101 SEQ=1051330753 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB8E0470000000001030307) Dec 6 04:42:31 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully. Dec 6 04:42:31 localhost systemd[1]: var-lib-containers-storage-overlay-dae190d40250f7df03793838b96be5e7fd6c282a4757e117b727e7855041c6b0-merged.mount: Deactivated successfully. Dec 6 04:42:31 localhost nova_compute[187174]: 2025-12-06 09:42:31.804 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:33 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. 
Dec 6 04:42:33 localhost nova_compute[187174]: 2025-12-06 09:42:33.367 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:33 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:42:35 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:42:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62922 DF PROTO=TCP SPT=43156 DPT=9100 SEQ=541471705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB8EF870000000001030307) Dec 6 04:42:35 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:42:36 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:42:36 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:36 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:36 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Dec 6 04:42:36 localhost sshd[200047]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:42:36 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:36 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:36 localhost nova_compute[187174]: 2025-12-06 09:42:36.809 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:37 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13644 DF PROTO=TCP SPT=45846 DPT=9101 SEQ=1051330753 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB8F8070000000001030307) Dec 6 04:42:37 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:38 localhost nova_compute[187174]: 2025-12-06 09:42:38.387 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:38 localhost systemd[1]: var-lib-containers-storage-overlay-cc845f2ee7a7ff971f466641845e17e836d3ed05d5e871254c9077d17f4dbfb0-merged.mount: Deactivated successfully. Dec 6 04:42:39 localhost systemd[1]: var-lib-containers-storage-overlay-03eafd0c985fd00ac0465379e7194dd52dd90ec7c0ceb023e7be1171171dc496-merged.mount: Deactivated successfully. 
Dec 6 04:42:39 localhost sshd[200049]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:42:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52538 DF PROTO=TCP SPT=57890 DPT=9100 SEQ=3257095065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB903480000000001030307) Dec 6 04:42:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:42:40 localhost podman[200051]: 2025-12-06 09:42:40.570630757 +0000 UTC m=+0.083242149 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, config_id=edpm, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 6 04:42:40 localhost podman[200051]: 2025-12-06 09:42:40.651212533 +0000 UTC m=+0.163823875 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-type=git, io.buildah.version=1.33.7, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 6 04:42:40 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:42:41 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Dec 6 04:42:41 localhost nova_compute[187174]: 2025-12-06 09:42:41.813 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:42 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Dec 6 04:42:42 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:42:42 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:42:42 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:42:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:42:43 localhost systemd[1]: tmp-crun.tVtK2s.mount: Deactivated successfully. Dec 6 04:42:43 localhost podman[200072]: 2025-12-06 09:42:43.03280165 +0000 UTC m=+0.064861863 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:42:43 localhost podman[200072]: 2025-12-06 09:42:43.11122051 +0000 UTC m=+0.143280713 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', 
'/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 04:42:43 localhost nova_compute[187174]: 2025-12-06 09:42:43.431 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:44 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:42:44 localhost podman[197801]: time="2025-12-06T09:42:44Z" level=error msg="Getting root fs size for \"6cd5a46fb62cf40368ef7e261dafb4763a25f4b93283502086109aa6eaed2da7\": getting diffsize of layer \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\" and its parent \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy" Dec 6 04:42:44 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:42:44 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:42:44 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:42:44 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:42:45 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:42:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13645 DF PROTO=TCP SPT=45846 DPT=9101 SEQ=1051330753 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB917870000000001030307) Dec 6 04:42:45 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:45 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:42:45 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:42:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55401 DF PROTO=TCP SPT=41266 DPT=9105 SEQ=3561861564 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB91AB60000000001030307) Dec 6 04:42:46 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:46 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:46 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:42:46 localhost nova_compute[187174]: 2025-12-06 09:42:46.816 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:47 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:42:47 localhost systemd[1]: var-lib-containers-storage-overlay-cc845f2ee7a7ff971f466641845e17e836d3ed05d5e871254c9077d17f4dbfb0-merged.mount: Deactivated successfully. Dec 6 04:42:48 localhost nova_compute[187174]: 2025-12-06 09:42:48.465 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:48 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Dec 6 04:42:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55403 DF PROTO=TCP SPT=41266 DPT=9105 SEQ=3561861564 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB926C70000000001030307) Dec 6 04:42:50 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:42:50 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:42:50 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 6 04:42:51 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. 
Dec 6 04:42:51 localhost nova_compute[187174]: 2025-12-06 09:42:51.820 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:51 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 6 04:42:52 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:42:52 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:42:52 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:42:53 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:42:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19171 DF PROTO=TCP SPT=35130 DPT=9882 SEQ=2879590644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB9367E0000000001030307) Dec 6 04:42:53 localhost nova_compute[187174]: 2025-12-06 09:42:53.493 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:42:53 localhost podman[200093]: 2025-12-06 09:42:53.785288941 +0000 UTC m=+0.074010914 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 6 04:42:53 localhost podman[200093]: 2025-12-06 09:42:53.848126511 +0000 UTC m=+0.136848464 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 04:42:53 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:42:54 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:54 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Dec 6 04:42:54 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:42:54 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:42:54 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:54 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:55 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:42:55 localhost podman[197801]: time="2025-12-06T09:42:55Z" level=error msg="Getting root fs size for \"8b6efb2dedcefdb9e8eb671c1e93eab618330fff0451f53befd0865de6e247b4\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy" Dec 6 04:42:55 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:42:55 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:42:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3633 DF PROTO=TCP SPT=38164 DPT=9102 SEQ=1095399628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB93F870000000001030307) Dec 6 04:42:55 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:42:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:42:56 localhost systemd[1]: tmp-crun.snjAa3.mount: Deactivated successfully. Dec 6 04:42:56 localhost podman[200117]: 2025-12-06 09:42:56.154886961 +0000 UTC m=+0.191672540 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 04:42:56 localhost podman[200117]: 2025-12-06 09:42:56.164672811 +0000 UTC m=+0.201458360 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:42:56 localhost nova_compute[187174]: 2025-12-06 09:42:56.824 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:57 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 6 04:42:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:42:57 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Dec 6 04:42:57 localhost systemd[1]: var-lib-containers-storage-overlay-79368befea3ff2e81adcb0e0c13630267794aa8ed0c8a0948014dcb8ce0c10ed-merged.mount: Deactivated successfully. Dec 6 04:42:57 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:42:57 localhost podman[200136]: 2025-12-06 09:42:57.594575732 +0000 UTC m=+0.231266147 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:42:57 localhost podman[200136]: 2025-12-06 09:42:57.601339589 +0000 UTC m=+0.238030014 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 
'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 04:42:57 localhost podman[200136]: unhealthy Dec 6 04:42:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:42:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:42:58 localhost nova_compute[187174]: 2025-12-06 09:42:58.542 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:42:59 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:42:59 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 6 04:42:59 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. 
Dec 6 04:43:00 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Main process exited, code=exited, status=1/FAILURE
Dec 6 04:43:00 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Failed with result 'exit-code'.
Dec 6 04:43:00 localhost podman[197801]: time="2025-12-06T09:43:00Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/merged: invalid argument"
Dec 6 04:43:00 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:43:00 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:43:00 localhost podman[197801]: time="2025-12-06T09:43:00Z" level=error msg="Getting root fs size for \"8b6efb2dedcefdb9e8eb671c1e93eab618330fff0451f53befd0865de6e247b4\": getting diffsize of layer \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\" and its parent \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\": creating overlay mount to /var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/JIM3S5SH6VMTD2C3EPGVC6HA2S:/var/lib/containers/storage/overlay/l/MJEKI3IZCSDR5TOD2EKXV2PFCH:/var/lib/containers/storage/overlay/l/CZL7EU6WJ52LL66WWRTIXRNAMJ,upperdir=/var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/diff,workdir=/var/lib/containers/storage/overlay/f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958/work,nodev,metacopy=on\": no such file or directory"
Dec 6 04:43:00 localhost podman[200157]: 2025-12-06 09:43:00.117896384 +0000 UTC m=+2.495203190 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 04:43:00 localhost podman[200157]: 2025-12-06 09:43:00.12331461 +0000 UTC m=+2.500621486 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 04:43:00 localhost podman[200158]: 2025-12-06 09:43:00.181483557 +0000 UTC m=+2.553959774 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Dec 6 04:43:00 localhost podman[200158]: 2025-12-06 09:43:00.214213783 +0000 UTC m=+2.586690020 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 6 04:43:00 localhost podman[200158]: unhealthy
Dec 6 04:43:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24675 DF PROTO=TCP SPT=60322 DPT=9101 SEQ=2975566475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB951880000000001030307)
Dec 6 04:43:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24676 DF PROTO=TCP SPT=60322 DPT=9101 SEQ=2975566475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB955880000000001030307)
Dec 6 04:43:01 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:43:01 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 6 04:43:01 localhost nova_compute[187174]: 2025-12-06 09:43:01.827 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:43:02 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 6 04:43:02 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 6 04:43:02 localhost systemd[1]: var-lib-containers-storage-overlay-79368befea3ff2e81adcb0e0c13630267794aa8ed0c8a0948014dcb8ce0c10ed-merged.mount: Deactivated successfully.
Dec 6 04:43:02 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully.
Dec 6 04:43:02 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Main process exited, code=exited, status=1/FAILURE
Dec 6 04:43:02 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Failed with result 'exit-code'.
Dec 6 04:43:03 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:43:03 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:43:03 localhost nova_compute[187174]: 2025-12-06 09:43:03.582 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:43:03 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:43:04 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 6 04:43:04 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:43:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1511 DF PROTO=TCP SPT=48478 DPT=9101 SEQ=4008022964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB961870000000001030307)
Dec 6 04:43:04 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:43:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:43:06.662 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 6 04:43:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:43:06.663 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 6 04:43:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:43:06.664 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 6 04:43:06 localhost nova_compute[187174]: 2025-12-06 09:43:06.830 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:43:06 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 6 04:43:07 localhost systemd[1]: var-lib-containers-storage-overlay-9161727dcb4f67fce7b939133d0105fd4670b62d026b503696c1aa11636dba26-merged.mount: Deactivated successfully.
Dec 6 04:43:07 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:43:07 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:43:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24678 DF PROTO=TCP SPT=60322 DPT=9101 SEQ=2975566475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB96D470000000001030307)
Dec 6 04:43:08 localhost nova_compute[187174]: 2025-12-06 09:43:08.634 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:43:08 localhost nova_compute[187174]: 2025-12-06 09:43:08.973 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:43:08 localhost nova_compute[187174]: 2025-12-06 09:43:08.974 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:43:08 localhost nova_compute[187174]: 2025-12-06 09:43:08.994 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:43:08 localhost nova_compute[187174]: 2025-12-06 09:43:08.995 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 6 04:43:08 localhost nova_compute[187174]: 2025-12-06 09:43:08.995 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 6 04:43:09 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 6 04:43:09 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 6 04:43:09 localhost podman[197801]: time="2025-12-06T09:43:09Z" level=error msg="Getting root fs size for \"8d3e04599112b38d3bb396a64c7516c7cdbbc6f9949c0d498f108ac085dc82d3\": getting diffsize of layer \"3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34\" and its parent \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\": unmounting layer 3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34: replacing mount point \"/var/lib/containers/storage/overlay/3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34/merged\": no such file or directory"
Dec 6 04:43:09 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:43:10 localhost nova_compute[187174]: 2025-12-06 09:43:10.081 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 6 04:43:10 localhost nova_compute[187174]: 2025-12-06 09:43:10.082 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 6 04:43:10 localhost nova_compute[187174]: 2025-12-06 09:43:10.082 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 6 04:43:10 localhost nova_compute[187174]: 2025-12-06 09:43:10.082 187178 DEBUG nova.objects.instance [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 6 04:43:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54375 DF PROTO=TCP SPT=51128 DPT=9100 SEQ=1383299096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB978870000000001030307)
Dec 6 04:43:11 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:43:11 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 6 04:43:11 localhost nova_compute[187174]: 2025-12-06 09:43:11.833 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:43:11 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 6 04:43:12 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 6 04:43:12 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:43:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.
Dec 6 04:43:12 localhost podman[200192]: 2025-12-06 09:43:12.551830813 +0000 UTC m=+0.071631923 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, container_name=openstack_network_exporter, version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible)
Dec 6 04:43:12 localhost podman[200192]: 2025-12-06 09:43:12.565493202 +0000 UTC m=+0.085294322 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, version=9.6)
Dec 6 04:43:12 localhost systemd[1]: var-lib-containers-storage-overlay-9161727dcb4f67fce7b939133d0105fd4670b62d026b503696c1aa11636dba26-merged.mount: Deactivated successfully.
Dec 6 04:43:12 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:43:12 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:43:13 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:43:13 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully.
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.284 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.302 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.302 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.303 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.303 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.304 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.304 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.304 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.305 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.305 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.305 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.323 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.323 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.324 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.324 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - -
- - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.378 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.450 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.452 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.505 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.506 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.552 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.553 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.626 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.673 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:13 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.879 187178 WARNING nova.virt.libvirt.driver [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.881 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12876MB free_disk=387.2917900085449GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.887 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.888 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:43:13 localhost systemd[1]: 
var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.961 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.962 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:43:13 localhost nova_compute[187174]: 2025-12-06 09:43:13.962 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:43:14 localhost nova_compute[187174]: 2025-12-06 09:43:14.005 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:43:14 localhost nova_compute[187174]: 2025-12-06 09:43:14.019 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: 
{'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:43:14 localhost nova_compute[187174]: 2025-12-06 09:43:14.022 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:43:14 localhost nova_compute[187174]: 2025-12-06 09:43:14.022 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:43:14 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:43:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:43:14 localhost podman[200225]: 2025-12-06 09:43:14.809211895 +0000 UTC m=+0.082041522 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 04:43:14 localhost podman[200225]: 2025-12-06 09:43:14.843189978 +0000 UTC m=+0.116019575 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:43:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24679 DF PROTO=TCP SPT=60322 DPT=9101 SEQ=2975566475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB98D870000000001030307) Dec 6 04:43:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44861 DF PROTO=TCP SPT=34632 DPT=9105 SEQ=630156572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB98FE50000000001030307) Dec 6 04:43:16 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:43:16 localhost systemd[1]: var-lib-containers-storage-overlay-997e2c3550cc2daefd91a60e0708d42688d15d64ba54f50063ff752fcfae8f2e-merged.mount: Deactivated successfully. Dec 6 04:43:16 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:43:16 localhost nova_compute[187174]: 2025-12-06 09:43:16.836 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:17 localhost systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully. Dec 6 04:43:17 localhost systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully. Dec 6 04:43:18 localhost nova_compute[187174]: 2025-12-06 09:43:18.706 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:18 localhost systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully. Dec 6 04:43:18 localhost systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully. 
Dec 6 04:43:18 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:43:19 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:43:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44863 DF PROTO=TCP SPT=34632 DPT=9105 SEQ=630156572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB99C070000000001030307) Dec 6 04:43:19 localhost systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully. Dec 6 04:43:20 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. Dec 6 04:43:20 localhost systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully. Dec 6 04:43:20 localhost systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully. Dec 6 04:43:20 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:43:21 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Dec 6 04:43:21 localhost nova_compute[187174]: 2025-12-06 09:43:21.838 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:21 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:43:21 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. Dec 6 04:43:22 localhost systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully. Dec 6 04:43:22 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully. Dec 6 04:43:22 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully. Dec 6 04:43:22 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:43:23 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:43:23 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:43:23 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:43:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30749 DF PROTO=TCP SPT=51754 DPT=9882 SEQ=3839693934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB9ABB60000000001030307) Dec 6 04:43:23 localhost nova_compute[187174]: 2025-12-06 09:43:23.736 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:43:24 localhost systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully. Dec 6 04:43:24 localhost podman[200250]: 2025-12-06 09:43:24.546088012 +0000 UTC m=+0.083678701 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller) Dec 6 04:43:24 localhost systemd[1]: var-lib-containers-storage-overlay-921081be0d3584208f178cb345f2615cfcd1617609a73e0727b158d9f013eee4-merged.mount: Deactivated successfully. Dec 6 04:43:24 localhost podman[200250]: 2025-12-06 09:43:24.610308248 +0000 UTC m=+0.147898967 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 04:43:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19176 DF PROTO=TCP SPT=35130 DPT=9882 SEQ=2879590644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB9B3870000000001030307) Dec 6 04:43:25 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:43:26 localhost systemd[1]: var-lib-containers-storage-overlay-997e2c3550cc2daefd91a60e0708d42688d15d64ba54f50063ff752fcfae8f2e-merged.mount: Deactivated successfully. Dec 6 04:43:26 localhost systemd[1]: var-lib-containers-storage-overlay-997e2c3550cc2daefd91a60e0708d42688d15d64ba54f50063ff752fcfae8f2e-merged.mount: Deactivated successfully. Dec 6 04:43:26 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:43:26 localhost nova_compute[187174]: 2025-12-06 09:43:26.840 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:26 localhost systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully. Dec 6 04:43:26 localhost systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully. Dec 6 04:43:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:43:28 localhost systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully. Dec 6 04:43:28 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:43:28 localhost systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully. 
Dec 6 04:43:28 localhost podman[200273]: 2025-12-06 09:43:28.551759776 +0000 UTC m=+0.084002761 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 04:43:28 localhost podman[200273]: 2025-12-06 09:43:28.56436541 +0000 UTC m=+0.096608385 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd) Dec 6 04:43:28 localhost nova_compute[187174]: 2025-12-06 09:43:28.783 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:28 localhost systemd[1]: 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:43:29 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:43:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30604 DF PROTO=TCP SPT=39454 DPT=9101 SEQ=4243264164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB9C6B80000000001030307) Dec 6 04:43:30 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. Dec 6 04:43:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:43:30 localhost systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully. 
Dec 6 04:43:30 localhost podman[200292]: 2025-12-06 09:43:30.359268887 +0000 UTC m=+0.086253120 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:43:30 localhost podman[200292]: 2025-12-06 09:43:30.368135687 +0000 UTC m=+0.095119980 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 
'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:43:30 localhost podman[200292]: unhealthy Dec 6 04:43:30 localhost systemd[1]: tmp-crun.RXaHJ5.mount: Deactivated successfully. Dec 6 04:43:30 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:43:30 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:43:30 localhost systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully. Dec 6 04:43:30 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:43:30 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:43:30 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Failed with result 'exit-code'. 
Dec 6 04:43:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30605 DF PROTO=TCP SPT=39454 DPT=9101 SEQ=4243264164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB9CAC70000000001030307) Dec 6 04:43:31 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully. Dec 6 04:43:31 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. Dec 6 04:43:31 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. Dec 6 04:43:31 localhost nova_compute[187174]: 2025-12-06 09:43:31.841 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:32 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:43:32 localhost systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully. Dec 6 04:43:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:43:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:43:32 localhost systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully. 
Dec 6 04:43:32 localhost podman[200312]: 2025-12-06 09:43:32.801993964 +0000 UTC m=+0.067193539 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 04:43:32 localhost podman[200313]: 2025-12-06 09:43:32.833504964 +0000 UTC m=+0.099111972 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:43:32 localhost podman[200313]: 2025-12-06 09:43:32.866200851 +0000 UTC m=+0.131807899 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 6 04:43:32 localhost podman[200313]: unhealthy Dec 6 04:43:32 localhost podman[200312]: 2025-12-06 09:43:32.888137628 +0000 UTC m=+0.153337243 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 6 04:43:33 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:43:33 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:43:33 localhost nova_compute[187174]: 2025-12-06 09:43:33.822 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:34 localhost systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully. Dec 6 04:43:34 localhost systemd[1]: var-lib-containers-storage-overlay-921081be0d3584208f178cb345f2615cfcd1617609a73e0727b158d9f013eee4-merged.mount: Deactivated successfully. Dec 6 04:43:34 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:43:34 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Failed with result 'exit-code'. 
Dec 6 04:43:34 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:43:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54377 DF PROTO=TCP SPT=51128 DPT=9100 SEQ=1383299096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB9D9880000000001030307) Dec 6 04:43:35 localhost systemd[1]: session-41.scope: Deactivated successfully. Dec 6 04:43:35 localhost systemd[1]: session-41.scope: Consumed 1min 2.398s CPU time. Dec 6 04:43:35 localhost systemd-logind[760]: Session 41 logged out. Waiting for processes to exit. Dec 6 04:43:35 localhost systemd-logind[760]: Removed session 41. Dec 6 04:43:35 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:43:35 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:43:35 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Dec 6 04:43:36 localhost nova_compute[187174]: 2025-12-06 09:43:36.844 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30607 DF PROTO=TCP SPT=39454 DPT=9101 SEQ=4243264164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB9E2880000000001030307) Dec 6 04:43:37 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:43:37 localhost systemd[1]: var-lib-containers-storage-overlay-d8fe62f5071a990292beb53ff72998daf48ca62b4a6fc97fe2b3e5d151c0e41e-merged.mount: Deactivated successfully. Dec 6 04:43:37 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:43:37 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:43:37 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:43:37 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:43:38 localhost nova_compute[187174]: 2025-12-06 09:43:38.877 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48545 DF PROTO=TCP SPT=55298 DPT=9100 SEQ=1308209848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DB9EDC80000000001030307) Dec 6 04:43:40 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:43:40 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:43:40 localhost podman[197801]: time="2025-12-06T09:43:40Z" level=error msg="Getting root fs size for \"aa215b21825aef9b5aeeeca748da6a7fe9cf4f2dbb76a43cd859e56922330a3b\": getting diffsize of layer \"3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34\" and its parent \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\": unmounting layer 3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34: replacing mount point \"/var/lib/containers/storage/overlay/3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34/merged\": no such file or directory" Dec 6 04:43:40 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:43:40 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:43:40 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:43:41 localhost nova_compute[187174]: 2025-12-06 09:43:41.849 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:42 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:43:42 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:43:42 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:43:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:43:43 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:43:43 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:43:43 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:43:43 localhost podman[200347]: 2025-12-06 09:43:43.294462221 +0000 UTC m=+0.080292488 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, 
url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public) Dec 6 04:43:43 localhost podman[200347]: 2025-12-06 09:43:43.308305633 +0000 UTC m=+0.094135980 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': 
'/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=) Dec 6 04:43:43 localhost systemd[1]: var-lib-containers-storage-overlay-d8fe62f5071a990292beb53ff72998daf48ca62b4a6fc97fe2b3e5d151c0e41e-merged.mount: Deactivated successfully. Dec 6 04:43:43 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. 
Dec 6 04:43:43 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:43:43 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:43:43 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:43:43 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:43:43 localhost nova_compute[187174]: 2025-12-06 09:43:43.918 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:44 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:43:44 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:43:44 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:43:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30608 DF PROTO=TCP SPT=39454 DPT=9101 SEQ=4243264164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA03870000000001030307) Dec 6 04:43:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59865 DF PROTO=TCP SPT=51132 DPT=9105 SEQ=3576421462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA05170000000001030307) Dec 6 04:43:46 localhost nova_compute[187174]: 2025-12-06 09:43:46.852 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:47 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:43:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:43:47 localhost systemd[1]: var-lib-containers-storage-overlay-5a9b5e8002093793c8fb3c19ee661b8475f5d6d1fe6e543df8be7ee8dc3553fb-merged.mount: Deactivated successfully. 
Dec 6 04:43:47 localhost podman[200370]: 2025-12-06 09:43:47.491578409 +0000 UTC m=+0.162105821 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:43:47 localhost podman[200370]: 2025-12-06 09:43:47.501536743 +0000 UTC m=+0.172064125 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 04:43:47 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:43:48 localhost systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully. 
Dec 6 04:43:48 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully. Dec 6 04:43:48 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully. Dec 6 04:43:48 localhost nova_compute[187174]: 2025-12-06 09:43:48.968 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59867 DF PROTO=TCP SPT=51132 DPT=9105 SEQ=3576421462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA11080000000001030307) Dec 6 04:43:49 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:43:49 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully. Dec 6 04:43:50 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Dec 6 04:43:50 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:43:50 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:43:50 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully. 
Dec 6 04:43:51 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully. Dec 6 04:43:51 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:43:51 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:43:51 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:43:51 localhost nova_compute[187174]: 2025-12-06 09:43:51.854 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:52 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:43:52 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:43:52 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Dec 6 04:43:52 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:43:52 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:43:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59868 DF PROTO=TCP SPT=51132 DPT=9105 SEQ=3576421462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA20C70000000001030307) Dec 6 04:43:53 localhost systemd[1]: var-lib-containers-storage-overlay-aaa997a77e6beff130e4c220991eebd0fba637a32e5b28b914f0fc1adecb1c90-merged.mount: Deactivated successfully. Dec 6 04:43:53 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:43:53 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:43:53 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:43:53 localhost nova_compute[187174]: 2025-12-06 09:43:53.995 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:54 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 6 04:43:54 localhost systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully. Dec 6 04:43:54 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:43:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:43:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:43:54 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:43:55 localhost systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully. Dec 6 04:43:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27264 DF PROTO=TCP SPT=57038 DPT=9102 SEQ=1173704187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA2B870000000001030307) Dec 6 04:43:56 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:43:56 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 6 04:43:56 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 6 04:43:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:43:56 localhost podman[200391]: 2025-12-06 09:43:56.401090029 +0000 UTC m=+0.088025794 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2) Dec 6 04:43:56 localhost podman[200391]: 2025-12-06 09:43:56.436957442 +0000 UTC m=+0.123893217 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 04:43:56 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:43:56 localhost nova_compute[187174]: 2025-12-06 09:43:56.856 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:57 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:43:57 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:43:57 localhost systemd[1]: var-lib-containers-storage-overlay-5a9b5e8002093793c8fb3c19ee661b8475f5d6d1fe6e543df8be7ee8dc3553fb-merged.mount: Deactivated successfully. Dec 6 04:43:57 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:43:57 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:43:58 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully. Dec 6 04:43:58 localhost systemd[1]: var-lib-containers-storage-overlay-a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc-merged.mount: Deactivated successfully. Dec 6 04:43:58 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:43:58 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:43:58 localhost systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully. Dec 6 04:43:58 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Dec 6 04:43:58 localhost nova_compute[187174]: 2025-12-06 09:43:58.996 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:43:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:43:59 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully. Dec 6 04:43:59 localhost podman[200414]: 2025-12-06 09:43:59.348072143 +0000 UTC m=+0.134305934 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, 
container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:43:59 localhost podman[200414]: 2025-12-06 09:43:59.362229074 +0000 UTC m=+0.148462845 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 04:44:00 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:44:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1930 DF PROTO=TCP SPT=46902 DPT=9101 SEQ=1637980371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA3BE80000000001030307) Dec 6 04:44:00 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Dec 6 04:44:00 localhost systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully. Dec 6 04:44:00 localhost systemd[1]: var-lib-containers-storage-overlay-bedaabc8d1e6b69e205ba04f54934e85f233d1071cfe4f6fa7419a243a5303a2-merged.mount: Deactivated successfully. Dec 6 04:44:00 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Dec 6 04:44:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:44:00 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:44:01 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully. 
Dec 6 04:44:01 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:44:01 localhost podman[200431]: 2025-12-06 09:44:01.085029453 +0000 UTC m=+0.079689870 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:44:01 localhost podman[200431]: 2025-12-06 09:44:01.119241655 +0000 UTC m=+0.113902022 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 04:44:01 localhost podman[200431]: unhealthy Dec 6 04:44:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1931 DF PROTO=TCP SPT=46902 DPT=9101 SEQ=1637980371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA40080000000001030307) Dec 6 04:44:01 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Dec 6 04:44:01 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:44:01 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:44:01 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:44:01 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:44:01 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Failed with result 'exit-code'. Dec 6 04:44:01 localhost nova_compute[187174]: 2025-12-06 09:44:01.860 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:02 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:44:02 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:44:02 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:44:02 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Dec 6 04:44:03 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Dec 6 04:44:03 localhost systemd[1]: var-lib-containers-storage-overlay-aaa997a77e6beff130e4c220991eebd0fba637a32e5b28b914f0fc1adecb1c90-merged.mount: Deactivated successfully. Dec 6 04:44:03 localhost systemd[1]: var-lib-containers-storage-overlay-21122d1ceef7fa397145a1e2df0de098a59ef7bc976e6dd7526001fbdedc477d-merged.mount: Deactivated successfully. 
Dec 6 04:44:04 localhost nova_compute[187174]: 2025-12-06 09:44:04.040 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24681 DF PROTO=TCP SPT=60322 DPT=9101 SEQ=2975566475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA4B870000000001030307) Dec 6 04:44:04 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 6 04:44:04 localhost systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully. Dec 6 04:44:04 localhost systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully. Dec 6 04:44:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:44:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 04:44:04 localhost nova_compute[187174]: 2025-12-06 09:44:04.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:44:04 localhost nova_compute[187174]: 2025-12-06 09:44:04.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 6 04:44:04 localhost nova_compute[187174]: 2025-12-06 09:44:04.908 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 6 04:44:04 localhost nova_compute[187174]: 2025-12-06 09:44:04.909 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:44:04 localhost nova_compute[187174]: 2025-12-06 09:44:04.909 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 6 04:44:04 localhost nova_compute[187174]: 2025-12-06 09:44:04.925 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:44:04 localhost podman[200453]: 2025-12-06 09:44:04.968121943 +0000 UTC m=+0.089259812 container 
health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 6 04:44:05 localhost podman[200454]: 2025-12-06 
09:44:05.026057327 +0000 UTC m=+0.137336726 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 04:44:05 localhost podman[200454]: 2025-12-06 09:44:05.032163234 +0000 UTC m=+0.143442613 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 04:44:05 localhost podman[200453]: 2025-12-06 09:44:05.047918484 +0000 UTC m=+0.169056373 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:44:05 localhost podman[200453]: unhealthy Dec 6 04:44:05 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:44:05 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Dec 6 04:44:05 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:44:05 localhost nova_compute[187174]: 2025-12-06 09:44:05.938 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:44:05 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 6 04:44:06 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. 
Dec 6 04:44:06 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:44:06 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Failed with result 'exit-code'. Dec 6 04:44:06 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:44:06 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:44:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:44:06.663 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:44:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:44:06.663 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:44:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:44:06.664 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:44:06 localhost nova_compute[187174]: 2025-12-06 09:44:06.863 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:07 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:44:07 localhost podman[197801]: time="2025-12-06T09:44:07Z" level=error msg="Getting root fs size for \"b7ed8ec1275caffce048187356a5cc8a4583c3ab2c28265d13c8ca9402de3fdc\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy" Dec 6 04:44:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54378 DF PROTO=TCP SPT=51128 DPT=9100 SEQ=1383299096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA57870000000001030307) Dec 6 04:44:07 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:44:07 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. 
Dec 6 04:44:07 localhost nova_compute[187174]: 2025-12-06 09:44:07.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:44:07 localhost nova_compute[187174]: 2025-12-06 09:44:07.876 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:44:07 localhost nova_compute[187174]: 2025-12-06 09:44:07.876 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:44:08 localhost nova_compute[187174]: 2025-12-06 09:44:08.049 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:44:08 localhost nova_compute[187174]: 2025-12-06 09:44:08.050 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:44:08 localhost nova_compute[187174]: 2025-12-06 09:44:08.050 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:44:08 localhost nova_compute[187174]: 2025-12-06 09:44:08.050 187178 
DEBUG nova.objects.instance [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:44:08 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:44:08 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:44:08 localhost systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully. Dec 6 04:44:08 localhost systemd[1]: var-lib-containers-storage-overlay-bedaabc8d1e6b69e205ba04f54934e85f233d1071cfe4f6fa7419a243a5303a2-merged.mount: Deactivated successfully. Dec 6 04:44:09 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.106 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:09 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.240 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.260 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.260 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for 
instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.261 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.261 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.262 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.262 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.262 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.262 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.263 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.282 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.282 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.282 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.283 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.351 
187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.424 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.426 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.497 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.498 187178 DEBUG oslo_concurrency.processutils 
[None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.552 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.552 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.634 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:44:09 localhost systemd[1]: 
var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.817 187178 WARNING nova.virt.libvirt.driver [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.819 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12838MB free_disk=387.3035469055176GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": 
"0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.820 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.821 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.940 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.941 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:44:09 localhost nova_compute[187174]: 2025-12-06 09:44:09.941 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:44:10 localhost nova_compute[187174]: 2025-12-06 09:44:10.001 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Refreshing inventories for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 6 04:44:10 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:44:10 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Dec 6 04:44:10 localhost nova_compute[187174]: 2025-12-06 09:44:10.071 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Updating ProviderTree inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 6 04:44:10 localhost nova_compute[187174]: 2025-12-06 09:44:10.072 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 04:44:10 localhost nova_compute[187174]: 2025-12-06 09:44:10.091 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Refreshing aggregate associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 6 04:44:10 localhost nova_compute[187174]: 2025-12-06 09:44:10.119 187178 DEBUG nova.scheduler.client.report [None 
req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Refreshing trait associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 6 04:44:10 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:44:10 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:44:10 localhost nova_compute[187174]: 2025-12-06 09:44:10.164 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:44:10 localhost nova_compute[187174]: 2025-12-06 09:44:10.179 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:44:10 localhost nova_compute[187174]: 2025-12-06 09:44:10.182 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:44:10 localhost nova_compute[187174]: 2025-12-06 09:44:10.182 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.361s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 
04:44:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12973 DF PROTO=TCP SPT=60964 DPT=9100 SEQ=1771890542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA63070000000001030307) Dec 6 04:44:10 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:44:10 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:44:11 localhost systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully. Dec 6 04:44:11 localhost systemd[1]: var-lib-containers-storage-overlay-21122d1ceef7fa397145a1e2df0de098a59ef7bc976e6dd7526001fbdedc477d-merged.mount: Deactivated successfully. Dec 6 04:44:11 localhost systemd[1]: var-lib-containers-storage-overlay-21122d1ceef7fa397145a1e2df0de098a59ef7bc976e6dd7526001fbdedc477d-merged.mount: Deactivated successfully. Dec 6 04:44:11 localhost nova_compute[187174]: 2025-12-06 09:44:11.795 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:44:11 localhost nova_compute[187174]: 2025-12-06 09:44:11.865 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:12 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. 
Dec 6 04:44:12 localhost systemd[1]: var-lib-containers-storage-overlay-a9befc48c332654d0358552aae4c030fd23abbcc85f80b32354d85e375439ce4-merged.mount: Deactivated successfully. Dec 6 04:44:12 localhost systemd[1]: var-lib-containers-storage-overlay-a9befc48c332654d0358552aae4c030fd23abbcc85f80b32354d85e375439ce4-merged.mount: Deactivated successfully. Dec 6 04:44:14 localhost nova_compute[187174]: 2025-12-06 09:44:14.143 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:14 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:44:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:44:14 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. 
Dec 6 04:44:14 localhost podman[200500]: 2025-12-06 09:44:14.370722117 +0000 UTC m=+0.085436314 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, release=1755695350, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm) Dec 6 04:44:14 localhost podman[200500]: 2025-12-06 09:44:14.386258661 +0000 UTC m=+0.100972808 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Dec 6 04:44:15 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:44:15 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. 
Dec 6 04:44:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1934 DF PROTO=TCP SPT=46902 DPT=9101 SEQ=1637980371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA77870000000001030307) Dec 6 04:44:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58467 DF PROTO=TCP SPT=53406 DPT=9105 SEQ=1120045713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA7A450000000001030307) Dec 6 04:44:16 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:44:16 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:44:16 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:44:16 localhost nova_compute[187174]: 2025-12-06 09:44:16.868 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:44:17 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:44:17 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. 
Dec 6 04:44:17 localhost podman[200520]: 2025-12-06 09:44:17.840271985 +0000 UTC m=+0.121967348 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 04:44:17 localhost podman[200520]: 2025-12-06 09:44:17.846626529 +0000 UTC m=+0.128321852 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 04:44:17 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:44:18 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Dec 6 04:44:18 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:44:18 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:44:18 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:44:18 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:44:18 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:44:18 localhost podman[197801]: time="2025-12-06T09:44:18Z" level=error msg="Getting root fs size for \"cf88c6ef813ac2226e9de91004301b374dfc5d4cf3b9e733737ef145b91fece8\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": no such file or directory" Dec 6 04:44:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:44:18 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:44:19 localhost nova_compute[187174]: 2025-12-06 09:44:19.176 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58469 DF PROTO=TCP SPT=53406 DPT=9105 SEQ=1120045713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA86480000000001030307) Dec 6 04:44:19 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:44:20 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Dec 6 04:44:20 localhost systemd[1]: var-lib-containers-storage-overlay-a9befc48c332654d0358552aae4c030fd23abbcc85f80b32354d85e375439ce4-merged.mount: Deactivated successfully. Dec 6 04:44:20 localhost systemd[1]: var-lib-containers-storage-overlay-a9befc48c332654d0358552aae4c030fd23abbcc85f80b32354d85e375439ce4-merged.mount: Deactivated successfully. Dec 6 04:44:21 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 6 04:44:21 localhost systemd[1]: var-lib-containers-storage-overlay-d969762fb0332c5c36abf270d6236af27b60ab1864e2e74a696d11379e3dcdcb-merged.mount: Deactivated successfully. Dec 6 04:44:21 localhost systemd[1]: var-lib-containers-storage-overlay-d969762fb0332c5c36abf270d6236af27b60ab1864e2e74a696d11379e3dcdcb-merged.mount: Deactivated successfully. 
Dec 6 04:44:21 localhost nova_compute[187174]: 2025-12-06 09:44:21.870 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.981 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.982 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.985 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 9699 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ed2d7a11-74e0-4aef-82e0-74c1237db1d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9699, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:44:22.982543', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '25314490-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.192620279, 'message_signature': '544355c1258ca03f850705bb1d29f42a50e99167e39a7b1b3bf91fe71cb4874c'}]}, 'timestamp': '2025-12-06 09:44:22.985791', '_unique_id': '7fc12877aa4a4df5a5e0d2706e4f70a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:44:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.986 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:44:22.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.987 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ba0324f-ef44-40b4-ac59-0973f39b3829', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:44:22.987324', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': 
{'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '25318ad6-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.192620279, 'message_signature': '25db0fc105afc4bc259c0c5162998a0579dbd2d7dc6e887562265af27fee6bdc'}]}, 'timestamp': '2025-12-06 09:44:22.987551', '_unique_id': '9a7fa3288ead42a7b534e7547818cd4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:44:22.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.988 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '77bba0c3-70a3-4bd1-bac2-19dae5f7895d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:44:22.988699', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '2531c05a-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.192620279, 'message_signature': 'dce4a30132a0b88e7b1d61f4877c07b5010ec022202f0417e927cce52deb5940'}]}, 'timestamp': '2025-12-06 09:44:22.988936', '_unique_id': '7664f807aae64151b49eb0958c015fce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:44:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:22.989 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:22 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:44:22.989 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.031 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.031 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00760b81-b92c-4f19-ae23-2c5b96a9ed4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:44:22.989942', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '253844f2-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.200018208, 'message_signature': 'b45fe6a38402330ff939175a1a0fb17f91f600727ee659c7aa83cbc9ab4f6731'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:44:22.989942', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2538501e-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.200018208, 'message_signature': '11e8f0d7e1dd2ffa179e4c2e21aede207314a52185ba885c98d2c6e6eb1507ac'}]}, 'timestamp': '2025-12-06 09:44:23.031964', '_unique_id': 'd59db509e00b4b28b23b4c57d0fbc02d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.032 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.033 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.033 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20249610-235f-4fea-8b9e-105c151ce80b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 123, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:44:23.033510', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '25389894-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.192620279, 'message_signature': '23c3af091eaefa698926e4cde3f8726afbacefeff4fca1e7f4587145b2c16429'}]}, 'timestamp': '2025-12-06 09:44:23.033815', '_unique_id': 'a704031beddb4a61a09bee7f29f62acf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.034 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.036 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.052 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31064064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.053 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cf05af8-3a14-44c8-8da1-f308657c9fee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31064064, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:44:23.036370', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '253b8a90-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.246461942, 'message_signature': 'b85fbd490425996e10d5834e20d6909cc45d1632d49e328577fd070e3eac3dbe'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:44:23.036370', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '253b95d0-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.246461942, 'message_signature': 'c055f2e063ea9256d7a631079fb1cee71b5e140228632185a02773df8a684dbb'}]}, 'timestamp': '2025-12-06 09:44:23.053411', '_unique_id': 'd97629de292149c88635c93f3837e58f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.054 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.055 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f8cd7ad-b460-403a-9cc5-4129a0d492f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:44:23.054919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '253bdc98-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.246461942, 'message_signature': '7cc00bf9d6b9ed5f56f68e8698adac022d8d2d955e262e3389d267e7aa3ae12e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:44:23.054919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '253be706-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.246461942, 'message_signature': '180f0d53439e1b8911a1e154d1b546c95982609443dab0dbb86f2de2e3b4e050'}]}, 'timestamp': '2025-12-06 09:44:23.055463', '_unique_id': 'f2347354cd73497f851a29985339430f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:44:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.056 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.073 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 52500000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '96e5b0a1-e010-4c98-ac8e-c6be990d9d98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 52500000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:44:23.056830', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '253ec69c-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.284000971, 'message_signature': 'b0a78a708ad4b2c12882a2798f221e6d485679224d37d255a03ee6c8044840fc'}]}, 'timestamp': '2025-12-06 09:44:23.074317', '_unique_id': 'a54111fc432d4641922e1060e88e8645'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:44:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.074 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.075 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:44:23.075 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54e0379a-47d4-494e-a08c-8957ee252597', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:44:23.075743', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 
'vda'}, 'message_id': '253f0abc-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.200018208, 'message_signature': 'bb5b24fa65695393bd9af8c78e26d4d3e1c6693f79b0c6491f26f6a1672167d7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:44:23.075743', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '253f152a-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.200018208, 'message_signature': '113c51acb6eb2c7ac6f2bfa18933d4ccdeb05764cfc8eb6247f5f34ae0791847'}]}, 'timestamp': '2025-12-06 09:44:23.076306', '_unique_id': '8caa820ae1cf4cbbbaa9854e4ed499bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:44:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.076 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.077 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.077 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4371f8b-7080-4eb8-a9ec-92fcdb3acf3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:44:23.077704', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '253f577e-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.192620279, 'message_signature': '23883778749f3a5bc22eedf2b580cfbd07a68afbc964293b661b56a93afe61a3'}]}, 'timestamp': '2025-12-06 09:44:23.078025', '_unique_id': 'a01e225a0345477ebeae7cffd78cb09b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.078 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.079 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.079 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46e777a0-bc98-4923-a09d-d244803b018c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:44:23.079353', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '253f96f8-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.192620279, 'message_signature': 'c5964bfe2a31f001baabcf42b66ccc1217d9284fd162e952d0a77c6794f2fe7f'}]}, 'timestamp': '2025-12-06 09:44:23.079647', '_unique_id': '7af8ca054cce4e84a8d5baf7f980ac6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.081 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.081 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.081 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bb1b8eb-f160-4a36-a425-a075e7e8f293', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:44:23.081186', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '253fdf14-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.200018208, 'message_signature': '438e3c0467ec53ab4d3937dafd0f437e22a8aaff52893d68a9d3ab04d8adbebf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:44:23.081186', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '253fe93c-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.200018208, 'message_signature': 'ef0230640fd3fa150c61680320f761aac8e35124697e6d77393a736aa4e33d02'}]}, 'timestamp': '2025-12-06 09:44:23.081732', '_unique_id': 'f5d622295bec4d2c9158eb9072976d55'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:44:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.082 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.083 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.083 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 947163713 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.083 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 9516486 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging [-] 
Could not send notification to notifications. Payload={'message_id': '7083d826-d69f-4a82-9ee9-7514311b4ae5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 947163713, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:44:23.083098', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2540291a-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.200018208, 'message_signature': 'e481b2df818de9171b4d8d18bd98e0f5e9b7955eaed8b30f4dbca98eadab4852'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9516486, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:44:23.083098', 'resource_metadata': {'display_name': 'test', 'name': 
'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25403342-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.200018208, 'message_signature': 'c996b68d769c7382de2d4a13085b20469df6a47f06b670cc78a1e5b87fe8d747'}]}, 'timestamp': '2025-12-06 09:44:23.083627', '_unique_id': '3b39d13ef15c48feb2e15ba03b11a0fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:44:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:44:23.084 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:44:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.085 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 301237008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.085 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 37411248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69bc4992-dd53-42d7-89a4-bbbf4b26a361', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 301237008, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:44:23.084994', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2540733e-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.200018208, 'message_signature': 'e28052289ac57d160d605cac07d1fa456b24023437536a131044d35a77554bb3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 37411248, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:44:23.084994', 'resource_metadata': {'display_name': 
'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25407e10-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.200018208, 'message_signature': '8b779aac553bf305884897acc2b8c9049ba11c5b9e23b78db8f55d8d9073ea52'}]}, 'timestamp': '2025-12-06 09:44:23.085543', '_unique_id': 'da4a1ac185374876b9f2bd8765b9ff1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.086 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.087 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 73904128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.087 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f51b83af-8446-4b9e-812d-ffb4a32a8795', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73904128, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:44:23.086987', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2540c0dc-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.200018208, 'message_signature': '98c10909597eac3ae38ba510a684e1557bf1e5ba82489a516e5dd0cb0bf0102d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:44:23.086987', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2540cb04-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.200018208, 'message_signature': '63df87438e3bbadc773a2d67d7fcf9d6372f66b2f59c64d047738cb8e0b1246f'}]}, 'timestamp': '2025-12-06 09:44:23.087514', '_unique_id': 'bc7dce2085e2479aa47a0d6b770d8a7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.089 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.089 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.089 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04a71aba-e345-4823-b558-c08e9069d874', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:44:23.089096', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2541135c-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.246461942, 'message_signature': 'c610ca8b7ad40bdab7291ee02a853f8f27f21a3ea03908d4f91d99698f74c18f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:44:23.089096', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25411e6a-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.246461942, 'message_signature': '09cccac713dd698e531aeb32d8fe3b527458e14584fd2d72c7c980cd45f34201'}]}, 'timestamp': '2025-12-06 09:44:23.089650', '_unique_id': '8fcb751871314156a69f45bef24bbb53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.090 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2adfe2fd-821b-4bd9-ab21-a56ac501d6dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:44:23.091024', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '25415ede-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.192620279, 'message_signature': '4cc2d6005d6016fbff2a6d8680e1781a5ba2126887183fc08ec27874482f1d05'}]}, 'timestamp': '2025-12-06 09:44:23.091319', '_unique_id': '73a2d80cbdf548d0a16dee9d50d41e43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR 
oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:44:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.092 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 96 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '780c4ece-13bc-436d-85d0-0c85d52ff15a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 96, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:44:23.092642', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '25419df4-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.192620279, 'message_signature': '9b3195e871947a425e2b9665bb5ca88707bfaffc50ff94b9e70ca09041ae042b'}]}, 'timestamp': '2025-12-06 09:44:23.092954', '_unique_id': '2bfae86a4fb141bf8802b735b5c60bad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging return 
fun(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.093 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.094 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.094 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6ce0f585-86cd-4748-83f1-6b4bc992a933', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:44:23.094295', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '2541de90-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.192620279, 'message_signature': '83cf6ab2a26a44c8aeca073b4e66d9e09624ccecd5ddf614f4854b023d6da249'}]}, 'timestamp': '2025-12-06 09:44:23.094588', '_unique_id': '65415e6352c747c689cbfe9d67e1d062'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:44:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:44:23.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.095 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 10762 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ddc3090-aac2-462b-b8d3-899631df6920', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10762, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:44:23.095918', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': 
None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '25421dec-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.192620279, 'message_signature': 'c2a600cf37bffdbcbeba1efdca76ffee14c0f1ee552fca9ba0be3f6734be1b87'}]}, 'timestamp': '2025-12-06 09:44:23.096209', '_unique_id': 'c45c11841ec648a89d613a73f5f09f3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:44:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.096 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.097 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.097 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd6304ba2-b374-40e1-b33d-01919aaa83f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:44:23.097531', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '25425cee-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10650.284000971, 'message_signature': 'eaf34305a3953c7f15480165dbe173f1e316c525439479c9611a0f063b632b59'}]}, 'timestamp': '2025-12-06 09:44:23.097810', '_unique_id': '9efd97652ed142a89676c23aba0eb8fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors 
Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging 
conn.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:44:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:44:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:44:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 04:44:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58470 DF PROTO=TCP SPT=53406 DPT=9105 SEQ=1120045713 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA96070000000001030307) Dec 6 04:44:24 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:44:24 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:44:24 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 6 04:44:24 localhost systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully. Dec 6 04:44:24 localhost nova_compute[187174]: 2025-12-06 09:44:24.222 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:24 localhost sshd[200543]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:44:25 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 6 04:44:25 localhost systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully. Dec 6 04:44:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57757 DF PROTO=TCP SPT=38358 DPT=9882 SEQ=3301962100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBA9D870000000001030307) Dec 6 04:44:25 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Dec 6 04:44:25 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:44:25 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:44:26 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:44:26 localhost nova_compute[187174]: 2025-12-06 09:44:26.872 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:27 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 6 04:44:27 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:44:27 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:44:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:44:28 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:44:28 localhost podman[200545]: 2025-12-06 09:44:28.072403068 +0000 UTC m=+0.092935671 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 6 04:44:28 localhost podman[200545]: 2025-12-06 09:44:28.115196959 +0000 UTC m=+0.135729592 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 6 04:44:28 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 6 04:44:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27265 DF PROTO=TCP SPT=57038 DPT=9102 SEQ=1173704187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBAA9870000000001030307)
Dec 6 04:44:28 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully.
Dec 6 04:44:29 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 6 04:44:29 localhost nova_compute[187174]: 2025-12-06 09:44:29.253 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:44:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.
Dec 6 04:44:30 localhost systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 6 04:44:30 localhost systemd[1]: tmp-crun.sjcAnh.mount: Deactivated successfully.
Dec 6 04:44:30 localhost podman[200570]: 2025-12-06 09:44:30.535756699 +0000 UTC m=+0.072817780 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 6 04:44:30 localhost podman[200570]: 2025-12-06 09:44:30.54842751 +0000 UTC m=+0.085488591 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 6 04:44:30 localhost systemd[1]: var-lib-containers-storage-overlay-fa2c3e8c20dbc398894a992c8d0207ecad87158b45491e83ea3c6f6e44b17b0b-merged.mount: Deactivated successfully.
Dec 6 04:44:30 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully.
Dec 6 04:44:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62969 DF PROTO=TCP SPT=47416 DPT=9101 SEQ=2812663518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBAB5070000000001030307)
Dec 6 04:44:31 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 6 04:44:31 localhost systemd[1]: var-lib-containers-storage-overlay-d969762fb0332c5c36abf270d6236af27b60ab1864e2e74a696d11379e3dcdcb-merged.mount: Deactivated successfully.
Dec 6 04:44:31 localhost systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 6 04:44:31 localhost systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 6 04:44:31 localhost systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 6 04:44:31 localhost nova_compute[187174]: 2025-12-06 09:44:31.875 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:44:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.
Dec 6 04:44:32 localhost systemd[1]: tmp-crun.1XXfpv.mount: Deactivated successfully.
Dec 6 04:44:32 localhost podman[200588]: 2025-12-06 09:44:32.531197421 +0000 UTC m=+0.243871731 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Dec 6 04:44:32 localhost podman[200588]: 2025-12-06 09:44:32.562768676 +0000 UTC m=+0.275443006 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 6 04:44:32 localhost podman[200588]: unhealthy
Dec 6 04:44:32 localhost systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 6 04:44:33 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 6 04:44:33 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Main process exited, code=exited, status=1/FAILURE
Dec 6 04:44:33 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Failed with result 'exit-code'.
Dec 6 04:44:33 localhost systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 6 04:44:33 localhost sshd[200611]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:44:33 localhost systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 6 04:44:33 localhost systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 6 04:44:34 localhost nova_compute[187174]: 2025-12-06 09:44:34.284 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:44:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30610 DF PROTO=TCP SPT=39454 DPT=9101 SEQ=4243264164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBAC1870000000001030307)
Dec 6 04:44:34 localhost systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 6 04:44:35 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:44:35 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 6 04:44:35 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 6 04:44:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.
Dec 6 04:44:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.
Dec 6 04:44:36 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:44:36 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:44:36 localhost systemd[1]: tmp-crun.A9CHRk.mount: Deactivated successfully.
Dec 6 04:44:36 localhost podman[200613]: 2025-12-06 09:44:36.580660865 +0000 UTC m=+0.109167281 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 6 04:44:36 localhost podman[200614]: 2025-12-06 09:44:36.558782351 +0000 UTC m=+0.076688110 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 6 04:44:36 localhost podman[200613]: 2025-12-06 09:44:36.614084967 +0000 UTC m=+0.142591383 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 6 04:44:36 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 6 04:44:36 localhost podman[200614]: 2025-12-06 09:44:36.641223756 +0000 UTC m=+0.159129575 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 04:44:36 localhost podman[200614]: unhealthy
Dec 6 04:44:36 localhost nova_compute[187174]: 2025-12-06 09:44:36.878 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:44:36 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Main process exited, code=exited, status=1/FAILURE
Dec 6 04:44:36 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Failed with result 'exit-code'.
Dec 6 04:44:36 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully.
Dec 6 04:44:36 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:44:36 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:44:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62971 DF PROTO=TCP SPT=47416 DPT=9101 SEQ=2812663518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBACCC80000000001030307)
Dec 6 04:44:37 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 6 04:44:37 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 6 04:44:37 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:44:37 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:44:38 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:44:38 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 6 04:44:38 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 6 04:44:39 localhost nova_compute[187174]: 2025-12-06 09:44:39.322 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:44:39 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:44:39 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:44:39 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:44:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54365 DF PROTO=TCP SPT=48334 DPT=9100 SEQ=2661568147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBAD8080000000001030307)
Dec 6 04:44:40 localhost systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 6 04:44:40 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:44:40 localhost systemd[1]: var-lib-containers-storage-overlay-fa2c3e8c20dbc398894a992c8d0207ecad87158b45491e83ea3c6f6e44b17b0b-merged.mount: Deactivated successfully.
Dec 6 04:44:40 localhost systemd[1]: var-lib-containers-storage-overlay-fa2c3e8c20dbc398894a992c8d0207ecad87158b45491e83ea3c6f6e44b17b0b-merged.mount: Deactivated successfully.
Dec 6 04:44:41 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:44:41 localhost systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 6 04:44:41 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 6 04:44:41 localhost systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 6 04:44:41 localhost systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 6 04:44:41 localhost nova_compute[187174]: 2025-12-06 09:44:41.880 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:44:43 localhost systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 6 04:44:43 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 6 04:44:43 localhost systemd[1]: var-lib-containers-storage-overlay-edc3ecbc12a44c3c63d7e19fa2e89c15d1af79c98f2973226367ec81eae57350-merged.mount: Deactivated successfully.
Dec 6 04:44:43 localhost systemd[1]: var-lib-containers-storage-overlay-edc3ecbc12a44c3c63d7e19fa2e89c15d1af79c98f2973226367ec81eae57350-merged.mount: Deactivated successfully.
Dec 6 04:44:44 localhost nova_compute[187174]: 2025-12-06 09:44:44.325 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:44:45 localhost systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 6 04:44:45 localhost systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 6 04:44:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.
Dec 6 04:44:45 localhost systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 6 04:44:45 localhost podman[200644]: 2025-12-06 09:44:45.23455699 +0000 UTC m=+0.095892082 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Dec 6 04:44:45 localhost podman[200644]: 2025-12-06 09:44:45.250134481 +0000 UTC m=+0.111469623 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products.
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_id=edpm, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 6 04:44:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62972 DF PROTO=TCP SPT=47416 DPT=9101 SEQ=2812663518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBAED870000000001030307) Dec 6 04:44:46 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:44:46 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:44:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62273 DF PROTO=TCP SPT=53874 DPT=9105 SEQ=2986446191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBAEF750000000001030307) Dec 6 04:44:46 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. 
Dec 6 04:44:46 localhost systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully. Dec 6 04:44:46 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:44:46 localhost nova_compute[187174]: 2025-12-06 09:44:46.883 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:47 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:44:47 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully. Dec 6 04:44:47 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. Dec 6 04:44:47 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. Dec 6 04:44:48 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:44:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:44:48 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Dec 6 04:44:48 localhost podman[200663]: 2025-12-06 09:44:48.266120326 +0000 UTC m=+0.067545387 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 04:44:48 localhost podman[200663]: 2025-12-06 09:44:48.300229849 +0000 UTC m=+0.101654840 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 04:44:49 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:44:49 localhost systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully. 
Dec 6 04:44:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62275 DF PROTO=TCP SPT=53874 DPT=9105 SEQ=2986446191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBAFB870000000001030307) Dec 6 04:44:49 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:44:49 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:44:49 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:44:49 localhost nova_compute[187174]: 2025-12-06 09:44:49.368 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:50 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:44:50 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:44:50 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:44:51 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:44:51 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Dec 6 04:44:51 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:44:51 localhost nova_compute[187174]: 2025-12-06 09:44:51.886 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:52 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 6 04:44:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2022 DF PROTO=TCP SPT=50106 DPT=9882 SEQ=1657020146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB0B3E0000000001030307) Dec 6 04:44:53 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:44:53 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:44:53 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:44:53 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:44:53 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:44:54 localhost nova_compute[187174]: 2025-12-06 09:44:54.371 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:54 localhost systemd[1]: var-lib-containers-storage-overlay-edc3ecbc12a44c3c63d7e19fa2e89c15d1af79c98f2973226367ec81eae57350-merged.mount: Deactivated successfully. Dec 6 04:44:54 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:44:54 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:44:55 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:44:55 localhost systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully. Dec 6 04:44:55 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:44:55 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:44:55 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:44:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19477 DF PROTO=TCP SPT=42452 DPT=9102 SEQ=3169885165 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB15870000000001030307) Dec 6 04:44:56 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. Dec 6 04:44:56 localhost systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully. Dec 6 04:44:56 localhost systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully. Dec 6 04:44:56 localhost nova_compute[187174]: 2025-12-06 09:44:56.888 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:57 localhost systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully. Dec 6 04:44:57 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. Dec 6 04:44:57 localhost systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully. Dec 6 04:44:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:44:58 localhost podman[200685]: 2025-12-06 09:44:58.533914522 +0000 UTC m=+0.065485932 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 04:44:58 localhost podman[200685]: 2025-12-06 09:44:58.596274597 +0000 UTC m=+0.127845997 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 04:44:58 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 6 04:44:58 localhost systemd[1]: var-lib-containers-storage-overlay-135e32b78dae218c3403324b415101180c4222350114b9184c5da39151cfc014-merged.mount: Deactivated successfully. Dec 6 04:44:58 localhost sshd[200709]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:44:59 localhost systemd-logind[760]: New session 42 of user zuul. 
Dec 6 04:44:59 localhost systemd[1]: Started Session 42 of User zuul. Dec 6 04:44:59 localhost nova_compute[187174]: 2025-12-06 09:44:59.417 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:44:59 localhost systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully. Dec 6 04:44:59 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:44:59 localhost python3.9[200805]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman Dec 6 04:45:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23015 DF PROTO=TCP SPT=55956 DPT=9101 SEQ=2195413297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB26480000000001030307) Dec 6 04:45:01 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 6 04:45:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:45:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23016 DF PROTO=TCP SPT=55956 DPT=9101 SEQ=2195413297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB2A470000000001030307) Dec 6 04:45:01 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. 
Dec 6 04:45:01 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Dec 6 04:45:01 localhost nova_compute[187174]: 2025-12-06 09:45:01.891 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:02 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:45:02 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 6 04:45:02 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 6 04:45:02 localhost podman[200819]: 2025-12-06 09:45:02.444242701 +0000 UTC m=+1.195595478 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Dec 6 04:45:02 localhost podman[200819]: 2025-12-06 09:45:02.523205799 +0000 UTC m=+1.274558576 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:45:03 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 6 04:45:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:45:03 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 6 04:45:03 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:45:04 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:45:04 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:45:04 localhost podman[200838]: 2025-12-06 09:45:04.412323178 +0000 UTC m=+0.575919583 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 04:45:04 localhost podman[200838]: 2025-12-06 09:45:04.420340806 +0000 UTC m=+0.583937211 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 04:45:04 localhost nova_compute[187174]: 2025-12-06 09:45:04.421 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:04 localhost podman[200838]: unhealthy Dec 6 04:45:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54367 DF PROTO=TCP SPT=48334 DPT=9100 SEQ=2661568147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB37870000000001030307) Dec 6 04:45:05 localhost python3.9[200969]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Dec 6 04:45:05 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:45:05 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Dec 6 04:45:05 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:45:05 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Failed with result 'exit-code'. Dec 6 04:45:05 localhost systemd[1]: Started libpod-conmon-da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.scope. Dec 6 04:45:05 localhost podman[200970]: 2025-12-06 09:45:05.621017019 +0000 UTC m=+0.315438881 container exec da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, 
managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125) Dec 6 04:45:05 localhost podman[200970]: 2025-12-06 09:45:05.651155329 +0000 UTC m=+0.345577161 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 04:45:06 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined 
behavior. Dec 6 04:45:06 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:45:06 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:45:06 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:45:06 localhost podman[197801]: time="2025-12-06T09:45:06Z" level=error msg="Getting root fs size for \"ed895ed494801c746b03d08fbf86975dd09d64f2a82fa0a78de81b1951c76324\": unmounting layer c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6: replacing mount point \"/var/lib/containers/storage/overlay/c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6/merged\": device or resource busy" Dec 6 04:45:06 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Dec 6 04:45:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:45:06.665 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:45:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:45:06.665 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:45:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:45:06.667 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:45:06 localhost systemd[1]: libpod-conmon-da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.scope: Deactivated successfully. 
Dec 6 04:45:06 localhost nova_compute[187174]: 2025-12-06 09:45:06.871 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:45:06 localhost nova_compute[187174]: 2025-12-06 09:45:06.895 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12976 DF PROTO=TCP SPT=60964 DPT=9100 SEQ=1771890542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB41880000000001030307) Dec 6 04:45:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:45:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:45:07 localhost podman[201057]: 2025-12-06 09:45:07.323033482 +0000 UTC m=+0.092295321 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.license=GPLv2) Dec 6 04:45:07 localhost podman[201056]: 2025-12-06 09:45:07.382368094 +0000 UTC m=+0.153964424 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:45:07 localhost podman[201056]: 2025-12-06 09:45:07.391454265 +0000 UTC m=+0.163050555 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible) Dec 6 04:45:07 localhost podman[201057]: 2025-12-06 09:45:07.40718751 +0000 UTC m=+0.176449329 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:45:07 localhost podman[201057]: unhealthy Dec 6 04:45:07 localhost python3.9[201143]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Dec 6 04:45:07 localhost nova_compute[187174]: 2025-12-06 09:45:07.870 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:45:08 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Dec 6 04:45:08 localhost systemd[1]: var-lib-containers-storage-overlay-4ce8c2c866bc20ee02a745aae8c22be484c8b579141c525f40381f8b1c563bd1-merged.mount: Deactivated successfully. Dec 6 04:45:08 localhost systemd[1]: var-lib-containers-storage-overlay-4ce8c2c866bc20ee02a745aae8c22be484c8b579141c525f40381f8b1c563bd1-merged.mount: Deactivated successfully. 
Dec 6 04:45:08 localhost nova_compute[187174]: 2025-12-06 09:45:08.874 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:45:08 localhost nova_compute[187174]: 2025-12-06 09:45:08.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:45:08 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 6 04:45:09 localhost systemd[1]: var-lib-containers-storage-overlay-135e32b78dae218c3403324b415101180c4222350114b9184c5da39151cfc014-merged.mount: Deactivated successfully. Dec 6 04:45:09 localhost systemd[1]: var-lib-containers-storage-overlay-135e32b78dae218c3403324b415101180c4222350114b9184c5da39151cfc014-merged.mount: Deactivated successfully. Dec 6 04:45:09 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:45:09 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Main process exited, code=exited, status=1/FAILURE Dec 6 04:45:09 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Failed with result 'exit-code'. Dec 6 04:45:09 localhost systemd[1]: Started libpod-conmon-da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.scope. 
Dec 6 04:45:09 localhost podman[201144]: 2025-12-06 09:45:09.300892542 +0000 UTC m=+1.683758280 container exec da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:45:09 localhost podman[201144]: 2025-12-06 09:45:09.30926587 +0000 UTC m=+1.692131678 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, 
config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:45:09 localhost nova_compute[187174]: 2025-12-06 09:45:09.464 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:09 localhost nova_compute[187174]: 2025-12-06 09:45:09.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:45:09 localhost nova_compute[187174]: 
2025-12-06 09:45:09.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:45:09 localhost nova_compute[187174]: 2025-12-06 09:45:09.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:45:10 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:45:10 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Dec 6 04:45:10 localhost nova_compute[187174]: 2025-12-06 09:45:10.134 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:45:10 localhost nova_compute[187174]: 2025-12-06 09:45:10.135 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:45:10 localhost nova_compute[187174]: 2025-12-06 09:45:10.135 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:45:10 localhost nova_compute[187174]: 2025-12-06 09:45:10.135 187178 DEBUG 
nova.objects.instance [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:45:10 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Dec 6 04:45:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29380 DF PROTO=TCP SPT=33608 DPT=9100 SEQ=2598838990 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB4D470000000001030307) Dec 6 04:45:10 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 6 04:45:10 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. 
Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.058 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:45:11 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Dec 6 04:45:11 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:45:11 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. 
Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.103 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.104 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.105 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.106 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.106 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.107 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.108 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.128 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.129 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.129 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.130 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.190 
187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.235 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.236 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.319 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.321 187178 DEBUG oslo_concurrency.processutils 
[None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.385 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.387 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.432 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.639 187178 WARNING nova.virt.libvirt.driver [None 
req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.641 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12801MB free_disk=387.3032913208008GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.641 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.641 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:45:11 localhost python3.9[201295]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.798 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in 
placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.798 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.799 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.896 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:11 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:45:11 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.966 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.988 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.990 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:45:11 localhost nova_compute[187174]: 2025-12-06 09:45:11.991 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:45:12 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Dec 6 04:45:12 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 6 04:45:12 localhost python3.9[201406]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman Dec 6 04:45:12 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 6 04:45:12 localhost systemd[1]: libpod-conmon-da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.scope: Deactivated successfully. Dec 6 04:45:12 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:45:12 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:45:13 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:45:13 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:45:13 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:45:13 localhost nova_compute[187174]: 2025-12-06 09:45:13.759 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:45:13 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:45:14 localhost nova_compute[187174]: 2025-12-06 09:45:14.467 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:14 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Dec 6 04:45:14 localhost systemd[1]: var-lib-containers-storage-overlay-e783b457a0dbabf469167b8c7c7fc00f0087efa2180c519aab4a9fcb73c3a343-merged.mount: Deactivated successfully. Dec 6 04:45:14 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:45:14 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:45:14 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:45:14 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:45:14 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 6 04:45:14 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:45:15 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 6 04:45:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23019 DF PROTO=TCP SPT=55956 DPT=9101 SEQ=2195413297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB61870000000001030307) Dec 6 04:45:15 localhost python3.9[201529]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Dec 6 04:45:15 localhost systemd[1]: Started libpod-conmon-34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.scope. 
Dec 6 04:45:16 localhost podman[201530]: 2025-12-06 09:45:16.004696124 +0000 UTC m=+0.119921414 container exec 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:45:16 localhost podman[201530]: 2025-12-06 09:45:16.039495679 +0000 UTC m=+0.154720939 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent) Dec 6 04:45:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62576 DF PROTO=TCP SPT=53612 DPT=9105 SEQ=2199137005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB64A50000000001030307) Dec 6 04:45:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:45:16 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Dec 6 04:45:16 localhost systemd[1]: var-lib-containers-storage-overlay-4ce8c2c866bc20ee02a745aae8c22be484c8b579141c525f40381f8b1c563bd1-merged.mount: Deactivated successfully. Dec 6 04:45:16 localhost nova_compute[187174]: 2025-12-06 09:45:16.899 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:17 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:45:17 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 6 04:45:17 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. 
Dec 6 04:45:17 localhost podman[201556]: 2025-12-06 09:45:17.487141127 +0000 UTC m=+1.018859100 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public) Dec 6 04:45:17 localhost podman[201556]: 2025-12-06 09:45:17.498700234 +0000 UTC m=+1.030418277 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 
'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter) Dec 6 04:45:18 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 6 04:45:18 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Dec 6 04:45:18 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Dec 6 04:45:18 localhost python3.9[201685]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Dec 6 04:45:19 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 6 04:45:19 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. 
Dec 6 04:45:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62578 DF PROTO=TCP SPT=53612 DPT=9105 SEQ=2199137005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB70C70000000001030307) Dec 6 04:45:19 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 6 04:45:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:45:19 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 6 04:45:19 localhost systemd[1]: libpod-conmon-34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.scope: Deactivated successfully. Dec 6 04:45:19 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:45:19 localhost systemd[1]: Started libpod-conmon-34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.scope. 
Dec 6 04:45:19 localhost nova_compute[187174]: 2025-12-06 09:45:19.523 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:19 localhost podman[201686]: 2025-12-06 09:45:19.52941474 +0000 UTC m=+1.232760459 container exec 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 6 04:45:19 localhost podman[201697]: 2025-12-06 09:45:19.550609744 +0000 UTC m=+0.130188804 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 6 04:45:19 localhost podman[201697]: 2025-12-06 09:45:19.583136849 +0000 UTC m=+0.162715939 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 6 04:45:19 localhost podman[201686]: 2025-12-06 09:45:19.613191036 +0000 UTC m=+1.316536755 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 6 04:45:20 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 6 04:45:20 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:45:20 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:45:20 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:45:20 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully.
Dec 6 04:45:20 localhost systemd[1]: libpod-conmon-34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.scope: Deactivated successfully.
Dec 6 04:45:21 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:45:21 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:45:21 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:45:21 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:45:21 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:45:21 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 6 04:45:21 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:45:21 localhost python3.9[201850]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:45:21 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:45:21 localhost nova_compute[187174]: 2025-12-06 09:45:21.903 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:45:21 localhost python3.9[201960]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 6 04:45:22 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 6 04:45:22 localhost systemd[1]: var-lib-containers-storage-overlay-e783b457a0dbabf469167b8c7c7fc00f0087efa2180c519aab4a9fcb73c3a343-merged.mount: Deactivated successfully.
Dec 6 04:45:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57091 DF PROTO=TCP SPT=41178 DPT=9882 SEQ=50166600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB806E0000000001030307)
Dec 6 04:45:24 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 6 04:45:24 localhost systemd[1]: var-lib-containers-storage-overlay-7603e0cea2115c7f1b9c23d551d73dbd4a38d1911aefa6f002a0812093908818-merged.mount: Deactivated successfully.
Dec 6 04:45:24 localhost systemd[1]: var-lib-containers-storage-overlay-7603e0cea2115c7f1b9c23d551d73dbd4a38d1911aefa6f002a0812093908818-merged.mount: Deactivated successfully.
Dec 6 04:45:24 localhost nova_compute[187174]: 2025-12-06 09:45:24.546 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:45:25 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:45:25 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 6 04:45:25 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 6 04:45:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46855 DF PROTO=TCP SPT=38814 DPT=9102 SEQ=4202481553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB89870000000001030307)
Dec 6 04:45:26 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 6 04:45:26 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 6 04:45:26 localhost nova_compute[187174]: 2025-12-06 09:45:26.905 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:45:26 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 6 04:45:26 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:45:26 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:45:27 localhost python3.9[202083]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 6 04:45:27 localhost systemd[1]: Started libpod-conmon-b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.scope.
Dec 6 04:45:27 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 6 04:45:27 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:45:27 localhost podman[202084]: 2025-12-06 09:45:27.766544312 +0000 UTC m=+0.111581715 container exec b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd)
Dec 6 04:45:27 localhost podman[202084]: 2025-12-06 09:45:27.799209194 +0000 UTC m=+0.144246587 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 6 04:45:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19478 DF PROTO=TCP SPT=42452 DPT=9102 SEQ=3169885165 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB93870000000001030307)
Dec 6 04:45:28 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:45:28 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 6 04:45:28 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 6 04:45:29 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:45:29 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:45:29 localhost nova_compute[187174]: 2025-12-06 09:45:29.593 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:45:29 localhost python3.9[202222]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 6 04:45:29 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 6 04:45:29 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:45:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.
Dec 6 04:45:30 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:45:30 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 6 04:45:30 localhost systemd[1]: libpod-conmon-b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.scope: Deactivated successfully.
Dec 6 04:45:30 localhost systemd[1]: Started libpod-conmon-b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.scope.
Dec 6 04:45:30 localhost podman[202223]: 2025-12-06 09:45:30.228171264 +0000 UTC m=+0.570526643 container exec b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 6 04:45:30 localhost podman[202223]: 2025-12-06 09:45:30.232116769 +0000 UTC m=+0.574472188 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 6 04:45:30 localhost podman[202235]: 2025-12-06 09:45:30.282795111 +0000 UTC m=+0.257553751 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 6 04:45:30 localhost podman[202235]: 2025-12-06 09:45:30.361363873 +0000 UTC m=+0.336122523 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 04:45:30 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 6 04:45:30 localhost systemd[1]: var-lib-containers-storage-overlay-dc6ccae8d1859158b3bbd7185cd50b8ac3a8ab8c86ff1ef056ca16ec9c2e0699-merged.mount: Deactivated successfully.
Dec 6 04:45:30 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 6 04:45:30 localhost kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 6 04:45:30 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:45:30 localhost podman[197801]: time="2025-12-06T09:45:30Z" level=error msg="Unable to write json: \"write unix /run/podman/podman.sock->@: write: broken pipe\""
Dec 6 04:45:30 localhost podman[197801]: @ - - [06/Dec/2025:09:40:44 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 4096 "" "Go-http-client/1.1"
Dec 6 04:45:30 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully.
Dec 6 04:45:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64493 DF PROTO=TCP SPT=34118 DPT=9101 SEQ=2820758079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBB9F870000000001030307)
Dec 6 04:45:31 localhost systemd[1]: libpod-conmon-b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.scope: Deactivated successfully.
Dec 6 04:45:31 localhost nova_compute[187174]: 2025-12-06 09:45:31.907 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:45:32 localhost python3.9[202385]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:45:33 localhost python3.9[202495]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Dec 6 04:45:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62974 DF PROTO=TCP SPT=47416 DPT=9101 SEQ=2812663518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBBAB880000000001030307)
Dec 6 04:45:34 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 6 04:45:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.
Dec 6 04:45:34 localhost systemd[1]: var-lib-containers-storage-overlay-7603e0cea2115c7f1b9c23d551d73dbd4a38d1911aefa6f002a0812093908818-merged.mount: Deactivated successfully.
Dec 6 04:45:34 localhost systemd[1]: var-lib-containers-storage-overlay-7603e0cea2115c7f1b9c23d551d73dbd4a38d1911aefa6f002a0812093908818-merged.mount: Deactivated successfully.
Dec 6 04:45:34 localhost nova_compute[187174]: 2025-12-06 09:45:34.644 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:45:34 localhost podman[202507]: 2025-12-06 09:45:34.698123064 +0000 UTC m=+0.289326647 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 6 04:45:34 localhost podman[202507]: 2025-12-06 09:45:34.712170287 +0000 UTC m=+0.303373920 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 6 04:45:35 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 6 04:45:35 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 6 04:45:35 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 6 04:45:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.
Dec 6 04:45:36 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully.
Dec 6 04:45:36 localhost python3.9[202637]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 6 04:45:36 localhost podman[202638]: 2025-12-06 09:45:36.199279396 +0000 UTC m=+0.209508797 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Dec 6 04:45:36 localhost podman[202638]: 2025-12-06 09:45:36.212141705 +0000 UTC m=+0.222371056 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 6 04:45:36 localhost podman[202638]: unhealthy
Dec 6 04:45:36 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 6 04:45:36 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 6 04:45:36 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Main process exited, code=exited, status=1/FAILURE
Dec 6 04:45:36 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Failed with result 'exit-code'.
Dec 6 04:45:36 localhost systemd[1]: Started libpod-conmon-e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.scope.
Dec 6 04:45:36 localhost podman[202649]: 2025-12-06 09:45:36.632623661 +0000 UTC m=+0.455130587 container exec e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro',
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 04:45:36 localhost podman[202649]: 2025-12-06 09:45:36.661098869 +0000 UTC m=+0.483605785 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:45:36 localhost nova_compute[187174]: 2025-12-06 09:45:36.916 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:37 localhost podman[197801]: @ - - [06/Dec/2025:09:40:48 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 137270 "" "Go-http-client/1.1" Dec 6 04:45:37 localhost podman_exporter[198049]: ts=2025-12-06T09:45:37.279Z caller=exporter.go:96 level=info msg="Listening on" address=:9882 Dec 6 04:45:37 localhost podman_exporter[198049]: ts=2025-12-06T09:45:37.280Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882 Dec 6 04:45:37 localhost podman_exporter[198049]: ts=2025-12-06T09:45:37.280Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882 Dec 6 04:45:37 localhost systemd[1]: libpod-conmon-e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.scope: Deactivated successfully. 
Dec 6 04:45:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64495 DF PROTO=TCP SPT=34118 DPT=9101 SEQ=2820758079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBBB7470000000001030307) Dec 6 04:45:37 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully. Dec 6 04:45:37 localhost systemd[1]: var-lib-containers-storage-overlay-dc6ccae8d1859158b3bbd7185cd50b8ac3a8ab8c86ff1ef056ca16ec9c2e0699-merged.mount: Deactivated successfully. Dec 6 04:45:38 localhost python3.9[202800]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Dec 6 04:45:38 localhost systemd[1]: Started libpod-conmon-e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.scope. 
Dec 6 04:45:38 localhost podman[202801]: 2025-12-06 09:45:38.213800268 +0000 UTC m=+0.110637927 container exec e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 6 04:45:38 localhost podman[202801]: 2025-12-06 09:45:38.246269944 +0000 UTC m=+0.143107573 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Dec 6 04:45:38 localhost systemd[1]: libpod-conmon-e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.scope: Deactivated successfully. Dec 6 04:45:39 localhost python3.9[202940]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:45:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:45:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:45:39 localhost podman[202942]: 2025-12-06 09:45:39.553482767 +0000 UTC m=+0.085108985 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:45:39 localhost podman[202942]: 2025-12-06 09:45:39.561493173 +0000 UTC m=+0.093119421 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS) Dec 6 04:45:39 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:45:39 localhost podman[202941]: 2025-12-06 09:45:39.609998951 +0000 UTC m=+0.144091572 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 6 04:45:39 localhost podman[202941]: 2025-12-06 09:45:39.619157441 +0000 UTC m=+0.153250132 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 04:45:39 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:45:39 localhost nova_compute[187174]: 2025-12-06 09:45:39.685 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:40 localhost python3.9[203088]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman Dec 6 04:45:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25076 DF PROTO=TCP SPT=43602 DPT=9100 SEQ=4218902480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBBC2880000000001030307) Dec 6 04:45:40 localhost python3.9[203211]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Dec 6 04:45:41 localhost systemd[1]: Started libpod-conmon-979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.scope. 
Dec 6 04:45:41 localhost podman[203212]: 2025-12-06 09:45:41.015899259 +0000 UTC m=+0.122859166 container exec 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:45:41 localhost podman[203212]: 2025-12-06 09:45:41.102035335 +0000 UTC m=+0.208995262 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:45:41 localhost systemd[1]: libpod-conmon-979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.scope: Deactivated successfully. 
Dec 6 04:45:41 localhost python3.9[203352]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Dec 6 04:45:41 localhost systemd[1]: Started libpod-conmon-979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.scope. Dec 6 04:45:41 localhost podman[203353]: 2025-12-06 09:45:41.887890745 +0000 UTC m=+0.085465487 container exec 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:45:41 localhost podman[203353]: 2025-12-06 09:45:41.921224196 +0000 UTC m=+0.118798868 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:45:41 localhost nova_compute[187174]: 2025-12-06 09:45:41.939 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:41 localhost systemd[1]: libpod-conmon-979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.scope: Deactivated successfully. Dec 6 04:45:42 localhost python3.9[203492]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:45:43 localhost python3.9[203602]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman Dec 6 04:45:43 localhost python3.9[203725]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Dec 6 04:45:43 localhost systemd[1]: Started libpod-conmon-4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.scope. 
Dec 6 04:45:43 localhost podman[203726]: 2025-12-06 09:45:43.988599413 +0000 UTC m=+0.091647778 container exec 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:45:44 localhost podman[203726]: 2025-12-06 09:45:44.020273015 +0000 UTC m=+0.123321400 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 04:45:44 localhost systemd[1]: libpod-conmon-4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.scope: Deactivated successfully. Dec 6 04:45:44 localhost nova_compute[187174]: 2025-12-06 09:45:44.737 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:44 localhost python3.9[203866]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Dec 6 04:45:45 localhost systemd[1]: Started libpod-conmon-4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.scope. 
Dec 6 04:45:45 localhost podman[203867]: 2025-12-06 09:45:45.159581139 +0000 UTC m=+0.206495579 container exec 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:45:45 localhost podman[203867]: 2025-12-06 09:45:45.191282701 +0000 UTC m=+0.238197101 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 04:45:45 localhost systemd[1]: libpod-conmon-4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.scope: Deactivated successfully. Dec 6 04:45:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64496 DF PROTO=TCP SPT=34118 DPT=9101 SEQ=2820758079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBBD7880000000001030307) Dec 6 04:45:45 localhost python3.9[204006]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:45:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29596 DF PROTO=TCP SPT=58834 DPT=9105 SEQ=1008943330 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBBD9D50000000001030307) Dec 6 04:45:46 localhost python3.9[204116]: ansible-containers.podman.podman_container_info Invoked with 
name=['openstack_network_exporter'] executable=podman Dec 6 04:45:46 localhost nova_compute[187174]: 2025-12-06 09:45:46.987 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:47 localhost python3.9[204239]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Dec 6 04:45:47 localhost systemd[1]: Started libpod-conmon-0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.scope. Dec 6 04:45:47 localhost podman[204240]: 2025-12-06 09:45:47.422886933 +0000 UTC m=+0.099583522 container exec 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, release=1755695350, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64) Dec 6 04:45:47 localhost podman[204240]: 2025-12-06 09:45:47.456265534 +0000 UTC m=+0.132962203 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.) Dec 6 04:45:47 localhost systemd[1]: libpod-conmon-0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.scope: Deactivated successfully. Dec 6 04:45:48 localhost python3.9[204376]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Dec 6 04:45:48 localhost systemd[1]: Started libpod-conmon-0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.scope. Dec 6 04:45:48 localhost podman[204377]: 2025-12-06 09:45:48.229742261 +0000 UTC m=+0.104926820 container exec 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, version=9.6, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350) Dec 6 04:45:48 localhost podman[204377]: 2025-12-06 09:45:48.261240227 +0000 UTC m=+0.136424736 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 6 04:45:48 localhost systemd[1]: libpod-conmon-0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.scope: Deactivated successfully. 
Dec 6 04:45:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29598 DF PROTO=TCP SPT=58834 DPT=9105 SEQ=1008943330 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBBE5C70000000001030307) Dec 6 04:45:49 localhost python3.9[204518]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:45:49 localhost nova_compute[187174]: 2025-12-06 09:45:49.778 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:45:50 localhost podman[204629]: 2025-12-06 09:45:50.408896546 +0000 UTC m=+0.092344639 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 6 04:45:50 localhost podman[204629]: 2025-12-06 09:45:50.426176476 +0000 UTC m=+0.109624559 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, 
distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.) Dec 6 04:45:50 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:45:50 localhost python3.9[204628]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:45:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:45:51 localhost podman[204718]: 2025-12-06 09:45:51.581658164 +0000 UTC m=+0.112471781 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:45:51 localhost podman[204718]: 2025-12-06 09:45:51.648276705 +0000 UTC m=+0.179090282 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 04:45:51 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:45:51 localhost python3.9[204780]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:45:52 localhost nova_compute[187174]: 2025-12-06 09:45:52.055 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:52 localhost python3.9[204868]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014351.3424935-3215-66850641780999/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:45:53 localhost python3.9[204978]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:45:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29599 DF PROTO=TCP SPT=58834 DPT=9105 SEQ=1008943330 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBBF5870000000001030307) Dec 6 04:45:53 localhost python3.9[205088]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False 
get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:45:54 localhost python3.9[205145]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:45:54 localhost nova_compute[187174]: 2025-12-06 09:45:54.813 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:55 localhost python3.9[205255]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:45:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57096 DF PROTO=TCP SPT=41178 DPT=9882 SEQ=50166600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBBFD870000000001030307) Dec 6 04:45:55 localhost python3.9[205312]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.h77qu69b recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Dec 6 04:45:56 localhost python3.9[205422]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:45:56 localhost python3.9[205479]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:45:57 localhost nova_compute[187174]: 2025-12-06 09:45:57.112 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:45:57 localhost python3.9[205589]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:45:58 localhost python3[205700]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Dec 6 04:45:58 localhost python3.9[205810]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:45:59 localhost python3.9[205867]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:45:59 localhost nova_compute[187174]: 2025-12-06 09:45:59.852 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8527 DF PROTO=TCP SPT=45338 DPT=9101 SEQ=2333506476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBC10A70000000001030307) Dec 6 04:46:00 localhost python3.9[205977]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:46:00 localhost python3.9[206034]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:46:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8528 DF PROTO=TCP SPT=45338 DPT=9101 SEQ=2333506476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBC14C80000000001030307) Dec 6 04:46:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:46:01 localhost podman[206052]: 2025-12-06 09:46:01.592776254 +0000 UTC m=+0.125712271 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:46:01 localhost podman[206052]: 2025-12-06 09:46:01.632695669 +0000 UTC m=+0.165631726 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 04:46:01 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:46:02 localhost nova_compute[187174]: 2025-12-06 09:46:02.169 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:02 localhost python3.9[206169]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:46:02 localhost python3.9[206226]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:46:03 localhost python3.9[206336]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:46:03 localhost python3.9[206393]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:46:04 localhost python3.9[206503]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:46:04 localhost nova_compute[187174]: 2025-12-06 
09:46:04.894 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25078 DF PROTO=TCP SPT=43602 DPT=9100 SEQ=4218902480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBC23870000000001030307) Dec 6 04:46:05 localhost python3.9[206593]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014364.1791844-3590-70071471628906/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:46:06 localhost python3.9[206703]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:46:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:46:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. 
Dec 6 04:46:06 localhost podman[206773]: 2025-12-06 09:46:06.642301062 +0000 UTC m=+0.166977805 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:46:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:46:06.666 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock 
"_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:46:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:46:06.666 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:46:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:46:06.668 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:46:06 localhost podman[206773]: 2025-12-06 09:46:06.730568931 +0000 UTC m=+0.255245634 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 04:46:06 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:46:06 localhost podman[206831]: 2025-12-06 09:46:06.83555856 +0000 UTC m=+0.183193042 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:46:06 localhost python3.9[206830]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:46:06 localhost podman[206831]: 2025-12-06 09:46:06.870288933 +0000 UTC m=+0.217923415 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 04:46:06 localhost nova_compute[187174]: 2025-12-06 
09:46:06.871 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:46:06 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:46:07 localhost nova_compute[187174]: 2025-12-06 09:46:07.208 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8530 DF PROTO=TCP SPT=45338 DPT=9101 SEQ=2333506476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBC2C870000000001030307) Dec 6 04:46:07 localhost python3.9[206965]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:46:08 localhost python3.9[207075]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:46:09 localhost python3.9[207186]: ansible-ansible.builtin.stat Invoked with 
path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:46:09 localhost nova_compute[187174]: 2025-12-06 09:46:09.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:46:09 localhost nova_compute[187174]: 2025-12-06 09:46:09.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:46:09 localhost nova_compute[187174]: 2025-12-06 09:46:09.924 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:46:09 localhost nova_compute[187174]: 2025-12-06 09:46:09.925 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:46:09 localhost nova_compute[187174]: 2025-12-06 09:46:09.925 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:46:09 localhost nova_compute[187174]: 2025-12-06 
09:46:09.926 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:46:09 localhost nova_compute[187174]: 2025-12-06 09:46:09.927 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:09 localhost nova_compute[187174]: 2025-12-06 09:46:09.995 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.068 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.070 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 
04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.122 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.123 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.193 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.194 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:46:10 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45427 DF PROTO=TCP SPT=42490 DPT=9100 SEQ=1462525898 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBC37C70000000001030307) Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.245 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:46:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:46:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.480 187178 WARNING nova.virt.libvirt.driver [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.482 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12853MB free_disk=387.3019371032715GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.483 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.483 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:46:10 localhost podman[207312]: 2025-12-06 09:46:10.534103426 +0000 UTC m=+0.089967569 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:46:10 localhost podman[207312]: 2025-12-06 09:46:10.566379516 +0000 UTC m=+0.122243729 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.585 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.586 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.586 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:46:10 localhost podman[207311]: 2025-12-06 09:46:10.5899434 +0000 UTC m=+0.147850302 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 04:46:10 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 04:46:10 localhost python3.9[207310]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:46:10 localhost podman[207311]: 2025-12-06 09:46:10.624138437 +0000 UTC m=+0.182045309 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.631 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:46:10 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.645 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.647 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:46:10 localhost nova_compute[187174]: 2025-12-06 09:46:10.648 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:46:11 localhost python3.9[207458]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:46:11 localhost nova_compute[187174]: 2025-12-06 09:46:11.647 187178 
DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:46:11 localhost nova_compute[187174]: 2025-12-06 09:46:11.648 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:46:11 localhost nova_compute[187174]: 2025-12-06 09:46:11.649 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:46:12 localhost nova_compute[187174]: 2025-12-06 09:46:12.170 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:46:12 localhost nova_compute[187174]: 2025-12-06 09:46:12.170 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:46:12 localhost nova_compute[187174]: 2025-12-06 09:46:12.171 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:46:12 localhost nova_compute[187174]: 2025-12-06 09:46:12.171 187178 DEBUG nova.objects.instance [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - 
- - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:46:12 localhost nova_compute[187174]: 2025-12-06 09:46:12.248 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:12 localhost systemd[1]: session-42.scope: Deactivated successfully. Dec 6 04:46:12 localhost systemd[1]: session-42.scope: Consumed 29.896s CPU time. Dec 6 04:46:12 localhost systemd-logind[760]: Session 42 logged out. Waiting for processes to exit. Dec 6 04:46:12 localhost systemd-logind[760]: Removed session 42. Dec 6 04:46:14 localhost nova_compute[187174]: 2025-12-06 09:46:14.994 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:16 localhost nova_compute[187174]: 2025-12-06 09:46:16.123 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": 
"ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:46:16 localhost nova_compute[187174]: 2025-12-06 09:46:16.147 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:46:16 localhost nova_compute[187174]: 2025-12-06 09:46:16.148 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:46:16 localhost nova_compute[187174]: 2025-12-06 09:46:16.148 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:46:16 localhost nova_compute[187174]: 2025-12-06 09:46:16.149 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:46:16 localhost nova_compute[187174]: 2025-12-06 09:46:16.149 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:46:16 localhost nova_compute[187174]: 2025-12-06 09:46:16.150 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:46:16 localhost nova_compute[187174]: 2025-12-06 09:46:16.150 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:46:16 localhost nova_compute[187174]: 2025-12-06 09:46:16.151 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:46:16 localhost openstack_network_exporter[199751]: ERROR 09:46:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:46:16 localhost openstack_network_exporter[199751]: ERROR 09:46:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:46:16 localhost openstack_network_exporter[199751]: ERROR 09:46:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:46:16 localhost openstack_network_exporter[199751]: ERROR 09:46:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:46:16 localhost openstack_network_exporter[199751]: Dec 6 04:46:16 localhost openstack_network_exporter[199751]: ERROR 09:46:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:46:16 localhost openstack_network_exporter[199751]: Dec 
6 04:46:17 localhost nova_compute[187174]: 2025-12-06 09:46:17.300 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:17 localhost sshd[207482]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:46:18 localhost systemd-logind[760]: New session 43 of user zuul. Dec 6 04:46:18 localhost systemd[1]: Started Session 43 of User zuul. Dec 6 04:46:19 localhost python3.9[207595]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:46:19 localhost python3.9[207705]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:46:20 localhost nova_compute[187174]: 2025-12-06 09:46:20.043 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:20 localhost python3.9[207815]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:46:20 localhost podman[207849]: 2025-12-06 09:46:20.560492576 +0000 UTC m=+0.087670692 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.6, release=1755695350, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.) Dec 6 04:46:20 localhost podman[207849]: 2025-12-06 09:46:20.603270145 +0000 UTC m=+0.130448221 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal 
rhel9, release=1755695350, vendor=Red Hat, Inc.) Dec 6 04:46:20 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:46:22 localhost python3.9[207943]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:46:22 localhost nova_compute[187174]: 2025-12-06 09:46:22.335 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:46:22 localhost podman[207993]: 2025-12-06 09:46:22.576943964 +0000 UTC m=+0.101735205 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': 
{'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:46:22 localhost podman[207993]: 2025-12-06 09:46:22.608216474 +0000 UTC m=+0.133007705 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 04:46:22 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:46:22 localhost python3.9[208050]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014380.4607754-104-187626258662220/.source.yaml follow=False _original_basename=neutron_sriov_agent.yaml.j2 checksum=d3942d8476d006ea81540d2a1d96dd9d67f33f5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:46:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:22.984 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling 
/usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 04:46:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:22.984 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.038 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.039 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8006d1e5-a19d-4f39-92ce-591b656cbecb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:46:22.985164', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6cbfe582-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.1952754, 'message_signature': 'f45680b576d101414027a6be0a63d1b7131126dce85fe860e64a4342d65673ef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:46:22.985164', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6cbfffd6-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.1952754, 'message_signature': '9604da0db00bb8c02a04eb14f253986ca3237173efd9c1418cc1eeca71b6f837'}]}, 'timestamp': '2025-12-06 09:46:23.039515', '_unique_id': '9121aa48ad244821af1628020ce4137f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.041 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.042 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.042 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.042 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '417b9f22-26cb-4c9b-96a2-bf9850e17bec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:46:23.042498', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6cc08708-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.1952754, 'message_signature': 'd372bfae21f968a29f1a8f734c29adea4649665fe3dc01fe6360d2179586fe1c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:46:23.042498', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6cc099fa-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.1952754, 'message_signature': 'bedc34573a2e7c854e2305f93de82c0d791b4e8094f3ac653865f237f24a0b24'}]}, 'timestamp': '2025-12-06 09:46:23.043437', '_unique_id': 'cf2716ce3e404042bd19822a5e8b2144'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.044 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.045 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 73904128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.046 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '85550e2b-e1d5-4066-abca-47211cc3fb84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73904128, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:46:23.045739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6cc10660-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.1952754, 'message_signature': 'd961f985fa2e5516a381aebb628ce1c202b2239383d4b233aefbf021c0e5fc70'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:46:23.045739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6cc116d2-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.1952754, 'message_signature': 'a83d3fb9d78783ea3c4b9d5332e4e09a6c171a6ad40187222bb1322a5db9fa2b'}]}, 'timestamp': '2025-12-06 09:46:23.046627', '_unique_id': '23e5eeb79cd5417d890489a30da5b77f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.047 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.048 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.049 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.054 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
notification to notifications. Payload={'message_id': '2d48a11e-7b31-4bec-b95a-e5dc2e52c78a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:46:23.049143', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '6cc24caa-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.259293444, 'message_signature': '301bb4d6fccc5933b9c04f796a428358735da64bf8dbfc55def61c547af83192'}]}, 'timestamp': '2025-12-06 09:46:23.054609', '_unique_id': '7a59d16adad94acc89ab26af23640a6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.055 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:46:23.057 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.076 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.076 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48622112-7511-4efa-b8c2-2766a5c159ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:46:23.057168', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6cc5b002-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.26729125, 'message_signature': 'ee4a63c6817fddf7a705f41423c7a1aec207bf95a7d69f80f130a2db450c374e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:46:23.057168', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6cc5c420-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.26729125, 'message_signature': '679e65f8cd2877f2eb7559e8ce5f76b0e278aacefaaa7a5d994ecf17b3b3b27b'}]}, 'timestamp': '2025-12-06 09:46:23.077286', '_unique_id': '37780ec8f3244afe8f0424a0aeddea57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.078 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:46:23.080 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.080 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d3f37c9-810e-4cb5-aace-cc167bfe7b09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:46:23.080217', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': 
{'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '6cc64896-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.259293444, 'message_signature': '09799fc0c058f241ab41db35cb62f6f87391dd597582872b8c2f5c6568d65929'}]}, 'timestamp': '2025-12-06 09:46:23.080701', '_unique_id': '7876f7fc170f469c92bedce12479198f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:46:23.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.082 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.106 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 53580000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '14a24edb-3311-42ab-8872-1ba0a68d28f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 53580000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:46:23.082941', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '6cca5b5c-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.316738065, 'message_signature': '2959c7b8772ffe90d1dd64dce9fa290752783c5a1bd54da058dc0aa9919e8d3f'}]}, 'timestamp': '2025-12-06 09:46:23.107466', '_unique_id': '1b82f2beac4247f5b219a6599d9b3126'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.108 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:46:23.110 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.111 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15e0a115-b11d-4110-9dbe-3c878e52e408', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:46:23.110234', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 
'vda'}, 'message_id': '6ccae3d8-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.26729125, 'message_signature': '4a43b6109612ab639498e9c102c738426b9a0bab9eb9f7fb981de7183c194bb6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:46:23.110234', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6ccb0160-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.26729125, 'message_signature': 'bc05461dfd1b5d9d51aa5875b266e2b07e71bd40e145e09b03137a74f11a46a0'}]}, 'timestamp': '2025-12-06 09:46:23.111758', '_unique_id': 'd80c745fb5224a7694116e586f1095bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.113 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:46:23.114 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ef5408c-3817-41e5-bba4-363b29256dde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:46:23.114763', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '6ccb947c-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.259293444, 'message_signature': 
'4e38aecbbb22d166d7b9be4441776ef2fbbe34b0626562408a19f71a840ddd3e'}]}, 'timestamp': '2025-12-06 09:46:23.115516', '_unique_id': 'cd0c73a6d632430a96c61dcef7c98c35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR 
oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.118 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fb30ab1-ccab-42f7-aeb6-ba6fb26e413c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:46:23.118351', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '6ccc1abe-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.259293444, 'message_signature': '529129ed580b50508592a18a13aa4689962127dadb5e68d327908a2534e1955d'}]}, 'timestamp': '2025-12-06 09:46:23.118998', '_unique_id': '14a11c3015bc4a65b0e6857a97de8413'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR 
oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.120 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.121 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31064064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.122 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5be853a0-65e0-4071-ab1a-f3e3a5a74b2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31064064, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:46:23.121495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6ccc953e-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.26729125, 'message_signature': '3c117f366dd53b39becea136f0bb90833e6b2102c6c5995557a6cdb1a3c143d8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:46:23.121495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6ccca966-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.26729125, 'message_signature': '2edad2d73a13710ebaee76d9508372eb33db7e4f0ced9942065950fca44a734f'}]}, 'timestamp': '2025-12-06 09:46:23.122488', '_unique_id': 'e3486d9b71e547ee9686261d389626ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.123 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.124 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.125 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 301237008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.125 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 37411248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '579433da-3aa1-49f8-b84a-82f261126e17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 301237008, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:46:23.125025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6ccd1ed2-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.1952754, 'message_signature': '0619bdf9169a48d6a84b25ad754df9639598589f05417f6647cbef07ba002716'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 37411248, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:46:23.125025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6ccd3020-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.1952754, 'message_signature': '9b7dd846bd3a178f946136775b0cb180b351e7247e1eb2e9cd058686116016c3'}]}, 'timestamp': '2025-12-06 09:46:23.125961', '_unique_id': 'e0613727093e4b658f395e5e2503dd2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.126 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.128 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.128 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 947163713 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.129 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 9516486 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dff50c7e-53d0-4a23-95f7-825d2f8d841d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 947163713, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:46:23.128488', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6ccda64a-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.1952754, 'message_signature': 'e3fb2fea6213b639b8034edaa0c46fb45f2510302236e9e610e44992524be2f1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9516486, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:46:23.128488', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6ccdb91e-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.1952754, 'message_signature': '81eb96684a9ad742986bba48822cc27994dbaefca7858e7ef6b10d731cc66732'}]}, 'timestamp': '2025-12-06 09:46:23.129437', '_unique_id': '0abf0ff65bb34fe4b93533210d4a6e1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging
self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 
12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging 
self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.132 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5e21d06-77db-4200-9327-236bd4018b5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 123, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:46:23.132588', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '6cce4c1c-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.259293444, 'message_signature': 'c3365ca9c9fad12efb3a4efae547f82bb6ff1da37a593f3fc2098689327998f9'}]}, 'timestamp': '2025-12-06 09:46:23.133288', '_unique_id': '354bc511c6764e05b21dcd2463f67a8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:46:23.135 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.135 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.136 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 10762 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06582497-0b22-4883-9a2b-d4a424846225', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10762, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:46:23.136524', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 
1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '6ccedd62-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.259293444, 'message_signature': '2597af86afe425200b9a12fa382296950ea6cb1503571db20f77e21d74ca79ce'}]}, 'timestamp': '2025-12-06 09:46:23.136913', '_unique_id': '9cc5dc40441f4fe698ecb170b569d6de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.137 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.138 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.138 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 96 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35d6668b-003f-4e12-9c93-172a288e2c46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 96, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:46:23.138470', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '6ccf2862-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.259293444, 'message_signature': 'f0c9fb844a8634d6883d0cacce3dafc63c65379dba341ab9793abcf254860ad2'}]}, 'timestamp': '2025-12-06 09:46:23.138797', '_unique_id': '4c337a241fed439485d076d675ac55cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.139 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.140 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.140 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef67ba35-29eb-4e9b-9dee-1b9a6ff3f1cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:46:23.140217', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '6ccf6da4-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.259293444, 'message_signature': 'f29a0d12f3c0c6c1a40831b586b1865af9b2c32b06a90c7abf86ac6f502ca64c'}]}, 'timestamp': '2025-12-06 09:46:23.140544', '_unique_id': '38b67387f48242e6b64d7d29d47fd63e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cd58115-d22d-486b-a587-e61de7605a19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:46:23.142003', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '6ccfb23c-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.259293444, 'message_signature': '60a974d7796bbab5700237afba95d68779407b40ec1bc1eebfa0ce436a755a68'}]}, 'timestamp': '2025-12-06 09:46:23.142299', '_unique_id': 'fdc0f22abc374685a67b7d439828b0f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:46:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.142 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:46:23.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.143 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fc5b8c7-4ec0-47b9-91f1-a1fc9161a368', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:46:23.143755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6ccff80a-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.1952754, 'message_signature': '478c1a7ad4736ecf49c702f29ab4386786a4415cfa8d4f076611adcba6ff722a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:46:23.143755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6cd002b4-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.1952754, 'message_signature': 'f030ca42da4289b3fa9dfa59e9a1ef321c081b727f0ec6321aa008844f9b806c'}]}, 'timestamp': '2025-12-06 09:46:23.144339', '_unique_id': 'efa30a79b67142899340a6761ba9cf0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.144 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:46:23.145 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.145 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4247d98-b6fa-4966-b20d-613db1c23e7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:46:23.145751', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '6cd0462a-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.316738065, 'message_signature': 
'0a1bb7d7b645b9ff04750a01f8827cbe08238b5dd220e056712603d102e029ec'}]}, 'timestamp': '2025-12-06 09:46:23.146105', '_unique_id': '65ae3b3107364bd588a40e7d52c02dbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR 
oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:46:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.146 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.147 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 9699 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a04e9cc-0473-46c5-a788-85fd99cd5f89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9699, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:46:23.147651', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '6cd08fa4-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10770.259293444, 'message_signature': '084f1ad4bba111a1591f77350165add2adcae915faf3cad80a066cf587528873'}]}, 'timestamp': '2025-12-06 09:46:23.147989', '_unique_id': 'ccec4d7f46d3472297e399f89b85ecfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:46:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:46:23.148 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:46:23 localhost podman[197801]: time="2025-12-06T09:46:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 6 04:46:23 localhost podman[197801]: @ - - [06/Dec/2025:09:46:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 139150 "" "Go-http-client/1.1"
Dec 6 04:46:23 localhost podman[197801]: @ - - [06/Dec/2025:09:46:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 14555 "" "Go-http-client/1.1"
Dec 6 04:46:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17710 DF PROTO=TCP SPT=51408 DPT=9102 SEQ=3579882559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBC6D8A0000000001030307)
Dec 6 04:46:24 localhost python3.9[208159]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:46:24 localhost python3.9[208245]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014383.7291486-149-61401879359437/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:46:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17711 DF PROTO=TCP SPT=51408 DPT=9102 SEQ=3579882559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBC71870000000001030307)
Dec 6 04:46:25 localhost nova_compute[187174]: 2025-12-06 09:46:25.079 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:46:25 localhost python3.9[208353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:46:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41520 DF PROTO=TCP SPT=34822 DPT=9102 SEQ=2988056478 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBC73880000000001030307)
Dec 6 04:46:25 localhost python3.9[208439]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014384.939738-149-187992437376350/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:46:26 localhost python3.9[208547]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:46:26 localhost python3.9[208633]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014386.0801318-149-185117053271878/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=d9fe6c1049fd821a1212c28f12e655257aaf9f22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:46:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17712 DF PROTO=TCP SPT=51408 DPT=9102 SEQ=3579882559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBC79870000000001030307)
Dec 6 04:46:27 localhost nova_compute[187174]: 2025-12-06 09:46:27.358 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:46:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7242 DF PROTO=TCP SPT=44230 DPT=9102 SEQ=4291546837 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBC7D870000000001030307)
Dec 6 04:46:28 localhost python3.9[208741]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:46:28 localhost python3.9[208827]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014387.8182106-323-216984727495671/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=7b95db33f224192f8ca0203814c2ca32d6e3e2c0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:46:29 localhost python3.9[208935]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:46:30 localhost nova_compute[187174]: 2025-12-06 09:46:30.110 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:46:30 localhost python3.9[209047]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:46:31 localhost python3.9[209157]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:46:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17713 DF PROTO=TCP SPT=51408 DPT=9102 SEQ=3579882559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBC89480000000001030307)
Dec 6 04:46:31 localhost python3.9[209214]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:46:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.
Dec 6 04:46:32 localhost systemd[1]: tmp-crun.paRq6k.mount: Deactivated successfully.
Dec 6 04:46:32 localhost podman[209324]: 2025-12-06 09:46:32.129036013 +0000 UTC m=+0.100776728 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 6 04:46:32 localhost podman[209324]: 2025-12-06 09:46:32.170148213 +0000 UTC m=+0.141888918 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 6 04:46:32 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully.
Dec 6 04:46:32 localhost python3.9[209325]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:46:32 localhost nova_compute[187174]: 2025-12-06 09:46:32.410 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:46:33 localhost python3.9[209405]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 6 04:46:34 localhost python3.9[209515]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:46:35 localhost nova_compute[187174]: 2025-12-06 09:46:35.113 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:46:35 localhost python3.9[209625]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:46:36 localhost python3.9[209682]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:46:36 localhost sshd[209683]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:46:36 localhost python3.9[209794]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 6 04:46:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.
Dec 6 04:46:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.
Dec 6 04:46:37 localhost podman[209853]: 2025-12-06 09:46:37.063723312 +0000 UTC m=+0.079628346 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 6 04:46:37 localhost podman[209853]: 2025-12-06 09:46:37.075938191 +0000 UTC m=+0.091843255 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 6 04:46:37 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully.
Dec 6 04:46:37 localhost systemd[1]: tmp-crun.rFMIkQ.mount: Deactivated successfully.
Dec 6 04:46:37 localhost podman[209852]: 2025-12-06 09:46:37.129442315 +0000 UTC m=+0.146323687 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Dec 6 04:46:37 localhost podman[209852]: 2025-12-06 09:46:37.138047009 +0000 UTC m=+0.154928401 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Dec 6 04:46:37 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully.
Dec 6 04:46:37 localhost python3.9[209851]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:46:37 localhost nova_compute[187174]: 2025-12-06 09:46:37.449 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:46:38 localhost python3.9[210000]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 04:46:38 localhost systemd[1]: Reloading.
Dec 6 04:46:38 localhost systemd-rc-local-generator[210026]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:46:38 localhost systemd-sysv-generator[210029]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:46:38 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:46:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17714 DF PROTO=TCP SPT=51408 DPT=9102 SEQ=3579882559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBCA9870000000001030307) Dec 6 04:46:39 localhost python3.9[210147]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:46:39 localhost python3.9[210204]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:46:40 localhost nova_compute[187174]: 2025-12-06 09:46:40.153 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:40 localhost python3.9[210314]: ansible-ansible.legacy.stat Invoked with 
path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:46:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:46:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:46:40 localhost podman[210373]: 2025-12-06 09:46:40.985332712 +0000 UTC m=+0.097637713 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true) Dec 6 04:46:40 localhost podman[210373]: 2025-12-06 09:46:40.999335594 +0000 UTC m=+0.111640605 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:46:41 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:46:41 localhost podman[210371]: 2025-12-06 09:46:41.074054304 +0000 UTC m=+0.186372696 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible) Dec 6 04:46:41 localhost python3.9[210372]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:46:41 localhost podman[210371]: 2025-12-06 09:46:41.108390524 +0000 UTC m=+0.220708896 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 04:46:41 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 04:46:42 localhost python3.9[210515]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:46:42 localhost systemd[1]: Reloading. Dec 6 04:46:42 localhost systemd-rc-local-generator[210539]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:46:42 localhost systemd-sysv-generator[210545]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:46:42 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:46:42 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:46:42 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:46:42 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:46:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:46:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:46:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:46:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:46:42 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:46:42 localhost nova_compute[187174]: 2025-12-06 09:46:42.477 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:42 localhost systemd[1]: Starting Create netns directory... Dec 6 04:46:42 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 6 04:46:42 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 6 04:46:42 localhost systemd[1]: Finished Create netns directory. 
Dec 6 04:46:43 localhost python3.9[210667]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:46:45 localhost python3.9[210777]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:46:45 localhost nova_compute[187174]: 2025-12-06 09:46:45.182 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:45 localhost python3.9[210865]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014404.5873737-734-232755142419025/.source.json _original_basename=.05rupiqx follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:46:46 localhost openstack_network_exporter[199751]: ERROR 09:46:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:46:46 localhost openstack_network_exporter[199751]: ERROR 09:46:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:46:46 localhost openstack_network_exporter[199751]: ERROR 09:46:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control 
socket files found for the ovs db server Dec 6 04:46:46 localhost openstack_network_exporter[199751]: ERROR 09:46:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:46:46 localhost openstack_network_exporter[199751]: Dec 6 04:46:46 localhost openstack_network_exporter[199751]: ERROR 09:46:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:46:46 localhost openstack_network_exporter[199751]: Dec 6 04:46:46 localhost python3.9[210975]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:46:47 localhost nova_compute[187174]: 2025-12-06 09:46:47.507 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:49 localhost python3.9[211283]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False Dec 6 04:46:49 localhost python3.9[211393]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Dec 6 04:46:50 localhost nova_compute[187174]: 2025-12-06 09:46:50.296 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:46:50 localhost podman[211504]: 2025-12-06 09:46:50.776124807 +0000 UTC m=+0.092394580 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, version=9.6) Dec 6 04:46:50 localhost podman[211504]: 2025-12-06 09:46:50.796282421 +0000 UTC m=+0.112552214 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 6 04:46:50 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 04:46:50 localhost python3.9[211503]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 6 04:46:52 localhost nova_compute[187174]: 2025-12-06 09:46:52.546 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:46:53 localhost podman[197801]: time="2025-12-06T09:46:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:46:53 localhost podman[197801]: @ - - [06/Dec/2025:09:46:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 139150 "" "Go-http-client/1.1" Dec 6 04:46:53 localhost podman[197801]: @ - - [06/Dec/2025:09:46:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 14566 "" "Go-http-client/1.1" Dec 6 04:46:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:46:53 localhost podman[211567]: 2025-12-06 09:46:53.544762834 +0000 UTC m=+0.076502313 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:46:53 localhost podman[211567]: 2025-12-06 09:46:53.55414171 +0000 UTC m=+0.085881129 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:46:53 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:46:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8849 DF PROTO=TCP SPT=58434 DPT=9102 SEQ=1701162012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBCE2B90000000001030307)
Dec 6 04:46:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8850 DF PROTO=TCP SPT=58434 DPT=9102 SEQ=1701162012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBCE6C70000000001030307)
Dec 6 04:46:55 localhost nova_compute[187174]: 2025-12-06 09:46:55.352 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:46:55 localhost python3[211681]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 6 04:46:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17715 DF PROTO=TCP SPT=51408 DPT=9102 SEQ=3579882559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBCE9880000000001030307)
Dec 6 04:46:55 localhost podman[211718]:
Dec 6 04:46:55 localhost podman[211718]: 2025-12-06 09:46:55.991750874 +0000 UTC m=+0.081530710 container create ae1657c55594a623fdaa5ea19f6111124908f336aab09a73d51011d326c42d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '35d11cefa081e075bfc5c0e5746b241dda7109baa7fbb0d47b022122e0637b29'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_id=neutron_sriov_agent, tcib_managed=true)
Dec 6 04:46:55 localhost podman[211718]: 2025-12-06 09:46:55.946686428 +0000 UTC m=+0.036466314 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Dec 6 04:46:55 localhost python3[211681]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=35d11cefa081e075bfc5c0e5746b241dda7109baa7fbb0d47b022122e0637b29 --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '35d11cefa081e075bfc5c0e5746b241dda7109baa7fbb0d47b022122e0637b29'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Dec 6 04:46:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8851 DF PROTO=TCP SPT=58434 DPT=9102 SEQ=1701162012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBCEEC70000000001030307)
Dec 6 04:46:57 localhost python3.9[211865]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:46:57 localhost nova_compute[187174]: 2025-12-06 09:46:57.548 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:46:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41521 DF PROTO=TCP SPT=34822 DPT=9102 SEQ=2988056478 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBCF1880000000001030307)
Dec 6 04:46:58 localhost python3.9[211977]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:46:58 localhost python3.9[212032]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 6 04:46:59 localhost python3.9[212141]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014418.8300903-998-255563982604469/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 6 04:47:00 localhost python3.9[212196]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 6 04:47:00 localhost systemd[1]: Reloading.
Dec 6 04:47:00 localhost systemd-rc-local-generator[212221]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:47:00 localhost systemd-sysv-generator[212226]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:47:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:47:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:00 localhost nova_compute[187174]: 2025-12-06 09:47:00.353 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:47:00 localhost python3.9[212287]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 6 04:47:01 localhost systemd[1]: Reloading.
Dec 6 04:47:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8852 DF PROTO=TCP SPT=58434 DPT=9102 SEQ=1701162012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBCFE870000000001030307)
Dec 6 04:47:01 localhost systemd-sysv-generator[212316]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 6 04:47:01 localhost systemd-rc-local-generator[212311]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 6 04:47:01 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:01 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:01 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:01 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 6 04:47:01 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:01 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:01 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:01 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 6 04:47:01 localhost systemd[1]: Starting neutron_sriov_agent container...
Dec 6 04:47:01 localhost systemd[1]: Started libcrun container.
Dec 6 04:47:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7dc5dac922793485f20f5d826840acbe2b33df00fac57fb59cf065c3595d034/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 6 04:47:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7dc5dac922793485f20f5d826840acbe2b33df00fac57fb59cf065c3595d034/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 6 04:47:01 localhost podman[212328]: 2025-12-06 09:47:01.461936094 +0000 UTC m=+0.117964232 container init ae1657c55594a623fdaa5ea19f6111124908f336aab09a73d51011d326c42d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20251125, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '35d11cefa081e075bfc5c0e5746b241dda7109baa7fbb0d47b022122e0637b29'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 6 04:47:01 localhost podman[212328]: 2025-12-06 09:47:01.471285979 +0000 UTC m=+0.127314117 container start ae1657c55594a623fdaa5ea19f6111124908f336aab09a73d51011d326c42d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '35d11cefa081e075bfc5c0e5746b241dda7109baa7fbb0d47b022122e0637b29'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20251125, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=neutron_sriov_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 6 04:47:01 localhost podman[212328]: neutron_sriov_agent
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: + sudo -E kolla_set_configs
Dec 6 04:47:01 localhost systemd[1]: Started neutron_sriov_agent container.
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Validating config file
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Copying service configuration files
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Writing out command to execute
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/20509a6a-c438-4c5e-82a7-fe0ea272b309.pid.haproxy
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/20509a6a-c438-4c5e-82a7-fe0ea272b309.conf
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: ++ cat /run_command
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: + CMD=/usr/bin/neutron-sriov-nic-agent
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: + ARGS=
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: + sudo kolla_copy_cacerts
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: + [[ ! -n '' ]]
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: + . kolla_extend_start
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: + umask 0022
Dec 6 04:47:01 localhost neutron_sriov_agent[212343]: + exec /usr/bin/neutron-sriov-nic-agent
Dec 6 04:47:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.
Dec 6 04:47:02 localhost nova_compute[187174]: 2025-12-06 09:47:02.551 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:47:02 localhost systemd[1]: tmp-crun.ykATin.mount: Deactivated successfully.
Dec 6 04:47:02 localhost podman[212445]: 2025-12-06 09:47:02.575982604 +0000 UTC m=+0.104584950 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 04:47:02 localhost podman[212445]: 2025-12-06 09:47:02.640306877 +0000 UTC m=+0.168909203 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 6 04:47:02 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully.
Dec 6 04:47:02 localhost python3.9[212486]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 6 04:47:02 localhost systemd[1]: Stopping neutron_sriov_agent container...
Dec 6 04:47:03 localhost systemd[1]: libpod-ae1657c55594a623fdaa5ea19f6111124908f336aab09a73d51011d326c42d70.scope: Deactivated successfully.
Dec 6 04:47:03 localhost systemd[1]: libpod-ae1657c55594a623fdaa5ea19f6111124908f336aab09a73d51011d326c42d70.scope: Consumed 1.515s CPU time.
Dec 6 04:47:03 localhost podman[212496]: 2025-12-06 09:47:03.00839767 +0000 UTC m=+0.075984747 container died ae1657c55594a623fdaa5ea19f6111124908f336aab09a73d51011d326c42d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=neutron_sriov_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '35d11cefa081e075bfc5c0e5746b241dda7109baa7fbb0d47b022122e0637b29'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 6 04:47:03 localhost podman[212496]: 2025-12-06 09:47:03.06069138 +0000 UTC m=+0.128278457 container cleanup ae1657c55594a623fdaa5ea19f6111124908f336aab09a73d51011d326c42d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '35d11cefa081e075bfc5c0e5746b241dda7109baa7fbb0d47b022122e0637b29'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 6 04:47:03 localhost podman[212496]: neutron_sriov_agent
Dec 6 04:47:03 localhost podman[212509]: 2025-12-06 09:47:03.063485062 +0000 UTC m=+0.053886918 container cleanup ae1657c55594a623fdaa5ea19f6111124908f336aab09a73d51011d326c42d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '35d11cefa081e075bfc5c0e5746b241dda7109baa7fbb0d47b022122e0637b29'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Dec 6 04:47:03 localhost podman[212523]: 2025-12-06 09:47:03.136190542 +0000 UTC m=+0.048277303 container cleanup ae1657c55594a623fdaa5ea19f6111124908f336aab09a73d51011d326c42d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_sriov_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '35d11cefa081e075bfc5c0e5746b241dda7109baa7fbb0d47b022122e0637b29'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 6 04:47:03 localhost podman[212523]: neutron_sriov_agent
Dec 6 04:47:03 localhost systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully.
Dec 6 04:47:03 localhost systemd[1]: Stopped neutron_sriov_agent container.
Dec 6 04:47:03 localhost systemd[1]: Starting neutron_sriov_agent container...
Dec 6 04:47:03 localhost systemd[1]: Started libcrun container.
Dec 6 04:47:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7dc5dac922793485f20f5d826840acbe2b33df00fac57fb59cf065c3595d034/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 6 04:47:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7dc5dac922793485f20f5d826840acbe2b33df00fac57fb59cf065c3595d034/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 6 04:47:03 localhost podman[212535]: 2025-12-06 09:47:03.279148429 +0000 UTC m=+0.112724209 container init ae1657c55594a623fdaa5ea19f6111124908f336aab09a73d51011d326c42d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '35d11cefa081e075bfc5c0e5746b241dda7109baa7fbb0d47b022122e0637b29'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 6 04:47:03 localhost podman[212535]: 2025-12-06 09:47:03.287699761 +0000 UTC m=+0.121275551 container start ae1657c55594a623fdaa5ea19f6111124908f336aab09a73d51011d326c42d70 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, config_id=neutron_sriov_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '35d11cefa081e075bfc5c0e5746b241dda7109baa7fbb0d47b022122e0637b29'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 6 04:47:03 localhost podman[212535]: neutron_sriov_agent
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: + sudo -E kolla_set_configs
Dec 6 04:47:03 localhost systemd[1]: Started neutron_sriov_agent container.
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Validating config file
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Copying service configuration files
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Writing out command to execute
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/00c6e44062d81bae38ea1c96678049e54d3f27d226bb6f9651816ab13eb94f06
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/20509a6a-c438-4c5e-82a7-fe0ea272b309.pid.haproxy
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/20509a6a-c438-4c5e-82a7-fe0ea272b309.conf
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: ++ cat /run_command
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: + CMD=/usr/bin/neutron-sriov-nic-agent
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: + ARGS=
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: + sudo kolla_copy_cacerts
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: + [[ ! -n '' ]]
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: + . kolla_extend_start
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: + umask 0022
Dec 6 04:47:03 localhost neutron_sriov_agent[212548]: + exec /usr/bin/neutron-sriov-nic-agent
Dec 6 04:47:04 localhost systemd[1]: session-43.scope: Deactivated successfully.
Dec 6 04:47:04 localhost systemd[1]: session-43.scope: Consumed 23.552s CPU time.
Dec 6 04:47:04 localhost systemd-logind[760]: Session 43 logged out. Waiting for processes to exit.
Dec 6 04:47:04 localhost systemd-logind[760]: Removed session 43.
Dec 6 04:47:04 localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:04.940 2 INFO neutron.common.config [-] Logging enabled!#033[00m
Dec 6 04:47:04 localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:04.940 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43#033[00m
Dec 6 04:47:04 localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:04.941 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}#033[00m
Dec 6 04:47:04 localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:04.941 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}#033[00m
Dec 6 04:47:04 localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:04.941 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}#033[00m
Dec 6 04:47:04 localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:04.941 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}#033[00m
Dec 6 04:47:04 localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:04.941 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005548798.ooo.test'}#033[00m
Dec 6 04:47:04 localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:04.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-6935a6ba-5b79-4fad-9dea-1b28a15e75a1 - - - - - -] RPC agent_id: nic-switch-agent.np0005548798.ooo.test#033[00m
Dec 6 04:47:04 localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:04.946 2 INFO neutron.agent.agent_extensions_manager [None req-6935a6ba-5b79-4fad-9dea-1b28a15e75a1 - - - - - -] Loaded agent extensions: ['qos']#033[00m
Dec 6 04:47:04
localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:04.946 2 INFO neutron.agent.agent_extensions_manager [None req-6935a6ba-5b79-4fad-9dea-1b28a15e75a1 - - - - - -] Initializing agent extension 'qos'#033[00m Dec 6 04:47:05 localhost nova_compute[187174]: 2025-12-06 09:47:05.374 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:05 localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:05.541 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-6935a6ba-5b79-4fad-9dea-1b28a15e75a1 - - - - - -] Agent initialized successfully, now running... #033[00m Dec 6 04:47:05 localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:05.542 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-6935a6ba-5b79-4fad-9dea-1b28a15e75a1 - - - - - -] SRIOV NIC Agent RPC Daemon Started!#033[00m Dec 6 04:47:05 localhost neutron_sriov_agent[212548]: 2025-12-06 09:47:05.543 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-6935a6ba-5b79-4fad-9dea-1b28a15e75a1 - - - - - -] Agent out of sync with plugin!#033[00m Dec 6 04:47:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:47:06.667 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:47:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:47:06.667 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:47:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:47:06.669 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" 
"released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:47:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:47:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:47:07 localhost nova_compute[187174]: 2025-12-06 09:47:07.554 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:07 localhost podman[212581]: 2025-12-06 09:47:07.558733176 +0000 UTC m=+0.090862865 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 04:47:07 localhost podman[212582]: 2025-12-06 09:47:07.539591003 +0000 UTC m=+0.072669379 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd) Dec 6 04:47:07 localhost podman[212581]: 2025-12-06 09:47:07.596445647 +0000 UTC 
m=+0.128575376 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 04:47:07 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 04:47:07 localhost podman[212582]: 2025-12-06 09:47:07.625326246 +0000 UTC m=+0.158404622 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 04:47:07 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:47:09 localhost sshd[212622]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:47:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8853 DF PROTO=TCP SPT=58434 DPT=9102 SEQ=1701162012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBD1F870000000001030307) Dec 6 04:47:10 localhost sshd[212624]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:47:10 localhost systemd-logind[760]: New session 44 of user zuul. Dec 6 04:47:10 localhost systemd[1]: Started Session 44 of User zuul. Dec 6 04:47:10 localhost nova_compute[187174]: 2025-12-06 09:47:10.382 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:10 localhost nova_compute[187174]: 2025-12-06 09:47:10.876 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:47:10 localhost nova_compute[187174]: 2025-12-06 09:47:10.877 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:47:10 localhost nova_compute[187174]: 2025-12-06 09:47:10.944 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:47:10 localhost nova_compute[187174]: 2025-12-06 09:47:10.944 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - 
- - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:47:10 localhost nova_compute[187174]: 2025-12-06 09:47:10.945 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:47:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:47:11 localhost python3.9[212735]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 6 04:47:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:47:11 localhost podman[212737]: 2025-12-06 09:47:11.56774479 +0000 UTC m=+0.089097003 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck 
compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 04:47:11 localhost podman[212737]: 2025-12-06 09:47:11.607218942 +0000 UTC m=+0.128571175 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': 
'/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:47:11 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 04:47:11 localhost podman[212736]: 2025-12-06 09:47:11.608200091 +0000 UTC m=+0.125510895 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 04:47:11 localhost podman[212736]: 2025-12-06 09:47:11.692134522 +0000 UTC m=+0.209445306 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 04:47:11 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:47:11 localhost nova_compute[187174]: 2025-12-06 09:47:11.874 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:47:11 localhost nova_compute[187174]: 2025-12-06 09:47:11.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:47:11 localhost nova_compute[187174]: 2025-12-06 09:47:11.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:47:12 localhost nova_compute[187174]: 2025-12-06 09:47:12.344 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:47:12 localhost nova_compute[187174]: 2025-12-06 09:47:12.344 187178 DEBUG oslo_concurrency.lockutils [None 
req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:47:12 localhost nova_compute[187174]: 2025-12-06 09:47:12.345 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:47:12 localhost nova_compute[187174]: 2025-12-06 09:47:12.345 187178 DEBUG nova.objects.instance [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:47:12 localhost nova_compute[187174]: 2025-12-06 09:47:12.556 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:12 localhost python3.9[212890]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.547 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.608 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.608 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.609 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.609 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.628 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.629 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.629 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.629 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.693 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.782 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.783 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:47:13 localhost python3.9[212954]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.838 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.839 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.910 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.911 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:47:13 localhost nova_compute[187174]: 2025-12-06 09:47:13.979 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:47:14 localhost nova_compute[187174]: 2025-12-06 09:47:14.181 187178 WARNING nova.virt.libvirt.driver [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:47:14 localhost nova_compute[187174]: 2025-12-06 09:47:14.183 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12721MB free_disk=387.3093376159668GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:47:14 localhost nova_compute[187174]: 2025-12-06 09:47:14.183 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:47:14 localhost nova_compute[187174]: 2025-12-06 09:47:14.183 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:47:14 localhost nova_compute[187174]: 2025-12-06 09:47:14.260 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:47:14 localhost nova_compute[187174]: 2025-12-06 09:47:14.260 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:47:14 localhost nova_compute[187174]: 2025-12-06 09:47:14.261 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:47:14 localhost nova_compute[187174]: 2025-12-06 09:47:14.299 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:47:14 localhost nova_compute[187174]: 2025-12-06 09:47:14.315 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:47:14 localhost 
nova_compute[187174]: 2025-12-06 09:47:14.318 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:47:14 localhost nova_compute[187174]: 2025-12-06 09:47:14.318 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:47:14 localhost nova_compute[187174]: 2025-12-06 09:47:14.584 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:47:14 localhost nova_compute[187174]: 2025-12-06 09:47:14.584 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:47:15 localhost nova_compute[187174]: 2025-12-06 09:47:15.418 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:16 localhost openstack_network_exporter[199751]: ERROR 09:47:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:47:16 localhost openstack_network_exporter[199751]: ERROR 09:47:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:47:16 localhost openstack_network_exporter[199751]: ERROR 09:47:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:47:16 localhost openstack_network_exporter[199751]: ERROR 09:47:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:47:16 localhost openstack_network_exporter[199751]: Dec 6 04:47:16 localhost openstack_network_exporter[199751]: ERROR 09:47:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:47:16 localhost openstack_network_exporter[199751]: Dec 6 04:47:16 localhost nova_compute[187174]: 2025-12-06 09:47:16.876 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:47:17 localhost nova_compute[187174]: 2025-12-06 09:47:17.560 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:18 localhost python3.9[213078]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False 
scope=system no_block=False force=None Dec 6 04:47:19 localhost python3.9[213191]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:20 localhost python3.9[213301]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:20 localhost nova_compute[187174]: 2025-12-06 09:47:20.447 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:20 localhost python3.9[213411]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:47:21 localhost podman[213521]: 2025-12-06 09:47:21.289726079 +0000 UTC m=+0.088186067 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9) Dec 6 04:47:21 localhost podman[213521]: 2025-12-06 09:47:21.302781633 +0000 UTC m=+0.101241601 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350) Dec 6 04:47:21 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:47:21 localhost python3.9[213522]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:22 localhost python3.9[213650]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:22 localhost nova_compute[187174]: 2025-12-06 09:47:22.562 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:22 localhost python3.9[213760]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:23 
localhost podman[197801]: time="2025-12-06T09:47:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:47:23 localhost podman[197801]: @ - - [06/Dec/2025:09:47:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141108 "" "Go-http-client/1.1" Dec 6 04:47:23 localhost podman[197801]: @ - - [06/Dec/2025:09:47:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15001 "" "Go-http-client/1.1" Dec 6 04:47:23 localhost python3.9[213870]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16613 DF PROTO=TCP SPT=52730 DPT=9102 SEQ=4266058691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBD57E90000000001030307) Dec 6 04:47:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:47:24 localhost podman[213980]: 2025-12-06 09:47:24.112274293 +0000 UTC m=+0.079178631 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:47:24 localhost podman[213980]: 2025-12-06 09:47:24.118636131 +0000 UTC m=+0.085540489 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:47:24 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:47:24 localhost python3.9[213981]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:24 localhost python3.9[214091]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014443.6074688-278-180433149384116/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=3ebfe8ab1da42a1c6ca52429f61716009c5fd177 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16614 DF PROTO=TCP SPT=52730 DPT=9102 SEQ=4266058691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBD5C070000000001030307) Dec 6 04:47:25 localhost nova_compute[187174]: 2025-12-06 09:47:25.491 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:25 localhost python3.9[214199]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8854 DF PROTO=TCP SPT=58434 DPT=9102 SEQ=1701162012 ACK=0 WINDOW=32640 RES=0x00 
SYN URGP=0 OPT (020405500402080A1DBD5F870000000001030307) Dec 6 04:47:26 localhost python3.9[214285]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014445.0821896-323-85159268796221/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:26 localhost python3.9[214393]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:27 localhost python3.9[214479]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014446.139998-323-226342872872750/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16615 DF PROTO=TCP SPT=52730 DPT=9102 SEQ=4266058691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBD64070000000001030307) Dec 6 04:47:27 localhost python3.9[214587]: ansible-ansible.legacy.stat Invoked with 
path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:27 localhost nova_compute[187174]: 2025-12-06 09:47:27.564 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17716 DF PROTO=TCP SPT=51408 DPT=9102 SEQ=3579882559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBD67870000000001030307) Dec 6 04:47:28 localhost python3.9[214673]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014447.126096-323-117969583784290/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=dbfa08af013f0404f617cea8af79f95e53475467 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:29 localhost python3.9[214781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:30 localhost nova_compute[187174]: 2025-12-06 09:47:30.492 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:30 localhost python3.9[214867]: ansible-ansible.legacy.copy Invoked with 
dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014449.2702494-497-178925464499753/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=7b95db33f224192f8ca0203814c2ca32d6e3e2c0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16616 DF PROTO=TCP SPT=52730 DPT=9102 SEQ=4266058691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBD73C70000000001030307) Dec 6 04:47:31 localhost python3.9[214975]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:31 localhost python3.9[215061]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014450.9069638-542-205166071764255/.source follow=False _original_basename=haproxy.j2 checksum=e4288860049c1baef23f6e1bb6c6f91acb5432e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:32 localhost python3.9[215169]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:32 localhost 
nova_compute[187174]: 2025-12-06 09:47:32.759 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:47:33 localhost python3.9[215255]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014451.9161105-542-264337445269193/.source follow=False _original_basename=dnsmasq.j2 checksum=efc19f376a79c40570368e9c2b979cde746f1ea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:33 localhost podman[215256]: 2025-12-06 09:47:33.231016682 +0000 UTC m=+0.073753693 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 04:47:33 localhost podman[215256]: 2025-12-06 09:47:33.261199412 +0000 UTC m=+0.103936453 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 04:47:33 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:47:33 localhost python3.9[215388]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:34 localhost python3.9[215443]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:34 localhost python3.9[215551]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:35 localhost python3.9[215637]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014454.1937473-629-38183613558066/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True 
remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:35 localhost nova_compute[187174]: 2025-12-06 09:47:35.525 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:35 localhost python3.9[215745]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:47:36 localhost python3.9[215857]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:37 localhost python3.9[215967]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:37 localhost python3.9[216024]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:37 localhost nova_compute[187174]: 2025-12-06 09:47:37.761 
187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:47:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:47:38 localhost podman[216136]: 2025-12-06 09:47:38.019372647 +0000 UTC m=+0.081743276 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 6 04:47:38 localhost podman[216136]: 2025-12-06 09:47:38.033206945 +0000 UTC m=+0.095577554 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, config_id=multipathd, 
managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 04:47:38 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:47:38 localhost python3.9[216134]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:38 localhost podman[216135]: 2025-12-06 09:47:38.122684399 +0000 UTC m=+0.186210733 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 04:47:38 localhost podman[216135]: 2025-12-06 09:47:38.158334393 +0000 UTC m=+0.221860707 container exec_died 
4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 04:47:38 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 04:47:38 localhost python3.9[216233]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:39 localhost python3.9[216343]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:47:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16617 DF PROTO=TCP SPT=52730 DPT=9102 SEQ=4266058691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBD93870000000001030307) Dec 6 04:47:40 localhost python3.9[216453]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:40 localhost nova_compute[187174]: 2025-12-06 09:47:40.565 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:40 localhost python3.9[216510]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service 
_original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:47:41 localhost python3.9[216620]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:47:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:47:42 localhost podman[216678]: 2025-12-06 09:47:42.293335612 +0000 UTC m=+0.101628941 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:47:42 localhost podman[216679]: 2025-12-06 09:47:42.33422959 +0000 UTC m=+0.146760559 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm) Dec 6 04:47:42 localhost podman[216678]: 2025-12-06 09:47:42.352877499 +0000 UTC m=+0.161170838 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 04:47:42 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 04:47:42 localhost podman[216679]: 2025-12-06 09:47:42.372246188 +0000 UTC m=+0.184777167 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS) Dec 6 04:47:42 localhost python3.9[216677]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:47:42 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:47:42 localhost nova_compute[187174]: 2025-12-06 09:47:42.763 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:43 localhost python3.9[216822]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:47:43 localhost systemd[1]: Reloading. Dec 6 04:47:43 localhost systemd-sysv-generator[216851]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:47:43 localhost systemd-rc-local-generator[216846]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 04:47:43 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:43 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:43 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:43 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:47:43 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:43 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:43 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:43 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:44 localhost python3.9[216969]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:44 localhost python3.9[217026]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None 
src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:47:45 localhost python3.9[217136]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:45 localhost nova_compute[187174]: 2025-12-06 09:47:45.590 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:46 localhost python3.9[217193]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:47:46 localhost openstack_network_exporter[199751]: ERROR 09:47:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:47:46 localhost openstack_network_exporter[199751]: ERROR 09:47:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:47:46 localhost openstack_network_exporter[199751]: ERROR 09:47:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:47:46 localhost openstack_network_exporter[199751]: ERROR 09:47:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:47:46 localhost openstack_network_exporter[199751]: Dec 6 04:47:46 localhost openstack_network_exporter[199751]: ERROR 09:47:46 appctl.go:174: 
call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:47:46 localhost openstack_network_exporter[199751]: Dec 6 04:47:46 localhost python3.9[217303]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:47:46 localhost systemd[1]: Reloading. Dec 6 04:47:46 localhost systemd-rc-local-generator[217326]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:47:46 localhost systemd-sysv-generator[217331]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:47:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:46 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:46 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:46 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:47:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:47:47 localhost systemd[1]: Starting Create netns directory... Dec 6 04:47:47 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 6 04:47:47 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 6 04:47:47 localhost systemd[1]: Finished Create netns directory. Dec 6 04:47:47 localhost nova_compute[187174]: 2025-12-06 09:47:47.765 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:48 localhost python3.9[217455]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:47:48 localhost python3.9[217565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:47:49 localhost python3.9[217653]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 
src=/home/zuul/.ansible/tmp/ansible-tmp-1765014468.3382938-1073-53333838397080/.source.json _original_basename=.x4md2yjp follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:47:50 localhost python3.9[217763]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:47:50 localhost nova_compute[187174]: 2025-12-06 09:47:50.590 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:47:51 localhost podman[217962]: 2025-12-06 09:47:51.567482936 +0000 UTC m=+0.095447559 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.) Dec 6 04:47:51 localhost podman[217962]: 2025-12-06 09:47:51.588148256 +0000 UTC m=+0.116112919 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 6 04:47:51 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 04:47:52 localhost nova_compute[187174]: 2025-12-06 09:47:52.767 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:52 localhost python3.9[218091]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False Dec 6 04:47:53 localhost podman[197801]: time="2025-12-06T09:47:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:47:53 localhost podman[197801]: @ - - [06/Dec/2025:09:47:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141108 "" "Go-http-client/1.1" Dec 6 04:47:53 localhost podman[197801]: @ - - [06/Dec/2025:09:47:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15007 "" "Go-http-client/1.1" Dec 6 04:47:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64486 DF PROTO=TCP SPT=40292 DPT=9102 SEQ=2652114640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBDCD190000000001030307) Dec 6 04:47:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:47:54 localhost podman[218201]: 2025-12-06 09:47:54.536457163 +0000 UTC m=+0.095330466 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 04:47:54 localhost podman[218201]: 2025-12-06 09:47:54.541819569 +0000 UTC m=+0.100692912 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 04:47:54 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:47:54 localhost python3.9[218202]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Dec 6 04:47:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64487 DF PROTO=TCP SPT=40292 DPT=9102 SEQ=2652114640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBDD1070000000001030307) Dec 6 04:47:55 localhost python3.9[218334]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 6 04:47:55 localhost nova_compute[187174]: 2025-12-06 09:47:55.618 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16618 DF PROTO=TCP SPT=52730 DPT=9102 SEQ=4266058691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBDD3870000000001030307) Dec 6 04:47:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64488 DF PROTO=TCP SPT=40292 DPT=9102 SEQ=2652114640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBDD9070000000001030307) Dec 6 04:47:57 localhost nova_compute[187174]: 2025-12-06 09:47:57.770 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:47:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8855 DF PROTO=TCP SPT=58434 DPT=9102 SEQ=1701162012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1DBDDD870000000001030307) Dec 6 04:47:59 localhost python3[218471]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Dec 6 04:48:00 localhost podman[218508]: Dec 6 04:48:00 localhost podman[218508]: 2025-12-06 09:48:00.122634423 +0000 UTC m=+0.086695278 container create 83f014743c47f0af25c3cc23b835851f066bb72382d20db47b8474999cdd9c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, config_id=neutron_dhcp, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0d1c7da660b65a2efec85b9605e5c360e6d12606683371154705f25f5b16c09f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 04:48:00 localhost podman[218508]: 2025-12-06 09:48:00.080771306 +0000 UTC m=+0.044832241 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 04:48:00 localhost python3[218471]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0d1c7da660b65a2efec85b9605e5c360e6d12606683371154705f25f5b16c09f --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0d1c7da660b65a2efec85b9605e5c360e6d12606683371154705f25f5b16c09f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume 
/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 04:48:00 localhost nova_compute[187174]: 2025-12-06 09:48:00.621 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:01 localhost python3.9[218654]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:48:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64489 DF PROTO=TCP SPT=40292 DPT=9102 SEQ=2652114640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBDE8C70000000001030307) Dec 6 04:48:02 localhost python3.9[218766]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:48:02 localhost nova_compute[187174]: 2025-12-06 09:48:02.772 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:02 localhost python3.9[218821]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:48:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:48:03 localhost systemd[1]: tmp-crun.UjQrmO.mount: Deactivated successfully. Dec 6 04:48:03 localhost podman[218931]: 2025-12-06 09:48:03.418793251 +0000 UTC m=+0.083993714 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:48:03 localhost podman[218931]: 2025-12-06 09:48:03.457193492 +0000 UTC m=+0.122393925 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 04:48:03 
localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:48:03 localhost python3.9[218930]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014482.9022582-1337-180441005056490/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:48:04 localhost python3.9[219009]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:48:04 localhost systemd[1]: Reloading. Dec 6 04:48:04 localhost systemd-rc-local-generator[219032]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:48:04 localhost systemd-sysv-generator[219039]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:48:04 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:04 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:04 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:04 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Dec 6 04:48:04 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:04 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:04 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:04 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:05 localhost python3.9[219100]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:48:05 localhost systemd[1]: Reloading. Dec 6 04:48:05 localhost nova_compute[187174]: 2025-12-06 09:48:05.624 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:05 localhost systemd-sysv-generator[219131]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:48:05 localhost systemd-rc-local-generator[219120]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 6 04:48:05 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:05 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:05 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:05 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:48:05 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:05 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:05 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:05 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:48:05 localhost systemd[1]: Starting neutron_dhcp_agent container... Dec 6 04:48:06 localhost systemd[1]: Started libcrun container. 
Dec 6 04:48:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee720e7be4abdc1a9a3d4067e29a3895bc8bddf2de0b1cbb8d146d46d32c9db4/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Dec 6 04:48:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee720e7be4abdc1a9a3d4067e29a3895bc8bddf2de0b1cbb8d146d46d32c9db4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 04:48:06 localhost podman[219141]: 2025-12-06 09:48:06.104487688 +0000 UTC m=+0.160901328 container init 83f014743c47f0af25c3cc23b835851f066bb72382d20db47b8474999cdd9c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0d1c7da660b65a2efec85b9605e5c360e6d12606683371154705f25f5b16c09f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: + sudo -E kolla_set_configs Dec 6 04:48:06 localhost podman[219141]: 2025-12-06 09:48:06.128972247 +0000 UTC m=+0.185385887 container start 83f014743c47f0af25c3cc23b835851f066bb72382d20db47b8474999cdd9c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0d1c7da660b65a2efec85b9605e5c360e6d12606683371154705f25f5b16c09f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251125) Dec 6 04:48:06 localhost podman[219141]: neutron_dhcp_agent Dec 6 04:48:06 localhost systemd[1]: Started neutron_dhcp_agent container. Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Validating config file Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Copying service configuration files Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Writing out command to execute Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/external Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for 
/var/lib/neutron/metadata_proxy Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/00c6e44062d81bae38ea1c96678049e54d3f27d226bb6f9651816ab13eb94f06 Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/20509a6a-c438-4c5e-82a7-fe0ea272b309.pid.haproxy Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/20509a6a-c438-4c5e-82a7-fe0ea272b309.conf Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: ++ cat /run_command Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: + CMD=/usr/bin/neutron-dhcp-agent Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: + ARGS= Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: + sudo kolla_copy_cacerts Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: + [[ ! -n '' ]] Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: + . 
kolla_extend_start Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: Running command: '/usr/bin/neutron-dhcp-agent' Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: + umask 0022 Dec 6 04:48:06 localhost neutron_dhcp_agent[219156]: + exec /usr/bin/neutron-dhcp-agent Dec 6 04:48:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:48:06.668 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:48:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:48:06.668 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:48:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:48:06.670 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:48:07 localhost neutron_dhcp_agent[219156]: 2025-12-06 09:48:07.470 219160 INFO neutron.common.config [-] Logging enabled!#033[00m Dec 6 04:48:07 localhost neutron_dhcp_agent[219156]: 2025-12-06 09:48:07.470 219160 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43#033[00m Dec 6 04:48:07 localhost nova_compute[187174]: 2025-12-06 09:48:07.774 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:07 localhost neutron_dhcp_agent[219156]: 2025-12-06 09:48:07.880 219160 INFO 
neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Dec 6 04:48:07 localhost python3.9[219280]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 6 04:48:07 localhost systemd[1]: Stopping neutron_dhcp_agent container... Dec 6 04:48:08 localhost systemd[1]: libpod-83f014743c47f0af25c3cc23b835851f066bb72382d20db47b8474999cdd9c9c.scope: Deactivated successfully. Dec 6 04:48:08 localhost podman[219285]: 2025-12-06 09:48:08.383191729 +0000 UTC m=+0.416999336 container stop 83f014743c47f0af25c3cc23b835851f066bb72382d20db47b8474999cdd9c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0d1c7da660b65a2efec85b9605e5c360e6d12606683371154705f25f5b16c09f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, config_id=neutron_dhcp) Dec 6 04:48:08 localhost systemd[1]: libpod-83f014743c47f0af25c3cc23b835851f066bb72382d20db47b8474999cdd9c9c.scope: Consumed 2.150s CPU time. Dec 6 04:48:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:48:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:48:08 localhost podman[219285]: 2025-12-06 09:48:08.420121064 +0000 UTC m=+0.453928701 container died 83f014743c47f0af25c3cc23b835851f066bb72382d20db47b8474999cdd9c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0d1c7da660b65a2efec85b9605e5c360e6d12606683371154705f25f5b16c09f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, io.buildah.version=1.41.3, tcib_managed=true, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:48:08 localhost podman[219305]: 2025-12-06 09:48:08.479471604 +0000 UTC m=+0.072618012 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd) Dec 6 04:48:08 localhost podman[219305]: 2025-12-06 09:48:08.48903148 +0000 UTC m=+0.082177888 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
container_name=multipathd, managed_by=edpm_ansible) Dec 6 04:48:08 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:48:08 localhost podman[219301]: 2025-12-06 09:48:08.551687772 +0000 UTC m=+0.148077061 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:48:08 localhost podman[219301]: 2025-12-06 09:48:08.561670661 +0000 UTC m=+0.158059980 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 04:48:08 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:48:08 localhost podman[219285]: 2025-12-06 09:48:08.625899672 +0000 UTC m=+0.659707309 container cleanup 83f014743c47f0af25c3cc23b835851f066bb72382d20db47b8474999cdd9c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, config_id=neutron_dhcp, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0d1c7da660b65a2efec85b9605e5c360e6d12606683371154705f25f5b16c09f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:48:08 localhost podman[219285]: neutron_dhcp_agent Dec 6 04:48:08 localhost podman[219363]: error opening file `/run/crun/83f014743c47f0af25c3cc23b835851f066bb72382d20db47b8474999cdd9c9c/status`: No such file or directory Dec 6 04:48:08 localhost podman[219350]: 2025-12-06 09:48:08.725788779 +0000 UTC m=+0.072981203 container cleanup 83f014743c47f0af25c3cc23b835851f066bb72382d20db47b8474999cdd9c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0d1c7da660b65a2efec85b9605e5c360e6d12606683371154705f25f5b16c09f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', 
'/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 04:48:08 localhost podman[219350]: neutron_dhcp_agent Dec 6 04:48:08 localhost systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully. Dec 6 04:48:08 localhost systemd[1]: Stopped neutron_dhcp_agent container. Dec 6 04:48:08 localhost systemd[1]: Starting neutron_dhcp_agent container... Dec 6 04:48:08 localhost systemd[1]: Started libcrun container. 
Dec 6 04:48:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee720e7be4abdc1a9a3d4067e29a3895bc8bddf2de0b1cbb8d146d46d32c9db4/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Dec 6 04:48:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee720e7be4abdc1a9a3d4067e29a3895bc8bddf2de0b1cbb8d146d46d32c9db4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 04:48:08 localhost nova_compute[187174]: 2025-12-06 09:48:08.871 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:48:08 localhost podman[219365]: 2025-12-06 09:48:08.875073446 +0000 UTC m=+0.118882806 container init 83f014743c47f0af25c3cc23b835851f066bb72382d20db47b8474999cdd9c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0d1c7da660b65a2efec85b9605e5c360e6d12606683371154705f25f5b16c09f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3) Dec 6 04:48:08 localhost podman[219365]: 2025-12-06 09:48:08.884111196 +0000 UTC m=+0.127920546 container start 83f014743c47f0af25c3cc23b835851f066bb72382d20db47b8474999cdd9c9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0d1c7da660b65a2efec85b9605e5c360e6d12606683371154705f25f5b16c09f'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:48:08 localhost podman[219365]: neutron_dhcp_agent Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: + sudo -E kolla_set_configs Dec 6 04:48:08 localhost systemd[1]: Started neutron_dhcp_agent container. Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Validating config file Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Copying service configuration files Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Writing out command to execute Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/external Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: 
INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/00c6e44062d81bae38ea1c96678049e54d3f27d226bb6f9651816ab13eb94f06 Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/20509a6a-c438-4c5e-82a7-fe0ea272b309.pid.haproxy Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/20509a6a-c438-4c5e-82a7-fe0ea272b309.conf Dec 6 04:48:08 localhost 
neutron_dhcp_agent[219380]: ++ cat /run_command Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: + CMD=/usr/bin/neutron-dhcp-agent Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: + ARGS= Dec 6 04:48:08 localhost neutron_dhcp_agent[219380]: + sudo kolla_copy_cacerts Dec 6 04:48:09 localhost neutron_dhcp_agent[219380]: + [[ ! -n '' ]] Dec 6 04:48:09 localhost neutron_dhcp_agent[219380]: + . kolla_extend_start Dec 6 04:48:09 localhost neutron_dhcp_agent[219380]: Running command: '/usr/bin/neutron-dhcp-agent' Dec 6 04:48:09 localhost neutron_dhcp_agent[219380]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Dec 6 04:48:09 localhost neutron_dhcp_agent[219380]: + umask 0022 Dec 6 04:48:09 localhost neutron_dhcp_agent[219380]: + exec /usr/bin/neutron-dhcp-agent Dec 6 04:48:09 localhost systemd[1]: session-44.scope: Deactivated successfully. Dec 6 04:48:09 localhost systemd[1]: session-44.scope: Consumed 33.892s CPU time. Dec 6 04:48:09 localhost systemd-logind[760]: Session 44 logged out. Waiting for processes to exit. Dec 6 04:48:09 localhost systemd-logind[760]: Removed session 44. 
Dec 6 04:48:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64490 DF PROTO=TCP SPT=40292 DPT=9102 SEQ=2652114640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBE09880000000001030307) Dec 6 04:48:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 09:48:10.270 219384 INFO neutron.common.config [-] Logging enabled!#033[00m Dec 6 04:48:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 09:48:10.271 219384 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43#033[00m Dec 6 04:48:10 localhost nova_compute[187174]: 2025-12-06 09:48:10.627 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 09:48:10.655 219384 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Dec 6 04:48:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 09:48:10.927 219384 INFO neutron.agent.dhcp.agent [None req-82a21e8c-2f90-4efa-a9b9-8e125d8efd61 - - - - - -] All active networks have been fetched through RPC.#033[00m Dec 6 04:48:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 09:48:10.928 219384 INFO neutron.agent.dhcp.agent [None req-82a21e8c-2f90-4efa-a9b9-8e125d8efd61 - - - - - -] Synchronizing state complete#033[00m Dec 6 04:48:11 localhost neutron_dhcp_agent[219380]: 2025-12-06 09:48:11.022 219384 INFO neutron.agent.dhcp.agent [None req-82a21e8c-2f90-4efa-a9b9-8e125d8efd61 - - - - - -] DHCP agent started#033[00m Dec 6 04:48:11 localhost nova_compute[187174]: 2025-12-06 09:48:11.411 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:48:11.416 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: 
SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 04:48:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:48:11.417 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 04:48:11 localhost ovn_metadata_agent[137254]: 2025-12-06 09:48:11.417 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:48:11 localhost nova_compute[187174]: 2025-12-06 09:48:11.874 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:48:11 localhost nova_compute[187174]: 2025-12-06 09:48:11.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:48:11 localhost nova_compute[187174]: 2025-12-06 09:48:11.875 187178 DEBUG nova.compute.manager [None 
req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:48:12 localhost nova_compute[187174]: 2025-12-06 09:48:12.105 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:48:12 localhost nova_compute[187174]: 2025-12-06 09:48:12.106 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:48:12 localhost nova_compute[187174]: 2025-12-06 09:48:12.106 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:48:12 localhost nova_compute[187174]: 2025-12-06 09:48:12.107 187178 DEBUG nova.objects.instance [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:48:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:48:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:48:12 localhost podman[219413]: 2025-12-06 09:48:12.644015458 +0000 UTC m=+0.176373508 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator 
team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:48:12 localhost podman[219413]: 2025-12-06 09:48:12.672358747 +0000 UTC m=+0.204716777 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 04:48:12 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:48:12 localhost podman[219414]: 2025-12-06 09:48:12.573267685 +0000 UTC m=+0.103634933 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 04:48:12 localhost nova_compute[187174]: 2025-12-06 09:48:12.777 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:12 localhost podman[219414]: 2025-12-06 09:48:12.828414994 +0000 UTC m=+0.358782212 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 04:48:12 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.229 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.248 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.248 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] 
Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.249 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.250 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.250 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.251 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:48:13 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=167.94.138.175 DST=38.129.56.147 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=30664 DF PROTO=TCP SPT=31132 DPT=19885 SEQ=3287144923 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A4B55209E000000000103030A) Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.874 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task 
ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.876 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.904 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.905 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.905 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.906 187178 DEBUG 
nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:48:13 localhost nova_compute[187174]: 2025-12-06 09:48:13.976 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.048 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.050 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.120 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.121 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.189 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.191 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.247 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:48:14 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=167.94.138.175 DST=38.129.56.147 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=30665 DF PROTO=TCP SPT=31132 DPT=19885 SEQ=3287144923 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A4B552491000000000103030A) Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.451 187178 WARNING nova.virt.libvirt.driver [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.453 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12614MB free_disk=387.30741119384766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.453 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.454 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.566 187178 
DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.567 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.567 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.616 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.638 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.640 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:48:14 localhost nova_compute[187174]: 2025-12-06 09:48:14.641 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:48:14 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=167.94.138.175 DST=38.129.56.147 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=958 DF PROTO=TCP SPT=55256 DPT=19885 SEQ=2085193410 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A4B5525AE000000000103030A) Dec 6 04:48:15 localhost nova_compute[187174]: 2025-12-06 09:48:15.630 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:15 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=167.94.138.175 DST=38.129.56.147 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=959 DF PROTO=TCP SPT=55256 DPT=19885 SEQ=2085193410 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A4B5529D2000000000103030A) Dec 6 04:48:16 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=167.94.138.175 
DST=38.129.56.147 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=28088 DF PROTO=TCP SPT=55262 DPT=19885 SEQ=4208047250 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A4B552B94000000000103030A) Dec 6 04:48:16 localhost openstack_network_exporter[199751]: ERROR 09:48:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:48:16 localhost openstack_network_exporter[199751]: ERROR 09:48:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:48:16 localhost openstack_network_exporter[199751]: ERROR 09:48:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:48:16 localhost openstack_network_exporter[199751]: ERROR 09:48:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:48:16 localhost openstack_network_exporter[199751]: Dec 6 04:48:16 localhost openstack_network_exporter[199751]: ERROR 09:48:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:48:16 localhost openstack_network_exporter[199751]: Dec 6 04:48:17 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=167.94.138.175 DST=38.129.56.147 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=28089 DF PROTO=TCP SPT=55262 DPT=19885 SEQ=4208047250 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A4B552F92000000000103030A) Dec 6 04:48:17 localhost nova_compute[187174]: 2025-12-06 09:48:17.780 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:19 localhost nova_compute[187174]: 2025-12-06 09:48:19.642 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:48:20 localhost nova_compute[187174]: 2025-12-06 09:48:20.632 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:48:22 localhost podman[219462]: 2025-12-06 09:48:22.591682837 +0000 UTC m=+0.121553168 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_id=edpm, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 6 04:48:22 localhost podman[219462]: 2025-12-06 09:48:22.603816024 +0000 UTC m=+0.133686365 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41) Dec 6 04:48:22 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:48:22 localhost nova_compute[187174]: 2025-12-06 09:48:22.783 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.984 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.984 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.988 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1905633c-97c7-4fac-9ffa-002387ec1ddb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:48:22.985049', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'b43ee746-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.195187423, 'message_signature': '8f30e5f9c391d857ca8a476c00fe58f4fc219e21881952fedb502b0f6721b46f'}]}, 'timestamp': '2025-12-06 09:48:22.989522', '_unique_id': '89b3620ed0924f88b0b8ddb55680a5fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:48:22.991 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.991 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.992 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.992 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c542230f-4929-4e4d-a388-c327cbc3261a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:48:22.992724', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'b43f8070-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.195187423, 'message_signature': 'f091b40a6b20ff003d6903aa3d1d44ea53c03a735448868661d66cb95c4da3aa'}]}, 'timestamp': '2025-12-06 09:48:22.993395', '_unique_id': '3a991e776bb04562a643523dae893b2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.994 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:22.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:22.995 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.045 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 947163713 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.046 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 9516486 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6322c512-9f1e-4566-a957-b580fc01f79e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 947163713, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:48:22.996054', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b447958a-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.206193084, 'message_signature': '187ddecb3b3d2fa9e79fa997082668db712fc639c2ee51acdb7dbe753acdeef4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9516486, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:48:22.996054', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b447ad5e-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.206193084, 'message_signature': '03c2f3c1736f62a72d96af49f63a0f27cdc3b5ad35da52beaf60ce4060ff5c59'}]}, 'timestamp': '2025-12-06 09:48:23.047070', '_unique_id': 'a6ee916e78ea494d9ba2c7cacc43d118'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.048 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.050 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.051 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ec364532-09ad-41ca-8f8d-872d519f44b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:48:23.050991', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'b448656e-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.195187423, 'message_signature': '5f7fa8e51d5c8962c1d2ef595b5c153357599904ff157427cc72e3b50985a8cf'}]}, 'timestamp': '2025-12-06 09:48:23.051778', '_unique_id': '7255fb3cbbbf454292f3866169601179'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.053 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.055 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.078 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31064064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.079 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1dcd21e5-6c3f-439b-a061-96996eb4519e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31064064, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:48:23.055222', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b44c971a-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.265387949, 'message_signature': 'f10c1bcb917837354330956967095640346db34a5d102582b3d0137c725300de'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:48:23.055222', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b44cb060-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.265387949, 'message_signature': '5a6060405b46c622cf646ca27cedc16c771fb3d505d80937e3c43c2f5631c612'}]}, 'timestamp': '2025-12-06 09:48:23.079888', '_unique_id': 'd21afcda455045c5ae28db52153bd427'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): 
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.083 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.083 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.084 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19cbe166-117e-462c-a927-003c8acc6ff2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:48:23.083799', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b44d65c8-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.265387949, 'message_signature': 'ebbf4f8331fd19d14c6584c9b55b8b056dc795fce913bb2abd7b4d5a5f93331f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:48:23.083799', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b44d7d88-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.265387949, 'message_signature': '387b88743bb8c260010914a23f5dd77eca821aa5539b9d109e4e906ffb060330'}]}, 'timestamp': '2025-12-06 09:48:23.085105', '_unique_id': 'c9032d3a26fd41a7b913c2d9180ce925'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.086 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.088 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.088 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 100 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84907068-cb50-490b-9f69-dae5020087fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 100, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:48:23.088198', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'b44e0ffa-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.195187423, 'message_signature': '8b009d327c8d48c4b7057fb8e92b752bb48aa63f96e2b6be748bd4df1c8ebf9f'}]}, 'timestamp': '2025-12-06 09:48:23.088908', '_unique_id': '3329def48d8b4ffc838e81de35ba46c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:48:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.090 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.092 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'c03aa73a-d409-4c34-95ee-b743e2094291', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:48:23.091964', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'b44ea2e4-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.195187423, 'message_signature': 'c33ed30ff0907c26f5e1dc451e9698c1cd98cf2dcb16b75475aa9fe4954951d6'}]}, 'timestamp': '2025-12-06 09:48:23.092639', '_unique_id': '4500a6a94ece4a79add3e071849019c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:48:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:48:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6
04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:48:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.094 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 09:48:23.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.095 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.096 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93f882e8-81a8-41c1-891e-fb059765567e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:48:23.095813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref':
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b44f3ae2-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.265387949, 'message_signature': 'e95584affc409118841ce5f3b054a24f8250644a98a4a1d8a5e76f070c59460b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:48:23.095813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b44f5284-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.265387949, 'message_signature': '4e82e65e659b9cd548e27aba6dbdb3668cd541da55ad120dfab41ca90dcb7106'}]}, 'timestamp': '2025-12-06 09:48:23.097125', '_unique_id': '313c4a4b89034e62ac2f9b8ca4cc276e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call
last):
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:48:23
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:48:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.098 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 09:48:23.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 301237008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 37411248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86b736b4-c992-49f0-8721-e231c39fe649', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 301237008, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:48:23.100187', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id':
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b44fe3c0-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.206193084, 'message_signature': '002b0cdddeb417f520cc85154e8b103ff1f90698ddde0a79997ef7a2e206bc0a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 37411248, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:48:23.100187', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b44ffc2a-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.206193084, 'message_signature': '847f728b6831bfc39498e54c02107196a26e6f593b48a4ec65b12e9a679162c0'}]}, 'timestamp': '2025-12-06 09:48:23.101466', '_unique_id': '678794e3c37a42d59b76fde8e7957bfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:48:23.102 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.102 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.104 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.127 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 54630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7e4953f-c4bb-4d3a-85a1-178885e48e04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54630000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:48:23.104507', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 
'cpu_number': 1}, 'message_id': 'b45409fa-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.33736136, 'message_signature': '36402e04b37c9ee972df5079eba5fddfff41584d534c849d951edeb14b4eb547'}]}, 'timestamp': '2025-12-06 09:48:23.128082', '_unique_id': 'd3e5ece11d0f4d58b2657a3a77e0264e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() 
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 
111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.129 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.130 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '976ac22a-c4a4-4824-8321-572935ac5c8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:48:23.130875', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'b45491c2-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.195187423, 'message_signature': 'a87e1220950a65a878bac679467fa971587aef0208f6b0d7e3193eae1163554e'}]}, 'timestamp': '2025-12-06 09:48:23.131448', '_unique_id': '23321621775b43b9b69475f86781b931'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.134 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.134 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1c22c88f-61d9-43d1-bb25-35005ee104ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:48:23.134266', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4551458-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.206193084, 'message_signature': '7acbc160f6cd5083384399ad97d04e07eca32d62ba73a4dd4add8f561b2d3fad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:48:23.134266', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4552768-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.206193084, 'message_signature': '1ee832c614fd39f9a7f93e5c1eaa00ad1ba0046c749ca5eb7690c8849ffbd6de'}]}, 'timestamp': '2025-12-06 09:48:23.135237', '_unique_id': 'ae4901d6e5ac4ae4ae25d42c0684a9bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.136 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.138 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.138 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging [-] Could 
not send notification to notifications. Payload={'message_id': '18a55b97-8285-4eec-8c68-9a1c758d9095', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:48:23.137992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b455a63e-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.206193084, 'message_signature': '5481a4bcf18e4957622d4b934b9db19f069433ed493e7d64f8fd8140e86a870a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:48:23.137992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b455b840-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.206193084, 'message_signature': 'fcc3e487bb7667dd22a064e6a0dcaae010315896199922de0c1cad1b0aeb0261'}]}, 'timestamp': '2025-12-06 09:48:23.139004', '_unique_id': '401d8eba630f483db89010eaeb68da01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:48:23.140 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.140 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.141 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.142 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66b8ccf8-f612-4f06-954f-b3ce8ab0c84a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:48:23.141526', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4562fe6-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.206193084, 'message_signature': 'bef3addf57a5987ed080dbb1292b952591477d1956095594a181cfe8fe8aeb4b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:48:23.141526', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b45646c0-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.206193084, 'message_signature': 'c327058b0ff1eeffbe55456481e314cf28adc83d2d80c6db1d56c8fe6358fc59'}]}, 'timestamp': '2025-12-06 09:48:23.142668', '_unique_id': '750fa9551b474ee4a1585432cd5bc945'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 
129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get 
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.143 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.145 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 10762 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ca01f0ac-867a-4883-be8f-d1e907e10b8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10762, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:48:23.145346', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'b456c6a4-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.195187423, 'message_signature': '071966618151704f05af4912e63574263d3aa2f01d256f36fc5695f6e4a0183a'}]}, 'timestamp': '2025-12-06 09:48:23.145983', '_unique_id': 'a9f98594dd664db195c1f20d2c7d06c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.147 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.148 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.148 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90a379ab-d6f3-430d-b32f-5a2984957696', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 123, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:48:23.148366', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'b4573ada-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.195187423, 'message_signature': '439e2817121c3f8b2c65c34f48e9b7b7bc1c07762db4ff355cab07c53abbab92'}]}, 'timestamp': '2025-12-06 09:48:23.149000', '_unique_id': '52d1b8c18d5f4ea58ad9764cd71a2398'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.150 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.151 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 73904128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.152 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2ca8760d-ba42-44ae-9ce3-88ff99116237', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73904128, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:48:23.151543', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b457b6c2-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.206193084, 'message_signature': 'd6b5fc5bdcad49ee1855ab2e1ee67dd958cbea878c810d5beb79c8a6df9e41c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:48:23.151543', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b457cb44-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.206193084, 'message_signature': 'df6269511734cd0b0e5b9c63897b8f6da902d7d375e37d34326718b19558b9a3'}]}, 'timestamp': '2025-12-06 09:48:23.152542', '_unique_id': '11014fb244944bd0a742431cf435c328'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.153 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.155 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 10055 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f4cd92d5-25f3-4004-acfb-fc7b8eab7236', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10055, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:48:23.155065', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'b4584042-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.195187423, 'message_signature': 'e4ce75909ff59d49e7504c7903e14fbd8ca123419fc28bd15de91bf4d2daf92e'}]}, 'timestamp': '2025-12-06 09:48:23.155562', '_unique_id': 'eb0aa9971ef54ff2a4e392f52186f303'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.156 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:48:23.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.158 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 356 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e287f33-4836-4d52-8707-c69efe758181', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 356, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:48:23.157980', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'b458b220-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.195187423, 'message_signature': 'abe7819c765258a304fc608bf08ec85e008010c7b768a38437b725e298632b36'}]}, 'timestamp': '2025-12-06 09:48:23.158477', '_unique_id': '9dc56c05ef3942bebfd272ea532d71a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.159 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.160 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.160 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 52.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e9e3743c-3d87-4a95-96bf-6ff325a50a0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:48:23.160770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b4592156-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 10890.33736136, 'message_signature': '17f12cac9110a7b922fa3c5b195f42ce8a607561b61d2f5d3cebd93b22ac45fb'}]}, 'timestamp': '2025-12-06 09:48:23.161304', '_unique_id': 'a7bb7d7de0b241f7ab20e033e6763cbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors 
Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging 
conn.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:48:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:48:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:48:23.162 12 ERROR oslo_messaging.notify.messaging Dec 6 04:48:23 localhost podman[197801]: time="2025-12-06T09:48:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:48:23 localhost podman[197801]: @ - - [06/Dec/2025:09:48:23 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 143413 "" "Go-http-client/1.1" Dec 6 04:48:23 localhost podman[197801]: @ - - [06/Dec/2025:09:48:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15446 "" "Go-http-client/1.1" Dec 6 04:48:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46084 DF PROTO=TCP SPT=44686 DPT=9102 SEQ=2596043594 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBE42490000000001030307) Dec 6 04:48:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46085 DF PROTO=TCP SPT=44686 DPT=9102 SEQ=2596043594 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBE46470000000001030307) Dec 6 04:48:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:48:25 localhost podman[219483]: 2025-12-06 09:48:25.544417561 +0000 UTC m=+0.081397054 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 04:48:25 localhost podman[219483]: 2025-12-06 09:48:25.551072377 +0000 UTC m=+0.088051850 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:48:25 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:48:25 localhost nova_compute[187174]: 2025-12-06 09:48:25.634 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64491 DF PROTO=TCP SPT=40292 DPT=9102 SEQ=2652114640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBE49870000000001030307) Dec 6 04:48:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46086 DF PROTO=TCP SPT=44686 DPT=9102 SEQ=2596043594 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBE4E480000000001030307) Dec 6 04:48:27 localhost nova_compute[187174]: 2025-12-06 09:48:27.785 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16619 DF PROTO=TCP SPT=52730 DPT=9102 SEQ=4266058691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBE51870000000001030307) Dec 6 04:48:28 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=162.142.125.120 DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=44711 DF PROTO=TCP SPT=45748 DPT=19885 SEQ=1212002510 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AF51506DB000000000103030A) Dec 6 04:48:29 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=162.142.125.120 DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=44712 DF PROTO=TCP SPT=45748 DPT=19885 
SEQ=1212002510 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AF5150AD4000000000103030A) Dec 6 04:48:29 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=162.142.125.120 DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=20643 DF PROTO=TCP SPT=45784 DPT=19885 SEQ=2404155999 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AF5150B07000000000103030A) Dec 6 04:48:30 localhost nova_compute[187174]: 2025-12-06 09:48:30.636 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:30 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=162.142.125.120 DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=10048 DF PROTO=TCP SPT=45800 DPT=19885 SEQ=2480536062 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AF5150F12000000000103030A) Dec 6 04:48:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46087 DF PROTO=TCP SPT=44686 DPT=9102 SEQ=2596043594 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBE5E070000000001030307) Dec 6 04:48:31 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=162.142.125.120 DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=10049 DF PROTO=TCP SPT=45800 DPT=19885 SEQ=2480536062 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AF5151314000000000103030A) Dec 6 04:48:32 localhost nova_compute[187174]: 2025-12-06 09:48:32.787 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:33 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=162.142.125.120 
DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=52 ID=10050 DF PROTO=TCP SPT=45800 DPT=19885 SEQ=2480536062 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080AF5151B14000000000103030A) Dec 6 04:48:34 localhost sshd[219505]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:48:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:48:34 localhost podman[219507]: 2025-12-06 09:48:34.546417049 +0000 UTC m=+0.079939189 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 04:48:34 localhost podman[219507]: 2025-12-06 09:48:34.582386544 +0000 UTC m=+0.115908644 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 6 04:48:34 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:48:35 localhost nova_compute[187174]: 2025-12-06 09:48:35.641 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:37 localhost nova_compute[187174]: 2025-12-06 09:48:37.789 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46088 DF PROTO=TCP SPT=44686 DPT=9102 SEQ=2596043594 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBE7D870000000001030307) Dec 6 04:48:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:48:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:48:39 localhost podman[219533]: 2025-12-06 09:48:39.560067794 +0000 UTC m=+0.085083999 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:48:39 localhost podman[219533]: 2025-12-06 09:48:39.603917992 +0000 UTC m=+0.128934197 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:48:39 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:48:39 localhost podman[219532]: 2025-12-06 09:48:39.609456104 +0000 UTC m=+0.138332969 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:48:39 localhost podman[219532]: 2025-12-06 09:48:39.690241868 +0000 UTC m=+0.219118683 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:48:39 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:48:40 localhost nova_compute[187174]: 2025-12-06 09:48:40.642 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:41 localhost ovn_controller[131684]: 2025-12-06T09:48:41Z|00047|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory Dec 6 04:48:41 localhost sshd[219573]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:48:41 localhost systemd-logind[760]: New session 45 of user zuul. Dec 6 04:48:41 localhost systemd[1]: Started Session 45 of User zuul. Dec 6 04:48:42 localhost nova_compute[187174]: 2025-12-06 09:48:42.792 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:42 localhost python3.9[219684]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 6 04:48:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 04:48:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:48:43 localhost podman[219707]: 2025-12-06 09:48:43.549271394 +0000 UTC m=+0.078829985 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 04:48:43 localhost podman[219707]: 2025-12-06 09:48:43.562308177 +0000 UTC m=+0.091866768 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3) Dec 6 04:48:43 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:48:43 localhost podman[219706]: 2025-12-06 09:48:43.599356485 +0000 UTC m=+0.132984962 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:48:43 localhost podman[219706]: 2025-12-06 09:48:43.607209159 +0000 UTC m=+0.140837656 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 04:48:43 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:48:44 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=206.168.34.54 DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=50 ID=12565 DF PROTO=TCP SPT=1448 DPT=19885 SEQ=3750572577 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A8D41EA75000000000103030A) Dec 6 04:48:44 localhost python3.9[219834]: ansible-ansible.builtin.service_facts Invoked Dec 6 04:48:44 localhost network[219851]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 6 04:48:44 localhost network[219852]: 'network-scripts' will be removed from distribution in near future. Dec 6 04:48:44 localhost network[219853]: It is advised to switch to 'NetworkManager' instead for network management. 
Dec 6 04:48:45 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=206.168.34.54 DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=50 ID=12566 DF PROTO=TCP SPT=1448 DPT=19885 SEQ=3750572577 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A8D41EE5E000000000103030A) Dec 6 04:48:45 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=206.168.34.54 DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=50 ID=26458 DF PROTO=TCP SPT=1462 DPT=19885 SEQ=1467681800 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A8D41EE68000000000103030A) Dec 6 04:48:45 localhost nova_compute[187174]: 2025-12-06 09:48:45.645 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:46 localhost openstack_network_exporter[199751]: ERROR 09:48:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:48:46 localhost openstack_network_exporter[199751]: ERROR 09:48:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:48:46 localhost openstack_network_exporter[199751]: ERROR 09:48:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:48:46 localhost openstack_network_exporter[199751]: ERROR 09:48:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:48:46 localhost openstack_network_exporter[199751]: Dec 6 04:48:46 localhost openstack_network_exporter[199751]: ERROR 09:48:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:48:46 localhost openstack_network_exporter[199751]: Dec 6 04:48:46 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=206.168.34.54 DST=38.129.56.147 LEN=60 
TOS=0x00 PREC=0x00 TTL=50 ID=26459 DF PROTO=TCP SPT=1462 DPT=19885 SEQ=1467681800 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A8D41F25E000000000103030A) Dec 6 04:48:46 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=206.168.34.54 DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=50 ID=9173 DF PROTO=TCP SPT=1476 DPT=19885 SEQ=894714009 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A8D41F27E000000000103030A) Dec 6 04:48:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:48:47 localhost nova_compute[187174]: 2025-12-06 09:48:47.795 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:50 localhost python3.9[220088]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 6 04:48:50 localhost nova_compute[187174]: 2025-12-06 09:48:50.649 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:51 localhost python3.9[220151]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 6 04:48:52 localhost nova_compute[187174]: 2025-12-06 09:48:52.813 187178 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:53 localhost podman[197801]: time="2025-12-06T09:48:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:48:53 localhost podman[197801]: @ - - [06/Dec/2025:09:48:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 143413 "" "Go-http-client/1.1" Dec 6 04:48:53 localhost podman[197801]: @ - - [06/Dec/2025:09:48:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15452 "" "Go-http-client/1.1" Dec 6 04:48:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:48:53 localhost podman[220154]: 2025-12-06 09:48:53.572176865 +0000 UTC m=+0.106207353 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, version=9.6, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc.) 
Dec 6 04:48:53 localhost podman[220154]: 2025-12-06 09:48:53.588271724 +0000 UTC m=+0.122302202 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=) Dec 6 04:48:53 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 04:48:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54383 DF PROTO=TCP SPT=35838 DPT=9102 SEQ=1811730822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBEB77C0000000001030307) Dec 6 04:48:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54384 DF PROTO=TCP SPT=35838 DPT=9102 SEQ=1811730822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBEBB870000000001030307) Dec 6 04:48:55 localhost python3.9[220284]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:48:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46089 DF PROTO=TCP SPT=44686 DPT=9102 SEQ=2596043594 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBEBD880000000001030307) Dec 6 04:48:55 localhost nova_compute[187174]: 2025-12-06 09:48:55.651 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:48:56 localhost podman[220395]: 2025-12-06 09:48:56.145697125 +0000 UTC m=+0.095930984 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 04:48:56 localhost podman[220395]: 2025-12-06 09:48:56.181386141 +0000 UTC m=+0.131619980 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 04:48:56 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:48:56 localhost python3.9[220394]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:48:57 localhost python3.9[220529]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:48:57 localhost nova_compute[187174]: 2025-12-06 09:48:57.853 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:48:58 localhost python3.9[220641]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:48:59 localhost python3.9[220751]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:49:00 localhost python3.9[220863]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:49:00 localhost nova_compute[187174]: 2025-12-06 09:49:00.654 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:01 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54386 DF PROTO=TCP SPT=35838 DPT=9102 SEQ=1811730822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBED3470000000001030307) Dec 6 04:49:02 localhost python3.9[220975]: ansible-ansible.builtin.service_facts Invoked Dec 6 04:49:02 localhost network[220992]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 6 04:49:02 localhost network[220993]: 'network-scripts' will be removed from distribution in near future. Dec 6 04:49:02 localhost network[220994]: It is advised to switch to 'NetworkManager' instead for network management. Dec 6 04:49:02 localhost nova_compute[187174]: 2025-12-06 09:49:02.884 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:49:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:49:04 localhost podman[221079]: 2025-12-06 09:49:04.724872246 +0000 UTC m=+0.094183510 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2) Dec 6 04:49:04 localhost podman[221079]: 2025-12-06 09:49:04.761035317 +0000 UTC m=+0.130346581 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible) Dec 6 04:49:04 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:49:05 localhost nova_compute[187174]: 2025-12-06 09:49:05.657 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:49:06.668 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:49:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:49:06.669 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:49:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:49:06.670 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:49:06 localhost nova_compute[187174]: 2025-12-06 09:49:06.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:06 localhost nova_compute[187174]: 2025-12-06 09:49:06.876 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 6 04:49:07 localhost python3.9[221253]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 
setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Dec 6 04:49:07 localhost nova_compute[187174]: 2025-12-06 09:49:07.889 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:08 localhost python3.9[221363]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Dec 6 04:49:09 localhost python3.9[221473]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:49:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54387 DF PROTO=TCP SPT=35838 DPT=9102 SEQ=1811730822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBEF3870000000001030307) Dec 6 04:49:09 localhost python3.9[221530]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:09 localhost nova_compute[187174]: 2025-12-06 09:49:09.894 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task 
ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:10 localhost auditd[726]: Audit daemon rotating log files Dec 6 04:49:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:49:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:49:10 localhost systemd[1]: tmp-crun.qogTgi.mount: Deactivated successfully. Dec 6 04:49:10 localhost podman[221640]: 2025-12-06 09:49:10.366390872 +0000 UTC m=+0.120223718 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:49:10 localhost podman[221640]: 2025-12-06 09:49:10.376135875 +0000 UTC m=+0.129968701 
container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:49:10 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 04:49:10 localhost podman[221642]: 2025-12-06 09:49:10.343889825 +0000 UTC m=+0.094612943 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 04:49:10 localhost podman[221642]: 2025-12-06 09:49:10.428333902 +0000 UTC m=+0.179057070 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.vendor=CentOS) Dec 6 04:49:10 localhost python3.9[221641]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None 
insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:10 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:49:10 localhost nova_compute[187174]: 2025-12-06 09:49:10.660 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:11 localhost python3.9[221790]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:49:11 localhost nova_compute[187174]: 2025-12-06 09:49:11.870 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:12 localhost python3.9[221900]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:49:12 localhost nova_compute[187174]: 2025-12-06 09:49:12.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:12 localhost nova_compute[187174]: 2025-12-06 09:49:12.925 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:13 localhost python3.9[222012]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:49:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:49:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:49:13 localhost podman[222126]: 2025-12-06 09:49:13.757792783 +0000 UTC m=+0.081585220 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:49:13 localhost podman[222126]: 2025-12-06 09:49:13.771139077 +0000 UTC m=+0.094931464 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0) Dec 6 04:49:13 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:49:13 localhost python3.9[222124]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:49:13 localhost podman[222125]: 2025-12-06 09:49:13.86449446 +0000 UTC m=+0.188966297 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent) Dec 6 04:49:13 localhost podman[222125]: 2025-12-06 09:49:13.872506139 +0000 UTC m=+0.196978006 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:49:13 localhost nova_compute[187174]: 2025-12-06 09:49:13.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:13 localhost nova_compute[187174]: 2025-12-06 
09:49:13.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:49:13 localhost nova_compute[187174]: 2025-12-06 09:49:13.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:49:13 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:49:14 localhost nova_compute[187174]: 2025-12-06 09:49:14.402 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:49:14 localhost nova_compute[187174]: 2025-12-06 09:49:14.402 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:49:14 localhost nova_compute[187174]: 2025-12-06 09:49:14.402 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:49:14 localhost nova_compute[187174]: 2025-12-06 09:49:14.403 187178 DEBUG nova.objects.instance [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:49:14 localhost python3.9[222272]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:15 localhost python3.9[222382]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.661 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.782 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": 
"47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.799 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.799 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.800 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.800 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.801 187178 DEBUG 
oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.801 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.824 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.826 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.827 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.827 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.899 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.966 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:49:15 localhost nova_compute[187174]: 2025-12-06 09:49:15.967 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.022 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.055s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.023 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:49:16 localhost python3.9[222492]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.096 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.097 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 
6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.170 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:49:16 localhost openstack_network_exporter[199751]: ERROR 09:49:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:49:16 localhost openstack_network_exporter[199751]: ERROR 09:49:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:49:16 localhost openstack_network_exporter[199751]: ERROR 09:49:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:49:16 localhost openstack_network_exporter[199751]: Dec 6 04:49:16 localhost openstack_network_exporter[199751]: ERROR 09:49:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:49:16 localhost openstack_network_exporter[199751]: ERROR 09:49:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:49:16 localhost openstack_network_exporter[199751]: Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.393 187178 WARNING nova.virt.libvirt.driver [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.395 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12594MB free_disk=387.3086738586426GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.395 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.396 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.626 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.628 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.628 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:49:16 localhost python3.9[222614]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.693 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Refreshing inventories for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.767 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Updating ProviderTree inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 
'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.767 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.781 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Refreshing aggregate associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.805 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Refreshing trait associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, traits: 
COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SVM,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.850 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.863 187178 DEBUG nova.scheduler.client.report [None 
req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.865 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.865 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.865 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.893 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.893 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.893 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.894 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 6 04:49:16 localhost nova_compute[187174]: 2025-12-06 09:49:16.905 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 6 04:49:17 localhost python3.9[222724]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:17 localhost nova_compute[187174]: 2025-12-06 09:49:17.963 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:18 localhost python3.9[222834]: 
ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:49:19 localhost python3.9[222946]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:49:19 localhost python3.9[223056]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:49:20 localhost python3.9[223113]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:49:20 localhost nova_compute[187174]: 2025-12-06 09:49:20.664 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:20 localhost python3.9[223223]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:49:20 localhost nova_compute[187174]: 2025-12-06 09:49:20.887 187178 DEBUG 
oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:21 localhost python3.9[223280]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:49:21 localhost python3.9[223390]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:22 localhost python3.9[223500]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:49:23 localhost nova_compute[187174]: 2025-12-06 09:49:23.005 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:23 localhost podman[197801]: time="2025-12-06T09:49:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:49:23 localhost podman[197801]: @ - - [06/Dec/2025:09:49:23 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 143413 "" "Go-http-client/1.1" Dec 6 04:49:23 localhost python3.9[223557]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:23 localhost podman[197801]: @ - - [06/Dec/2025:09:49:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15448 "" "Go-http-client/1.1" Dec 6 04:49:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:49:23 localhost systemd[1]: tmp-crun.ha9sKy.mount: Deactivated successfully. Dec 6 04:49:23 localhost podman[223668]: 2025-12-06 09:49:23.98702338 +0000 UTC m=+0.088677449 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=) Dec 6 04:49:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15924 DF PROTO=TCP SPT=41008 DPT=9102 SEQ=1187329275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBF2CB20000000001030307) Dec 6 04:49:24 localhost podman[223668]: 2025-12-06 09:49:24.028370732 +0000 UTC m=+0.130024811 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Dec 6 04:49:24 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 04:49:24 localhost python3.9[223667]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:49:24 localhost python3.9[223742]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:25 localhost nova_compute[187174]: 2025-12-06 09:49:25.042 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:49:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15925 DF PROTO=TCP SPT=41008 DPT=9102 SEQ=1187329275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBF30C70000000001030307) Dec 6 04:49:25 localhost nova_compute[187174]: 2025-12-06 09:49:25.069 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Triggering sync for uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Dec 6 04:49:25 localhost nova_compute[187174]: 2025-12-06 09:49:25.070 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] 
Acquiring lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:49:25 localhost nova_compute[187174]: 2025-12-06 09:49:25.071 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:49:25 localhost nova_compute[187174]: 2025-12-06 09:49:25.120 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:49:25 localhost python3.9[223852]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:49:25 localhost systemd[1]: Reloading. Dec 6 04:49:25 localhost nova_compute[187174]: 2025-12-06 09:49:25.667 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:25 localhost systemd-rc-local-generator[223875]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:49:25 localhost systemd-sysv-generator[223878]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 04:49:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54388 DF PROTO=TCP SPT=35838 DPT=9102 SEQ=1811730822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBF33870000000001030307) Dec 6 04:49:25 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:25 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:25 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:25 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:49:25 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:25 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:25 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:25 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:49:26 localhost podman[223999]: 2025-12-06 09:49:26.565248746 +0000 UTC m=+0.094699637 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 04:49:26 localhost podman[223999]: 2025-12-06 09:49:26.577791404 +0000 UTC m=+0.107242265 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 04:49:26 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:49:26 localhost python3.9[224000]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:49:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15926 DF PROTO=TCP SPT=41008 DPT=9102 SEQ=1187329275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBF38C70000000001030307) Dec 6 04:49:27 localhost python3.9[224079]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46090 DF PROTO=TCP SPT=44686 DPT=9102 SEQ=2596043594 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBF3B870000000001030307) Dec 6 04:49:27 localhost python3.9[224189]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:49:28 localhost nova_compute[187174]: 2025-12-06 09:49:28.070 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:28 localhost python3.9[224246]: 
ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:29 localhost python3.9[224356]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:49:29 localhost systemd[1]: Reloading. Dec 6 04:49:29 localhost systemd-rc-local-generator[224381]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:49:29 localhost systemd-sysv-generator[224385]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:49:29 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:29 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:29 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:29 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 6 04:49:29 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:29 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:29 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:29 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:49:29 localhost systemd[1]: Starting Create netns directory... Dec 6 04:49:29 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 6 04:49:29 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 6 04:49:29 localhost systemd[1]: Finished Create netns directory. Dec 6 04:49:30 localhost python3.9[224509]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:49:30 localhost nova_compute[187174]: 2025-12-06 09:49:30.669 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15927 DF PROTO=TCP SPT=41008 DPT=9102 SEQ=1187329275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBF48870000000001030307) Dec 6 04:49:31 localhost python3.9[224619]: ansible-ansible.legacy.stat Invoked with 
path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:49:31 localhost python3.9[224676]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:49:32 localhost python3.9[224786]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:49:33 localhost nova_compute[187174]: 2025-12-06 09:49:33.345 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:33 localhost python3.9[224896]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:49:34 localhost python3.9[224953]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.edlns02y recurse=False state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:34 localhost python3.9[225063]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:49:35 localhost podman[225172]: 2025-12-06 09:49:35.558731639 +0000 UTC m=+0.082654953 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3) Dec 6 04:49:35 localhost podman[225172]: 2025-12-06 09:49:35.656907622 +0000 UTC m=+0.180831006 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 6 04:49:35 localhost nova_compute[187174]: 2025-12-06 09:49:35.671 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:35 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:49:37 localhost python3.9[225366]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False Dec 6 04:49:38 localhost python3.9[225476]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Dec 6 04:49:38 localhost nova_compute[187174]: 2025-12-06 09:49:38.347 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:39 localhost python3.9[225586]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 6 04:49:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15928 DF PROTO=TCP SPT=41008 DPT=9102 SEQ=1187329275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBF69880000000001030307) Dec 6 04:49:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:49:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:49:40 localhost podman[225631]: 2025-12-06 09:49:40.570294418 +0000 UTC m=+0.098904356 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 04:49:40 localhost podman[225631]: 2025-12-06 09:49:40.583216099 +0000 UTC m=+0.111826007 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:49:40 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:49:40 localhost podman[225649]: 2025-12-06 09:49:40.658074049 +0000 UTC m=+0.078190285 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 04:49:40 localhost nova_compute[187174]: 2025-12-06 09:49:40.678 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:40 localhost podman[225649]: 2025-12-06 09:49:40.698376889 +0000 UTC m=+0.118493075 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 04:49:40 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:49:43 localhost nova_compute[187174]: 2025-12-06 09:49:43.349 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:43 localhost python3[225766]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Dec 6 04:49:43 localhost python3[225766]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7",#012 "Digest": "sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": 
"2025-12-01T06:11:02.031267563Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 249482216,#012 "VirtualSize": 249482216,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",#012 "sha256:8c448567789503f6c5be645a12473dfc27734872532d528b6ee764c214f9f2f3"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 
"org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-01T06:09:28.025707917Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025744608Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025767729Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025791379Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.02581523Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025867611Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 
"created": "2025-12-01T06:09:28.469442331Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:10:02.029095017Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:10:05.672474685Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:10:06.113425253Z",#012 Dec 6 04:49:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:49:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:49:44 localhost podman[225847]: 2025-12-06 09:49:44.546094298 +0000 UTC m=+0.078992724 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true) Dec 6 04:49:44 localhost podman[225847]: 2025-12-06 09:49:44.552800748 +0000 UTC m=+0.085699164 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true) Dec 6 04:49:44 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:49:44 localhost podman[225848]: 2025-12-06 09:49:44.599331178 +0000 UTC m=+0.129617972 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute) Dec 6 04:49:44 localhost podman[225848]: 2025-12-06 09:49:44.613129528 +0000 UTC m=+0.143416322 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:49:44 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:49:45 localhost python3.9[225976]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:49:45 localhost nova_compute[187174]: 2025-12-06 09:49:45.696 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:45 localhost python3.9[226088]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:46 localhost openstack_network_exporter[199751]: ERROR 09:49:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:49:46 
localhost openstack_network_exporter[199751]: ERROR 09:49:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:49:46 localhost openstack_network_exporter[199751]: ERROR 09:49:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:49:46 localhost openstack_network_exporter[199751]: ERROR 09:49:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:49:46 localhost openstack_network_exporter[199751]: Dec 6 04:49:46 localhost openstack_network_exporter[199751]: ERROR 09:49:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:49:46 localhost openstack_network_exporter[199751]: Dec 6 04:49:46 localhost python3.9[226143]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:49:47 localhost python3.9[226252]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014586.4044857-1364-16279531549547/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:47 localhost python3.9[226307]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:49:48 localhost nova_compute[187174]: 2025-12-06 09:49:48.351 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:49 localhost python3.9[226417]: 
ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:49:50 localhost python3.9[226527]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:50 localhost nova_compute[187174]: 2025-12-06 09:49:50.739 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:51 localhost python3.9[226637]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Dec 6 04:49:52 localhost python3.9[226747]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Dec 6 04:49:52 localhost python3.9[226857]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:49:53 localhost python3.9[226914]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:53 localhost nova_compute[187174]: 2025-12-06 09:49:53.389 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:53 localhost podman[197801]: time="2025-12-06T09:49:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:49:53 localhost podman[197801]: @ - - [06/Dec/2025:09:49:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 143413 "" "Go-http-client/1.1" Dec 6 04:49:53 localhost podman[197801]: @ - - [06/Dec/2025:09:49:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15438 "" "Go-http-client/1.1" Dec 6 04:49:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21838 DF PROTO=TCP SPT=56074 DPT=9102 SEQ=3617421543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBFA1D90000000001030307) Dec 6 04:49:54 localhost python3.9[227024]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:49:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:49:54 localhost podman[227066]: 2025-12-06 09:49:54.559140624 +0000 UTC m=+0.085146687 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350) Dec 6 04:49:54 localhost podman[227066]: 2025-12-06 09:49:54.575217574 +0000 UTC m=+0.101223607 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64) Dec 6 04:49:54 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 04:49:55 localhost python3.9[227154]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 6 04:49:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21839 DF PROTO=TCP SPT=56074 DPT=9102 SEQ=3617421543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBFA5C80000000001030307) Dec 6 04:49:55 localhost nova_compute[187174]: 2025-12-06 09:49:55.786 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15929 DF PROTO=TCP SPT=41008 DPT=9102 SEQ=1187329275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBFA9880000000001030307) Dec 6 04:49:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21840 DF PROTO=TCP SPT=56074 DPT=9102 SEQ=3617421543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBFADC70000000001030307) Dec 6 04:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:49:57 localhost systemd[1]: tmp-crun.GHR2ze.mount: Deactivated successfully. Dec 6 04:49:57 localhost podman[227157]: 2025-12-06 09:49:57.170451931 +0000 UTC m=+0.096188381 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 04:49:57 localhost podman[227157]: 2025-12-06 09:49:57.18131808 +0000 UTC m=+0.107054530 container exec_died 
979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:49:57 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:49:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54389 DF PROTO=TCP SPT=35838 DPT=9102 SEQ=1811730822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBFB1870000000001030307) Dec 6 04:49:58 localhost nova_compute[187174]: 2025-12-06 09:49:58.420 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:49:58 localhost python3.9[227288]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 6 04:49:59 localhost python3.9[227402]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:00 localhost nova_compute[187174]: 2025-12-06 09:50:00.825 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21841 DF PROTO=TCP SPT=56074 DPT=9102 SEQ=3617421543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBFBD870000000001030307) Dec 6 04:50:01 localhost python3.9[227512]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:50:01 localhost systemd[1]: Reloading. 
Dec 6 04:50:01 localhost systemd-rc-local-generator[227538]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:50:01 localhost systemd-sysv-generator[227543]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 6 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:02 localhost python3.9[227656]: ansible-ansible.builtin.service_facts Invoked Dec 6 04:50:02 localhost network[227673]: You are using 'network' service provided by 'network-scripts', which are now deprecated. 
Dec 6 04:50:02 localhost network[227674]: 'network-scripts' will be removed from distribution in near future. Dec 6 04:50:02 localhost network[227675]: It is advised to switch to 'NetworkManager' instead for network management. Dec 6 04:50:03 localhost nova_compute[187174]: 2025-12-06 09:50:03.471 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:50:05 localhost nova_compute[187174]: 2025-12-06 09:50:05.827 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:50:06 localhost podman[227818]: 2025-12-06 09:50:06.548692462 +0000 UTC m=+0.079914642 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:50:06 localhost podman[227818]: 2025-12-06 09:50:06.584490519 +0000 UTC m=+0.115712699 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 04:50:06 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:50:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:50:06.669 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:50:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:50:06.670 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:50:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:50:06.671 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:50:07 localhost python3.9[227937]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:50:08 localhost python3.9[228048]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped 
daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:50:08 localhost nova_compute[187174]: 2025-12-06 09:50:08.503 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:08 localhost python3.9[228159]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:50:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21842 DF PROTO=TCP SPT=56074 DPT=9102 SEQ=3617421543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DBFDD880000000001030307) Dec 6 04:50:09 localhost python3.9[228270]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:50:10 localhost python3.9[228381]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:50:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:50:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:50:10 localhost nova_compute[187174]: 2025-12-06 09:50:10.838 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:10 localhost systemd[1]: tmp-crun.TjldUy.mount: Deactivated successfully. 
Dec 6 04:50:10 localhost podman[228494]: 2025-12-06 09:50:10.882025291 +0000 UTC m=+0.145586190 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Dec 6 04:50:10 localhost podman[228493]: 2025-12-06 09:50:10.84156311 +0000 UTC m=+0.107793963 container health_status 
4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:50:10 localhost podman[228494]: 2025-12-06 09:50:10.897278197 +0000 UTC m=+0.160839076 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 04:50:10 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:50:10 localhost podman[228493]: 2025-12-06 09:50:10.925316011 +0000 UTC m=+0.191546824 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 04:50:10 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 04:50:11 localhost python3.9[228492]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:50:11 localhost nova_compute[187174]: 2025-12-06 09:50:11.899 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:50:12 localhost python3.9[228646]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:50:13 localhost python3.9[228757]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:50:13 localhost nova_compute[187174]: 2025-12-06 09:50:13.543 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:13 localhost nova_compute[187174]: 2025-12-06 09:50:13.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:50:13 localhost nova_compute[187174]: 2025-12-06 09:50:13.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:50:13 localhost nova_compute[187174]: 2025-12-06 09:50:13.876 187178 
DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:50:14 localhost python3.9[228868]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:14 localhost nova_compute[187174]: 2025-12-06 09:50:14.548 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:50:14 localhost nova_compute[187174]: 2025-12-06 09:50:14.549 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:50:14 localhost nova_compute[187174]: 2025-12-06 09:50:14.549 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:50:14 localhost nova_compute[187174]: 2025-12-06 09:50:14.549 187178 DEBUG nova.objects.instance [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:50:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:50:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:50:14 localhost podman[228979]: 2025-12-06 09:50:14.995789634 +0000 UTC m=+0.093061112 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:50:15 localhost podman[228979]: 2025-12-06 09:50:15.024757167 +0000 UTC m=+0.122028665 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:50:15 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 04:50:15 localhost podman[228980]: 2025-12-06 09:50:15.044829843 +0000 UTC m=+0.138862560 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
config_id=edpm, container_name=ceilometer_agent_compute) Dec 6 04:50:15 localhost podman[228980]: 2025-12-06 09:50:15.059257682 +0000 UTC m=+0.153290459 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, 
managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:50:15 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:50:15 localhost python3.9[228978]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:15 localhost python3.9[229126]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:15 localhost nova_compute[187174]: 2025-12-06 09:50:15.839 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.055 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.071 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.072 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.073 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.073 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 
- - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.074 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.098 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.099 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.100 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.100 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 
04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.168 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:50:16 localhost openstack_network_exporter[199751]: ERROR 09:50:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:50:16 localhost openstack_network_exporter[199751]: ERROR 09:50:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:50:16 localhost openstack_network_exporter[199751]: ERROR 09:50:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:50:16 localhost openstack_network_exporter[199751]: ERROR 09:50:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:50:16 localhost openstack_network_exporter[199751]: Dec 6 04:50:16 localhost openstack_network_exporter[199751]: ERROR 09:50:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:50:16 localhost openstack_network_exporter[199751]: Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.250 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 
09:50:16.252 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:50:16 localhost python3.9[229236]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.298 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.299 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.371 187178 DEBUG oslo_concurrency.processutils [None 
req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.372 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.424 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.601 187178 WARNING nova.virt.libvirt.driver [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.602 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12590MB free_disk=387.30757904052734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.602 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.603 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.665 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.666 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.666 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.712 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.730 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:50:16 localhost 
nova_compute[187174]: 2025-12-06 09:50:16.733 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:50:16 localhost nova_compute[187174]: 2025-12-06 09:50:16.733 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:50:16 localhost python3.9[229358]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:17 localhost python3.9[229468]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:17 localhost nova_compute[187174]: 2025-12-06 09:50:17.536 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 
04:50:17 localhost nova_compute[187174]: 2025-12-06 09:50:17.537 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:50:17 localhost nova_compute[187174]: 2025-12-06 09:50:17.874 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:50:17 localhost nova_compute[187174]: 2025-12-06 09:50:17.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:50:18 localhost python3.9[229578]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:18 localhost nova_compute[187174]: 2025-12-06 09:50:18.590 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:18 localhost python3.9[229688]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:19 localhost python3.9[229798]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:19 localhost python3.9[229908]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:20 localhost python3.9[230018]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:20 localhost python3.9[230128]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None 
setype=None attributes=None Dec 6 04:50:20 localhost nova_compute[187174]: 2025-12-06 09:50:20.841 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:21 localhost python3.9[230238]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:21 localhost python3.9[230348]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:22 localhost python3.9[230458]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:22 localhost nova_compute[187174]: 2025-12-06 09:50:22.876 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:50:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:22.985 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 04:50:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:22.985 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:50:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:22.986 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 04:50:22 localhost python3.9[230568]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.008 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 52.35546875 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d0a89b3-3cf1-4a09-8c9e-58f338323c41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.35546875, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:50:22.986196', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'fbc87082-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.218435697, 'message_signature': 'ce4b03b4621b1adb40893667f13664e9568fb3cb565a426bded1243e7e262a6b'}]}, 'timestamp': '2025-12-06 09:50:23.009090', '_unique_id': 'ee780f3b9c694dc09189dcebada6ab2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.010 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.011 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.014 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da430a82-d62c-4cb8-8780-e9c8fd203568', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 123, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:50:23.012034', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'fbc961fe-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.222148832, 'message_signature': '40a8d3a451e3e6ba526141d9d2ee18399cf7e0eb5d445d54df0f3d4532a4f785'}]}, 'timestamp': '2025-12-06 09:50:23.015213', '_unique_id': 'c8ae3f73d5ee4274afabfb724ba7fbb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.016 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.017 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.035 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.035 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9ebba39-011e-4777-a83e-fda5f9e3d172', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:50:23.017464', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbcc82ee-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.227575091, 'message_signature': '2e99419c449b78d6485f1ec64ce9ef340c6955cbdfc8d6aad6718a7776ea9242'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:50:23.017464', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbcc9842-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.227575091, 'message_signature': '6a3db7344a8bd7b15c58f6a4ca35d000568d3756f8f986c054f0019140f4ebaa'}]}, 'timestamp': '2025-12-06 09:50:23.036256', '_unique_id': '0a156ab22c254e8bbc21c8e6f9e3322c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.037 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.039 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send
notification to notifications. Payload={'message_id': '1c7e01be-94a3-47f7-aa90-fa5a73c12881', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:50:23.039709', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'fbcd37e8-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.222148832, 'message_signature': '0161a3d1a44f47b292f9076dbee708ac14fffc97d459e3211d61aace5c4137d8'}]}, 'timestamp': '2025-12-06 09:50:23.040462', '_unique_id': '734d354a7de948c49772f92f059c37b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.041 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:50:23.043 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.043 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 55680000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c82762e4-e637-416d-b738-9305ffbcde86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 55680000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:50:23.043706', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'fbcdd3ec-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.218435697, 'message_signature': 
'9c7ab3c3783c5a9bd1ec50fa1a4d472d0c3b5f7fceaa42223d752ab87a6dc7ee'}]}, 'timestamp': '2025-12-06 09:50:23.044426', '_unique_id': '00f0b46abc254daf9f45e2bbaede1df2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR 
oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.047 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.091 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.092 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3c6fcf54-b0fe-4838-bce4-e1c8488add69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:50:23.048061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbd52868-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.258231287, 'message_signature': 'e8b9927ef99bb84e1bf0c058ded3a5e75b4d9509734913f192a0d68c27f9377f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:50:23.048061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbd53d44-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.258231287, 'message_signature': 'c4093327f939336bf148887fda0b0a0cd2e67503323bd92ec94e7dd826163798'}]}, 'timestamp': '2025-12-06 09:50:23.092940', '_unique_id': '715ef5bca7e749508f85489115fff32a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.094 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.096 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '90a37275-08be-4ce4-9d74-7f9db03564c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:50:23.096080', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'fbd5cfb6-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.222148832, 'message_signature': '177bad6b38061ddca28c5df9fbcbfab6392ee5cb147ad221a461324259eafc43'}]}, 'timestamp': '2025-12-06 09:50:23.096659', '_unique_id': 'f83e32f2edfd4c3fb9f0c3b43f88036a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.097 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:50:23.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.098 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.098 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 10055 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9833aa8f-3e26-4542-8c0a-fa6df4897077', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10055, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:50:23.098373', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'fbd6229a-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.222148832, 'message_signature': '6594469d1f382b5c4a4f522ecf13db01dae10c582292e272db6d4567964323a5'}]}, 'timestamp': '2025-12-06 09:50:23.100032', '_unique_id': 'c73b8d69af9844d8a8f15571ed482218'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in 
_connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:50:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.100 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.101 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.101 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.101 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e440cb7-d827-4faa-9ee8-74139479e731', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:50:23.101527', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbd69d74-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.227575091, 'message_signature': '76bea7e0c9ed368c65a781946cfaff9f22b664a7aa9f7ceee893f8081b837650'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:50:23.101527', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbd6aaee-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.227575091, 'message_signature': 'b5c467ab00b141e6bc3f7a528e9a6ff462fac77204a43e341b927eecf82c1f01'}]}, 'timestamp': '2025-12-06 09:50:23.102165', '_unique_id': 'a02d70add8694ecba1218f0dcdeef9b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.102 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.103 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.103 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76ef8938-1341-4951-bccd-eb9b74370b88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:50:23.103622', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbd6ef22-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.258231287, 'message_signature': 'b88f04c232ed0c679c0221b67363a1c3dfa3561b6d2f80a396d18ee581d263fd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:50:23.103622', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbd6fc06-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.258231287, 'message_signature': 'db13b1cd256f96f29eac4796955f4f42af67b1a84e671a002bd6bd881d7d6d09'}]}, 'timestamp': '2025-12-06 09:50:23.104234', '_unique_id': 'b4eeb69164c2415a9e4d23894389dfbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.104 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.105 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b3ec9dd-7b30-40db-91bf-4492b682318d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:50:23.105607', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'fbd73c98-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.222148832, 'message_signature': '676d22cdc0aff2948ed433a6a8961e6c5ab58e6c8ca8ebed56a443a2411227f4'}]}, 'timestamp': '2025-12-06 09:50:23.105928', '_unique_id': '26e15146744f44ad9e42b9ebce5c8fdc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:50:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.106 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:50:23.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.107 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 10762 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b411b9d-e9a3-4acb-8264-e331bec9e588', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10762, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:50:23.107257', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': 
None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'fbd77cf8-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.222148832, 'message_signature': '223d81624d5cc95ccbb3fc30707fe555a5c74fbab5760786987bfb9957eaeb89'}]}, 'timestamp': '2025-12-06 09:50:23.107580', '_unique_id': '26685eb421b049d39fafc75661198087'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.108 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31064064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b9c76d56-c2a3-41b4-b4b9-bd26057970d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31064064, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:50:23.108967', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbd7bfd8-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.227575091, 'message_signature': '85fbb83e751963e5818cd64e16179af9e87be2a49e705f6ab09b7df6c911d56f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:50:23.108967', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 
'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbd7ca6e-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.227575091, 'message_signature': '91bc4ecc50979f3d5436c6d65d32ffa4c6def106696a3dd165da8c383a608473'}]}, 'timestamp': '2025-12-06 09:50:23.109518', '_unique_id': '85fc6ca1802b409191295871ca28c9b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.110 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '14f488b3-50c0-4642-92a6-213f0769b63d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:50:23.110924', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'fbd80c68-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.222148832, 'message_signature': 'da83966a8e7742b216a435b54e387dfdb0d12e3da2c011d2b6352fca7234d0de'}]}, 'timestamp': '2025-12-06 09:50:23.111244', '_unique_id': 'aff8d19a35ed4f918326121e726997a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.111 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:50:23.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.112 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 301237008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.112 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 37411248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9330f60-d4c6-4afa-a2d1-c45acd00098c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 301237008, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:50:23.112671', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbd8504c-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.258231287, 'message_signature': 'd2d343bdc356e1caeb8868c5ac1896f2530c848b094f0983cc27d2e9e5b0e783'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 37411248, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:50:23.112671', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbd85baa-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.258231287, 'message_signature': '6465be717c3e869a4738aa73afa594acd7a36fcc138ca639b59c57600b80e2df'}]}, 'timestamp': '2025-12-06 09:50:23.113235', '_unique_id': '1a7856a2c1344359ab2f2f224f171ffd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:50:23.113 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.113 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.114 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81eb4ff9-4476-4d6f-a73a-eb8bc8633d1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:50:23.114582', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'fbd89b06-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.222148832, 'message_signature': '3b54911527d37b7ab1051cebd998de1dd850409526d63859a5ab8577f3d04987'}]}, 'timestamp': '2025-12-06 09:50:23.114895', '_unique_id': '3cff7b6416964ae3b009c266470753e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.115 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.116 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.116 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.116 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'afc4f91d-ce85-4c9b-92fb-b8ca51688a2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:50:23.116337', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbd8df94-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.258231287, 'message_signature': 'e06dd25e0d0640b62739f352701bb246239e3d8808e0aa93cb59c663df69c38b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:50:23.116337', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbd8e9da-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.258231287, 'message_signature': '8ca1aa6b76a78a8344415d11c9209e7ba04d0229f3782e3d2d7d43fa4dec6d6a'}]}, 'timestamp': '2025-12-06 09:50:23.116894', '_unique_id': '0f4dbe953384424985e7ba5609348a8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.117 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.118 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1488c211-1fc2-4561-a5f1-eeab290e2b93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:50:23.118261', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'fbd92ab2-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.222148832, 'message_signature': 'd27b13d32346a7caabccb84742b62df9e18681efdee611db7a071940bce2fa16'}]}, 'timestamp': '2025-12-06 09:50:23.118552', '_unique_id': 'bf4ff628c4704428b62d276cdcc7b6c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.119 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 947163713 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.120 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 9516486 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09f966fa-6cd6-43a6-9e26-98ffc125dd33', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 947163713, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:50:23.119927', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbd96bda-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.258231287, 'message_signature': '14a7e819a559e75ac75bbfb46b2347575a581b61a2b135bff3d6bd54fb4bfa4e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9516486, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:50:23.119927', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbd97670-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.258231287, 'message_signature': '964b0bc37b5ac2acc2e6bef793bcf68b98f7f5cfb058fa9fffb1775553e47753'}]}, 'timestamp': '2025-12-06 09:50:23.120473', '_unique_id': '8444b3c0f99d4f08a80124fa08dd8548'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:50:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:50:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.121 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 73904128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.122 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5ce6444e-2bb5-42d1-983d-feee733fbcb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73904128, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:50:23.121935', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbd9ba72-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.258231287, 'message_signature': 'bcf07252902880ac4ad02f783ec01c05232550133ead96a2433ec093a53d87fc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:50:23.121935', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbd9c4fe-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.258231287, 'message_signature': '50b15cd0f63ede561dde027ff55a13ed69f85c747675c7370bc087fad884763c'}]}, 'timestamp': '2025-12-06 09:50:23.122484', '_unique_id': 'b0b9e88b3a394893a861b2e0119f8286'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.123 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 100 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '754ca1ba-17a5-4a4a-8be8-799098508b58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 100, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:50:23.123908', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'fbda0798-d288-11f0-8fed-fa163edf398d', 'monotonic_time': 11010.222148832, 'message_signature': '68894534ad061bff92419ac583f0e2d8a0049a420243951ee869f73a9e5277e8'}]}, 'timestamp': '2025-12-06 09:50:23.124207', '_unique_id': '4fcbe0463ec843d0a00dbfadb5e54b4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:50:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:50:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:50:23.124 12 ERROR oslo_messaging.notify.messaging Dec 6 04:50:23 localhost podman[197801]: 
time="2025-12-06T09:50:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:50:23 localhost podman[197801]: @ - - [06/Dec/2025:09:50:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 143413 "" "Go-http-client/1.1" Dec 6 04:50:23 localhost podman[197801]: @ - - [06/Dec/2025:09:50:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15442 "" "Go-http-client/1.1" Dec 6 04:50:23 localhost nova_compute[187174]: 2025-12-06 09:50:23.625 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17649 DF PROTO=TCP SPT=54084 DPT=9102 SEQ=3369364799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC017090000000001030307) Dec 6 04:50:24 localhost python3.9[230678]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:50:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17650 DF PROTO=TCP SPT=54084 DPT=9102 SEQ=3369364799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC01B070000000001030307) Dec 6 04:50:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:50:25 localhost podman[230698]: 2025-12-06 09:50:25.557662761 +0000 UTC m=+0.087285043 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Dec 6 04:50:25 localhost podman[230698]: 2025-12-06 09:50:25.565259138 +0000 UTC m=+0.094881410 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 6 04:50:25 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 04:50:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21843 DF PROTO=TCP SPT=56074 DPT=9102 SEQ=3617421543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC01D870000000001030307) Dec 6 04:50:25 localhost nova_compute[187174]: 2025-12-06 09:50:25.844 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:26 localhost python3.9[230808]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Dec 6 04:50:26 localhost python3.9[230918]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 6 04:50:26 localhost systemd[1]: Reloading. Dec 6 04:50:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17651 DF PROTO=TCP SPT=54084 DPT=9102 SEQ=3369364799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC023070000000001030307) Dec 6 04:50:27 localhost systemd-rc-local-generator[230942]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 6 04:50:27 localhost systemd-sysv-generator[230948]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 6 04:50:27 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:27 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:27 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:27 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 6 04:50:27 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:27 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:27 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:27 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 6 04:50:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:50:27 localhost podman[230955]: 2025-12-06 09:50:27.414417202 +0000 UTC m=+0.084622719 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:50:27 localhost podman[230955]: 2025-12-06 09:50:27.448280608 +0000 UTC m=+0.118486115 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:50:27 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:50:28 localhost python3.9[231086]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:50:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15930 DF PROTO=TCP SPT=41008 DPT=9102 SEQ=1187329275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC027880000000001030307) Dec 6 04:50:28 localhost python3.9[231197]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:50:28 localhost nova_compute[187174]: 2025-12-06 09:50:28.666 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:29 localhost python3.9[231308]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:50:29 localhost sshd[231381]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:50:29 localhost python3.9[231421]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:50:30 localhost 
python3.9[231532]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:50:30 localhost nova_compute[187174]: 2025-12-06 09:50:30.847 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17652 DF PROTO=TCP SPT=54084 DPT=9102 SEQ=3369364799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC032C70000000001030307) Dec 6 04:50:31 localhost python3.9[231643]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:50:31 localhost python3.9[231754]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:50:32 localhost python3.9[231865]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 04:50:33 localhost nova_compute[187174]: 2025-12-06 09:50:33.702 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:35 localhost python3.9[231976]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:35 localhost nova_compute[187174]: 2025-12-06 09:50:35.849 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:36 localhost python3.9[232086]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:36 localhost python3.9[232196]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:50:37 localhost podman[232307]: 2025-12-06 09:50:37.368874011 +0000 UTC m=+0.086480908 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 04:50:37 localhost podman[232307]: 2025-12-06 09:50:37.410417786 +0000 UTC m=+0.128024693 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 04:50:37 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:50:37 localhost python3.9[232306]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:38 localhost python3.9[232441]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:38 localhost python3.9[232551]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:38 localhost nova_compute[187174]: 2025-12-06 09:50:38.741 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:39 localhost python3.9[232661]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None 
selevel=None attributes=None Dec 6 04:50:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17653 DF PROTO=TCP SPT=54084 DPT=9102 SEQ=3369364799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC053870000000001030307) Dec 6 04:50:40 localhost python3.9[232771]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:40 localhost python3.9[232881]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:40 localhost nova_compute[187174]: 2025-12-06 09:50:40.852 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:50:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:50:41 localhost systemd[1]: tmp-crun.jY4UvC.mount: Deactivated successfully. 
Dec 6 04:50:41 localhost podman[232993]: 2025-12-06 09:50:41.183254638 +0000 UTC m=+0.077295151 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 04:50:41 localhost podman[232993]: 2025-12-06 09:50:41.215666558 +0000 UTC m=+0.109707051 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 04:50:41 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:50:41 localhost podman[232992]: 2025-12-06 09:50:41.224974749 +0000 UTC m=+0.119565229 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:50:41 localhost python3.9[232991]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:41 localhost podman[232992]: 2025-12-06 09:50:41.30519607 +0000 UTC m=+0.199786480 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:50:41 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:50:43 localhost nova_compute[187174]: 2025-12-06 09:50:43.785 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:50:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:50:45 localhost podman[233052]: 2025-12-06 09:50:45.61736775 +0000 UTC m=+0.134471604 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:50:45 localhost podman[233052]: 2025-12-06 09:50:45.658658347 +0000 UTC m=+0.175762221 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 04:50:45 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:50:45 localhost podman[233051]: 2025-12-06 09:50:45.581720508 +0000 UTC m=+0.101312339 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent) Dec 6 04:50:45 localhost podman[233051]: 2025-12-06 09:50:45.711621068 +0000 UTC m=+0.231212929 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 6 04:50:45 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:50:45 localhost nova_compute[187174]: 2025-12-06 09:50:45.856 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:46 localhost openstack_network_exporter[199751]: ERROR 09:50:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:50:46 localhost openstack_network_exporter[199751]: ERROR 09:50:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:50:46 localhost openstack_network_exporter[199751]: ERROR 09:50:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:50:46 localhost openstack_network_exporter[199751]: ERROR 09:50:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:50:46 localhost openstack_network_exporter[199751]: Dec 6 04:50:46 localhost openstack_network_exporter[199751]: ERROR 09:50:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:50:46 localhost 
openstack_network_exporter[199751]: Dec 6 04:50:46 localhost python3.9[233181]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Dec 6 04:50:47 localhost sshd[233200]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:50:48 localhost systemd-logind[760]: New session 46 of user zuul. Dec 6 04:50:48 localhost systemd[1]: Started Session 46 of User zuul. Dec 6 04:50:48 localhost systemd[1]: session-46.scope: Deactivated successfully. Dec 6 04:50:48 localhost systemd-logind[760]: Session 46 logged out. Waiting for processes to exit. Dec 6 04:50:48 localhost systemd-logind[760]: Removed session 46. Dec 6 04:50:48 localhost nova_compute[187174]: 2025-12-06 09:50:48.826 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:48 localhost python3.9[233311]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:50:49 localhost python3.9[233397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014648.4801383-3017-122953886287585/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:50 localhost python3.9[233505]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:50:50 localhost 
python3.9[233560]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:50 localhost nova_compute[187174]: 2025-12-06 09:50:50.858 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:51 localhost python3.9[233668]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:50:51 localhost python3.9[233754]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014650.9335272-3017-60922679696962/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:52 localhost python3.9[233862]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:50:52 localhost python3.9[233948]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 
setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014652.0090494-3017-194304615418218/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=4a38887ad65f37f06de8a6f0571c8572a75472b7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:53 localhost podman[197801]: time="2025-12-06T09:50:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:50:53 localhost podman[197801]: @ - - [06/Dec/2025:09:50:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 143413 "" "Go-http-client/1.1" Dec 6 04:50:53 localhost podman[197801]: @ - - [06/Dec/2025:09:50:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15444 "" "Go-http-client/1.1" Dec 6 04:50:53 localhost python3.9[234056]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:50:53 localhost nova_compute[187174]: 2025-12-06 09:50:53.859 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10245 DF PROTO=TCP SPT=39214 DPT=9102 SEQ=54658559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC08C390000000001030307) Dec 6 04:50:54 localhost python3.9[234142]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 
setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014653.1132455-3017-20000177268688/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:54 localhost python3.9[234250]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:50:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10246 DF PROTO=TCP SPT=39214 DPT=9102 SEQ=54658559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC090480000000001030307) Dec 6 04:50:55 localhost python3.9[234336]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014654.3357463-3017-79399412511895/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:50:55 localhost nova_compute[187174]: 2025-12-06 09:50:55.861 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=17654 DF PROTO=TCP SPT=54084 DPT=9102 SEQ=3369364799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC093870000000001030307) Dec 6 04:50:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:50:56 localhost podman[234447]: 2025-12-06 09:50:56.014174139 +0000 UTC m=+0.095742126 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 6 04:50:56 localhost podman[234447]: 2025-12-06 09:50:56.058223333 +0000 UTC m=+0.139791310 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 6 04:50:56 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:50:56 localhost python3.9[234446]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:56 localhost python3.9[234575]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10247 DF PROTO=TCP SPT=39214 DPT=9102 SEQ=54658559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC098470000000001030307) Dec 6 04:50:57 localhost python3.9[234685]: 
ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:50:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21844 DF PROTO=TCP SPT=56074 DPT=9102 SEQ=3617421543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC09B870000000001030307) Dec 6 04:50:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:50:58 localhost podman[234797]: 2025-12-06 09:50:58.082971812 +0000 UTC m=+0.083861566 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 
'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:50:58 localhost podman[234797]: 2025-12-06 09:50:58.094273994 +0000 UTC m=+0.095163738 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:50:58 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:50:58 localhost python3.9[234798]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:50:58 localhost nova_compute[187174]: 2025-12-06 09:50:58.893 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:50:58 localhost python3.9[234929]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:50:59 localhost python3.9[235039]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:51:00 localhost nova_compute[187174]: 2025-12-06 09:51:00.863 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:00 localhost python3.9[235094]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file 
path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:51:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10248 DF PROTO=TCP SPT=39214 DPT=9102 SEQ=54658559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC0A8080000000001030307) Dec 6 04:51:01 localhost python3.9[235202]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 6 04:51:01 localhost python3.9[235257]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 6 04:51:02 localhost python3.9[235367]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Dec 6 04:51:03 localhost python3.9[235477]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Dec 6 04:51:03 localhost nova_compute[187174]: 2025-12-06 09:51:03.922 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:04 localhost python3[235587]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False Dec 6 04:51:05 localhost python3[235587]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3",#012 "Digest": "sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-01T06:31:10.62653219Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1211779450,#012 "VirtualSize": 1211779450,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",#012 "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",#012 "sha256:baa8e0bc73d6b505f07c40d4f69a464312cc41ae2045c7975dd4759c27721a22",#012 "sha256:d0cde44181262e43c105085c32a5af158b232f2e2ce4fe4b50530d7cdc5126cd"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": 
"/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-01T06:09:28.025707917Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025744608Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025767729Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025791379Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.02581523Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025867611Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.469442331Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:10:02.029095017Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 
'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Dec 6 04:51:05 localhost python3.9[235757]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:51:05 localhost nova_compute[187174]: 2025-12-06 09:51:05.866 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:51:06.671 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:51:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:51:06.671 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:51:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:51:06.673 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:51:07 localhost python3.9[235869]: ansible-container_config_data Invoked with 
config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Dec 6 04:51:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:51:07 localhost systemd[1]: tmp-crun.o1LvdB.mount: Deactivated successfully. Dec 6 04:51:07 localhost podman[235949]: 2025-12-06 09:51:07.542344874 +0000 UTC m=+0.077634042 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2) Dec 6 04:51:07 localhost podman[235949]: 2025-12-06 09:51:07.596955706 +0000 UTC m=+0.132244854 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, container_name=ovn_controller) Dec 6 04:51:07 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:51:07 localhost python3.9[236001]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Dec 6 04:51:08 localhost python3[236114]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False Dec 6 04:51:08 localhost nova_compute[187174]: 2025-12-06 09:51:08.954 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:08 localhost python3[236114]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3",#012 "Digest": "sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-01T06:31:10.62653219Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "StopSignal": 
"SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1211779450,#012 "VirtualSize": 1211779450,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",#012 "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",#012 "sha256:baa8e0bc73d6b505f07c40d4f69a464312cc41ae2045c7975dd4759c27721a22",#012 "sha256:d0cde44181262e43c105085c32a5af158b232f2e2ce4fe4b50530d7cdc5126cd"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": 
"2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-01T06:09:28.025707917Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025744608Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025767729Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025791379Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.02581523Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.025867611Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:09:28.469442331Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-01T06:10:02.029095017Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main 
override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Dec 6 04:51:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10249 DF PROTO=TCP SPT=39214 DPT=9102 SEQ=54658559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC0C7870000000001030307) Dec 6 04:51:09 localhost python3.9[236288]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:51:10 localhost python3.9[236400]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:51:10 localhost nova_compute[187174]: 2025-12-06 09:51:10.869 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:11 localhost 
python3.9[236509]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014670.5758731-3695-133086619958455/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 6 04:51:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:51:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:51:11 localhost podman[236565]: 2025-12-06 09:51:11.542396881 +0000 UTC m=+0.085614481 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', 
'/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:51:11 localhost podman[236565]: 2025-12-06 09:51:11.555265031 +0000 UTC m=+0.098482601 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:51:11 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 04:51:11 localhost podman[236566]: 2025-12-06 09:51:11.637057672 +0000 UTC m=+0.141878765 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:51:11 localhost podman[236566]: 2025-12-06 09:51:11.675363066 +0000 UTC m=+0.180184149 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125) Dec 6 04:51:11 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:51:11 localhost python3.9[236564]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 6 04:51:13 localhost nova_compute[187174]: 2025-12-06 09:51:13.871 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:51:14 localhost nova_compute[187174]: 2025-12-06 09:51:13.998 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:14 localhost python3.9[236713]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:51:14 localhost nova_compute[187174]: 2025-12-06 09:51:14.874 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:51:14 localhost nova_compute[187174]: 2025-12-06 09:51:14.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:51:14 localhost nova_compute[187174]: 2025-12-06 09:51:14.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:51:15 localhost python3.9[236821]: 
ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 6 04:51:15 localhost nova_compute[187174]: 2025-12-06 09:51:15.469 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:51:15 localhost nova_compute[187174]: 2025-12-06 09:51:15.469 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:51:15 localhost nova_compute[187174]: 2025-12-06 09:51:15.470 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:51:15 localhost nova_compute[187174]: 2025-12-06 09:51:15.470 187178 DEBUG nova.objects.instance [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:51:15 localhost nova_compute[187174]: 2025-12-06 09:51:15.876 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:15 localhost python3.9[236929]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False 
checksum_algorithm=sha1 Dec 6 04:51:16 localhost openstack_network_exporter[199751]: ERROR 09:51:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:51:16 localhost openstack_network_exporter[199751]: ERROR 09:51:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:51:16 localhost openstack_network_exporter[199751]: ERROR 09:51:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:51:16 localhost openstack_network_exporter[199751]: ERROR 09:51:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:51:16 localhost openstack_network_exporter[199751]: Dec 6 04:51:16 localhost openstack_network_exporter[199751]: ERROR 09:51:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:51:16 localhost openstack_network_exporter[199751]: Dec 6 04:51:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:51:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:51:16 localhost podman[236986]: 2025-12-06 09:51:16.559270711 +0000 UTC m=+0.087935224 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible) Dec 6 04:51:16 localhost podman[236986]: 2025-12-06 09:51:16.597225594 +0000 UTC m=+0.125890077 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:51:16 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.618 187178 DEBUG nova.network.neutron [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:51:16 localhost systemd[1]: tmp-crun.UowNA3.mount: Deactivated successfully. 
Dec 6 04:51:16 localhost podman[236985]: 2025-12-06 09:51:16.647100569 +0000 UTC m=+0.177329910 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.652 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.652 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.653 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.653 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.673 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.673 187178 DEBUG 
oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.674 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.674 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:51:16 localhost podman[236985]: 2025-12-06 09:51:16.682461772 +0000 UTC m=+0.212691073 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Dec 6 04:51:16 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.751 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.822 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.824 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.894 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 
2025-12-06 09:51:16.896 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.965 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:51:16 localhost nova_compute[187174]: 2025-12-06 09:51:16.966 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.017 187178 DEBUG oslo_concurrency.processutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:51:17 localhost python3.9[237078]: 
ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None 
publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.225 187178 WARNING nova.virt.libvirt.driver [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:51:17 localhost systemd-journald[38691]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 117.7 (392 of 333 items), suggesting rotation. Dec 6 04:51:17 localhost systemd-journald[38691]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 6 04:51:17 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.226 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12569MB free_disk=387.3062744140625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", 
"vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.226 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.227 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:51:17 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.374 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.374 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.375 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.435 187178 DEBUG nova.compute.provider_tree [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.469 187178 DEBUG nova.scheduler.client.report [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:51:17 localhost 
nova_compute[187174]: 2025-12-06 09:51:17.472 187178 DEBUG nova.compute.resource_tracker [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.472 187178 DEBUG oslo_concurrency.lockutils [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.695 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.715 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:51:17 localhost nova_compute[187174]: 2025-12-06 09:51:17.875 187178 DEBUG nova.compute.manager [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:51:18 localhost python3.9[237221]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 6 04:51:18 localhost systemd[1]: Stopping nova_compute container... Dec 6 04:51:18 localhost nova_compute[187174]: 2025-12-06 09:51:18.264 187178 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Dec 6 04:51:18 localhost nova_compute[187174]: 2025-12-06 09:51:18.875 187178 DEBUG oslo_service.periodic_task [None req-4db9cc93-1fc6-455e-bb18-a5d1a9799597 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:51:19 localhost nova_compute[187174]: 2025-12-06 09:51:19.033 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:20 localhost nova_compute[187174]: 2025-12-06 09:51:20.874 187178 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:23 localhost nova_compute[187174]: 2025-12-06 09:51:23.060 187178 WARNING amqp [-] Received method (60, 30) during closing channel 1. 
This method will be ignored#033[00m Dec 6 04:51:23 localhost nova_compute[187174]: 2025-12-06 09:51:23.063 187178 DEBUG oslo_concurrency.lockutils [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:51:23 localhost nova_compute[187174]: 2025-12-06 09:51:23.063 187178 DEBUG oslo_concurrency.lockutils [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:51:23 localhost nova_compute[187174]: 2025-12-06 09:51:23.064 187178 DEBUG oslo_concurrency.lockutils [None req-8af623f6-83a8-43b2-aacb-98a8d5b7f772 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:51:23 localhost podman[197801]: time="2025-12-06T09:51:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:51:23 localhost podman[197801]: @ - - [06/Dec/2025:09:51:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 143414 "" "Go-http-client/1.1" Dec 6 04:51:23 localhost podman[197801]: @ - - [06/Dec/2025:09:51:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15319 "" "Go-http-client/1.1" Dec 6 04:51:23 localhost journal[161777]: End of file while reading data: Input/output error Dec 6 04:51:23 localhost systemd[1]: libpod-5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486.scope: Deactivated successfully. Dec 6 04:51:23 localhost systemd[1]: libpod-5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486.scope: Consumed 12.890s CPU time. 
Dec 6 04:51:23 localhost podman[237225]: 2025-12-06 09:51:23.449014591 +0000 UTC m=+5.250053708 container died 5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute, org.label-schema.license=GPLv2) Dec 6 04:51:23 localhost systemd[1]: tmp-crun.x7NBlH.mount: Deactivated successfully. Dec 6 04:51:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486-userdata-shm.mount: Deactivated successfully. 
Dec 6 04:51:23 localhost podman[237225]: 2025-12-06 09:51:23.607090861 +0000 UTC m=+5.408129958 container cleanup 5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Dec 6 04:51:23 localhost podman[237225]: nova_compute Dec 6 04:51:23 localhost podman[237264]: error opening file `/run/crun/5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486/status`: No such file or directory Dec 6 04:51:23 localhost podman[237253]: 2025-12-06 09:51:23.711048452 +0000 UTC m=+0.067318500 container cleanup 
5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:51:23 localhost podman[237253]: nova_compute Dec 6 04:51:23 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Dec 6 04:51:23 localhost systemd[1]: Stopped nova_compute container. Dec 6 04:51:23 localhost systemd[1]: Starting nova_compute container... Dec 6 04:51:23 localhost systemd[1]: Started libcrun container. 
Dec 6 04:51:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Dec 6 04:51:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Dec 6 04:51:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 6 04:51:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 6 04:51:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44e2b2b0e3a4403fc02d8910c6f74be67815868ebcac65117d9ea4fa1ca15530/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 6 04:51:23 localhost podman[237266]: 2025-12-06 09:51:23.854232456 +0000 UTC m=+0.115735620 container init 5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:51:23 localhost podman[237266]: 2025-12-06 09:51:23.863630999 +0000 UTC m=+0.125134153 container start 5c82a5cc28e0c5ca0a5845cd464a5c4ce9d5a7c9012a63253b9449a2f1342486 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true) Dec 6 04:51:23 localhost podman[237266]: nova_compute Dec 6 04:51:23 localhost nova_compute[237281]: + sudo -E kolla_set_configs Dec 6 04:51:23 localhost systemd[1]: Started nova_compute container. Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Validating config file Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Copying service configuration files Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Deleting /etc/nova/nova.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Setting permission for /etc/nova/nova.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Copying 
/var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Deleting /etc/ceph Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Creating directory /etc/ceph Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Setting permission for /etc/ceph Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Dec 6 04:51:23 localhost 
nova_compute[237281]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Deleting /usr/sbin/iscsiadm Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Writing out command to execute Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Dec 6 04:51:23 localhost nova_compute[237281]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Dec 6 04:51:23 localhost nova_compute[237281]: ++ cat /run_command Dec 6 04:51:23 localhost nova_compute[237281]: + CMD=nova-compute Dec 6 04:51:23 localhost nova_compute[237281]: + ARGS= Dec 6 04:51:23 localhost nova_compute[237281]: + sudo kolla_copy_cacerts Dec 6 04:51:23 localhost nova_compute[237281]: + [[ ! -n '' ]] Dec 6 04:51:23 localhost nova_compute[237281]: + . 
kolla_extend_start Dec 6 04:51:23 localhost nova_compute[237281]: Running command: 'nova-compute' Dec 6 04:51:23 localhost nova_compute[237281]: + echo 'Running command: '\''nova-compute'\''' Dec 6 04:51:23 localhost nova_compute[237281]: + umask 0022 Dec 6 04:51:23 localhost nova_compute[237281]: + exec nova-compute Dec 6 04:51:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14387 DF PROTO=TCP SPT=50264 DPT=9102 SEQ=2109830551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC1016A0000000001030307) Dec 6 04:51:24 localhost python3.9[237403]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None 
healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Dec 6 04:51:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14388 DF PROTO=TCP SPT=50264 DPT=9102 SEQ=2109830551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC105870000000001030307) Dec 6 04:51:25 localhost systemd[1]: Started libpod-conmon-9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00.scope. Dec 6 04:51:25 localhost systemd[1]: Started libcrun container. 
Dec 6 04:51:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/921081be0d3584208f178cb345f2615cfcd1617609a73e0727b158d9f013eee4/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Dec 6 04:51:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/921081be0d3584208f178cb345f2615cfcd1617609a73e0727b158d9f013eee4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 6 04:51:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/921081be0d3584208f178cb345f2615cfcd1617609a73e0727b158d9f013eee4/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Dec 6 04:51:25 localhost podman[237429]: 2025-12-06 09:51:25.202582996 +0000 UTC m=+0.130425847 container init 9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2)
Dec 6 04:51:25 localhost podman[237429]: 2025-12-06 09:51:25.217081758 +0000 UTC m=+0.144924599 container start 9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 04:51:25 localhost python3.9[237403]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Applying nova statedir ownership
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93 already 42436:42436
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93 to system_u:object_r:container_file_t:s0
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.info
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/console.log
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/_base/3e070c3db7ba7309de3805d58aaf4369c4bd45c2
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 107 gid: 107 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-3e070c3db7ba7309de3805d58aaf4369c4bd45c2
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/20273498b7380904530133bcb3f720bd45f4f00b810dc4597d81d23acd8f9673
Dec 6 04:51:25 localhost nova_compute_init[237448]: INFO:nova_statedir:Nova statedir ownership complete
Dec 6 04:51:25 localhost systemd[1]: libpod-9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00.scope: Deactivated successfully.
Dec 6 04:51:25 localhost podman[237461]: 2025-12-06 09:51:25.346280586 +0000 UTC m=+0.047562624 container died 9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 6 04:51:25 localhost podman[237461]: 2025-12-06 09:51:25.422551415 +0000 UTC m=+0.123833403 container cleanup 9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 6 04:51:25 localhost systemd[1]: libpod-conmon-9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00.scope: Deactivated successfully.
Dec 6 04:51:25 localhost systemd[1]: var-lib-containers-storage-overlay-921081be0d3584208f178cb345f2615cfcd1617609a73e0727b158d9f013eee4-merged.mount: Deactivated successfully.
Dec 6 04:51:25 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9a8c0d333f58e265f334be2ca071fc6918e2209d87f0ab653f2e7d0029e3ec00-userdata-shm.mount: Deactivated successfully.
Dec 6 04:51:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10250 DF PROTO=TCP SPT=39214 DPT=9102 SEQ=54658559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC107870000000001030307)
Dec 6 04:51:25 localhost nova_compute[237281]: 2025-12-06 09:51:25.719 237285 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 6 04:51:25 localhost nova_compute[237281]: 2025-12-06 09:51:25.719 237285 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 6 04:51:25 localhost nova_compute[237281]: 2025-12-06 09:51:25.719 237285 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 6 04:51:25 localhost nova_compute[237281]: 2025-12-06 09:51:25.719 237285 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec 6 04:51:25 localhost nova_compute[237281]: 2025-12-06 09:51:25.853 237285 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 6 04:51:25 localhost nova_compute[237281]: 2025-12-06 09:51:25.876 237285 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 6 04:51:25 localhost nova_compute[237281]: 2025-12-06 09:51:25.876 237285 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec 6 04:51:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.
Dec 6 04:51:26 localhost systemd[1]: session-45.scope: Deactivated successfully.
Dec 6 04:51:26 localhost systemd[1]: session-45.scope: Consumed 1min 30.080s CPU time.
Dec 6 04:51:26 localhost systemd-logind[760]: Session 45 logged out. Waiting for processes to exit.
Dec 6 04:51:26 localhost systemd-logind[760]: Removed session 45.
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.486 237285 INFO nova.virt.driver [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec 6 04:51:26 localhost systemd[1]: tmp-crun.c9mWyj.mount: Deactivated successfully.
Dec 6 04:51:26 localhost podman[237509]: 2025-12-06 09:51:26.584282536 +0000 UTC m=+0.106889874 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal)
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.598 237285 INFO nova.compute.provider_config [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.620 237285 DEBUG oslo_concurrency.lockutils [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.621 237285 DEBUG oslo_concurrency.lockutils [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.621 237285 DEBUG oslo_concurrency.lockutils [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.621 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.621 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.621 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.621 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.622 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.622 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.622 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.622 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.622 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.622 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.622 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.623 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.623 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.623 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.623 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.623 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.623 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.623 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.624 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.624 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] console_host = np0005548798.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.624 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.624 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.624 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.624 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.624 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.624 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.625 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.625 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.625 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.625 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.625 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.625 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.625 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.626 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.626 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.626 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.626 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.626 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.626 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] host = np0005548798.ooo.test log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.626 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.627 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.627 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.627 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.627 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.627 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.627 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.627 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.628 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.628 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.628 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.628 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.628 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.628 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.628 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.628 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.629 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.629 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.629 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.629 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.629 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.629 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.629 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.630 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.630 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.630 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.630 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.630 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.630 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.630 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.630 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.631 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.631 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.631 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.631 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.631 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.631 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.631 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.631 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.632 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.632 237285 DEBUG oslo_service.service [None
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.632 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.632 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.632 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.632 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.632 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.633 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.633 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.633 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.633 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.633 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.633 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.633 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.633 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.634 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] publish_errors = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.634 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.634 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.634 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.634 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.634 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.634 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.635 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 
localhost nova_compute[237281]: 2025-12-06 09:51:26.635 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.635 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.635 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.635 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.635 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.635 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.635 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.636 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.636 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.636 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.636 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.636 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.636 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.636 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.636 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - 
-] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.637 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.637 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.637 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.637 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.637 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.637 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.637 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] shelved_poll_interval = 3600 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.637 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.638 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.638 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.638 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.638 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.638 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.638 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost 
nova_compute[237281]: 2025-12-06 09:51:26.638 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.638 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.639 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.639 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.639 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.639 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.639 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.639 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - 
-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.639 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.640 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.640 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.640 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.640 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.640 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.640 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost 
nova_compute[237281]: 2025-12-06 09:51:26.640 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.640 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.641 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.641 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.641 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.641 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.641 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost 
nova_compute[237281]: 2025-12-06 09:51:26.641 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.641 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.642 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.642 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.642 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.642 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.642 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.dhcp_domain = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.642 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.642 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.643 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.643 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.643 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.643 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.643 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.local_metadata_per_cell = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.643 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.643 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.643 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.644 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.644 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.644 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.644 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.644 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.644 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.644 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.645 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.645 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.645 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.645 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.backend_argument = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.645 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.645 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.645 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.645 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.646 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.646 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.646 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.646 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.646 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.646 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.646 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.647 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.647 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.647 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.647 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.647 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.647 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.647 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.647 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.648 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.648 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.648 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.648 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.648 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.648 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.648 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.649 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.649 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.649 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.649 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.649 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.649 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.649 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.649 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.650 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.650 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.650 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.650 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.650 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.650 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.650 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.651 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.651 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.651 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.651 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.651 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.651 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.651 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.651 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.652 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.652 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.652 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.652 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.652 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.652 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.652 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.652 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.653 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.653 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.653 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.653 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.653 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.653 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.653 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.654 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.654 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.654 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.654 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.654 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.654 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.654 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.654 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.655 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.655 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.655 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.655 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.655 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.655 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.655 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.656 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.656 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.656 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost podman[237509]: 2025-12-06 09:51:26.656353473 +0000 UTC m=+0.178960771 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, architecture=x86_64, release=1755695350, vcs-type=git, name=ubi9-minimal, version=9.6, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.656 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.656 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.656 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.656 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.656 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.657 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.657 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.657 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.657 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.657 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.657 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.657 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.658 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.658 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.658 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.658 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.658 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.658 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.658 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.659 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.659 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.659 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.659 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.659 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.659 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.659 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.660 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.660 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.660 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.660 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.660 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.660 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.660 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.660 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.661 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.661 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.661 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.661 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.661 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.661 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.661 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.662 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.662 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ephemeral_storage_encryption.key_size = 512
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.662 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.662 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.662 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.662 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.662 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.662 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.663 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 
localhost nova_compute[237281]: 2025-12-06 09:51:26.663 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.663 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.663 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.663 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.663 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.663 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.663 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.664 
237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.664 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.664 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.664 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.664 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.664 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.664 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.665 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.service_name 
= None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.665 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.665 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.665 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.665 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.665 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.665 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.665 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.verify_glance_signatures = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.666 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.666 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.666 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.666 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.666 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.666 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.667 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.enable_remotefx = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.667 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.667 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.667 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.667 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.667 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.667 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.668 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.668 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.668 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.668 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.668 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.668 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.668 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.668 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] mks.enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.669 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.669 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.669 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.669 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.669 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.669 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost 
nova_compute[237281]: 2025-12-06 09:51:26.670 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.670 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.670 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.670 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.670 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.670 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.670 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.670 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.671 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.671 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.671 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.671 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.671 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.671 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.671 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.min_version = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.672 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.672 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.672 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.672 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.672 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.672 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.672 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.672 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.673 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.673 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.673 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.673 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.673 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.673 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 
09:51:26.673 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.674 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.674 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.674 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.674 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.674 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.674 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.674 237285 DEBUG 
oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.674 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.675 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.675 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.675 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.675 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.675 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.675 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - 
- - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.675 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.676 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.676 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.676 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.676 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.676 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.676 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican_service_user.collect_timing = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.676 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.676 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.677 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.677 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.677 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.677 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.677 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.677 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.678 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.678 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.678 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.678 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.678 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.678 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 
2025-12-06 09:51:26.678 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.678 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.679 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.679 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.679 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.679 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.679 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.679 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.679 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.680 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.680 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.680 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.680 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.680 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.680 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.max_version 
= None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.680 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.680 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.681 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.681 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.681 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.681 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.681 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.status_code_retry_delay = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.681 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.681 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.682 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.682 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.682 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.682 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.682 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 
04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.682 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.682 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.683 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.683 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.683 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.683 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.683 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 
localhost nova_compute[237281]: 2025-12-06 09:51:26.683 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.683 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.684 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.684 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.684 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.684 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.684 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.images_rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.684 237285 DEBUG 
oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.684 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.684 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.images_rbd_glance_store_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.685 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.images_rbd_pool = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.685 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.images_type = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.685 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.685 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.685 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.685 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.685 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.685 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.686 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.686 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.686 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.686 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.686 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.686 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.686 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.687 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.687 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.688 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.688 
237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.688 237285 WARNING oslo_config.cfg [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Dec 6 04:51:26 localhost nova_compute[237281]: live_migration_uri is deprecated for removal in favor of two other options that Dec 6 04:51:26 localhost nova_compute[237281]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Dec 6 04:51:26 localhost nova_compute[237281]: and ``live_migration_inbound_addr`` respectively. Dec 6 04:51:26 localhost nova_compute[237281]: ). Its value may be silently ignored in the future.#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.688 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_uri = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.688 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.688 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.689 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.689 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.689 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.689 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.689 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.689 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.689 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.690 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.690 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.690 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.690 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.690 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.690 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.690 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.691 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.691 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.rbd_secret_uuid = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.691 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.rbd_user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.691 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.691 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.691 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.691 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.692 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.692 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.692 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.692 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.692 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.692 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.692 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.693 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.693 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.693 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.693 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.693 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.693 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.693 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.694 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 
localhost nova_compute[237281]: 2025-12-06 09:51:26.694 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.694 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.694 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.694 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.694 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.694 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.695 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost 
nova_compute[237281]: 2025-12-06 09:51:26.695 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.695 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.695 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.695 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.695 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.695 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.695 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost 
nova_compute[237281]: 2025-12-06 09:51:26.696 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.696 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.696 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.696 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.696 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.696 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.696 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.697 237285 DEBUG 
oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.697 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.697 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.697 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.697 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.697 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.697 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.697 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - 
- - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.698 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.698 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.698 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.698 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.698 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.698 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.698 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.split_loggers = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.699 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.699 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.699 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.699 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.699 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.699 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.699 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] notifications.default_level = INFO log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.700 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.700 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.700 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.700 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.700 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.700 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.700 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.auth_section = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.701 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.701 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.auth_url = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.701 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.701 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.701 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.701 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.701 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.connect_retry_delay = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.701 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.702 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.702 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.702 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.702 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.702 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.702 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 
6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.702 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.702 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.703 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.703 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.703 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.703 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.703 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 
09:51:26.703 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.703 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.704 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.704 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.704 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.704 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.704 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.704 237285 DEBUG 
oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.704 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.704 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.705 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.705 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.705 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.705 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.705 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - 
- - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.706 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.706 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.706 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.706 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.706 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.706 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.706 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.instances = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.707 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.707 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.707 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.707 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.707 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.707 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.707 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 
2025-12-06 09:51:26.708 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.708 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.708 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.708 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.708 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.708 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.708 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 
localhost nova_compute[237281]: 2025-12-06 09:51:26.709 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.709 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.709 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.709 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.709 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.709 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.709 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] 
filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.710 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.710 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.710 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.710 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.710 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.710 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.710 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.710 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.711 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.711 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.711 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.711 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.711 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] 
filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.711 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.711 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.712 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.712 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.712 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.712 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.712 237285 DEBUG 
oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.712 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.712 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.712 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.713 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.713 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.713 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 
09:51:26.713 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.713 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.713 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.714 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.714 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.714 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.714 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 
09:51:26.714 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.714 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.715 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.715 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.715 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.715 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.715 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.715 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.715 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.715 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.716 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.716 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.716 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.716 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.716 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - 
- - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.716 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.716 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.717 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.717 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.717 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.717 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.717 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] upgrade_levels.baseapi = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.717 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.717 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.718 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.718 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.718 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.718 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.718 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.718 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.718 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.718 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.719 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.719 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.719 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.719 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.api_retry_count = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.719 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.719 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.719 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.720 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.720 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.720 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.720 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 
localhost nova_compute[237281]: 2025-12-06 09:51:26.720 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.720 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.720 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.720 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.721 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.721 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.721 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.721 237285 DEBUG 
oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.721 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.721 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.721 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.721 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.722 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.722 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.722 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.722 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.722 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.722 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.722 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.723 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vnc.novncproxy_base_url = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.723 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.723 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.723 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.723 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.723 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.723 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.724 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.724 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.724 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.724 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.724 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.724 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.724 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.725 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.725 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 
09:51:26.725 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.725 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.725 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.725 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.725 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.725 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.726 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.726 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.726 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.726 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.726 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.726 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.726 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.726 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.727 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.727 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.727 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.727 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.727 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.727 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.727 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.728 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.728 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.728 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.728 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.728 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.728 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.728 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - 
- - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.729 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.729 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.729 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.729 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.729 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.729 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.729 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 
- - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.729 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.730 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.730 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.730 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.730 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.730 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.730 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.730 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.731 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.731 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.731 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.731 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.731 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.731 237285 
DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.731 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.731 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.732 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.732 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.732 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.732 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.732 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.732 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.732 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.733 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.733 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.733 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.733 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - 
- - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.733 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.733 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.733 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.733 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.734 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.734 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.734 237285 DEBUG 
oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.734 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.734 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.734 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.734 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.735 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.735 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.735 237285 
DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.735 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.735 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.735 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.735 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.auth_url = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.735 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.736 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.736 
237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.736 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.736 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.736 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.736 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.736 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.736 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.737 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.737 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.737 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.737 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.737 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.737 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.737 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.738 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] 
oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.738 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.738 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.738 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.738 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.738 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.738 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.738 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.split_loggers = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.739 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.739 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.739 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.739 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.739 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.739 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.739 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.739 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.740 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.740 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.740 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.740 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.740 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.740 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] oslo_reports.log_dir = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.740 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.741 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.741 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.741 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.741 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.741 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.741 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.741 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.741 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.742 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.742 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.742 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.742 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.742 237285 DEBUG 
oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.742 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.742 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.743 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.743 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.743 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.743 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 
2025-12-06 09:51:26.743 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.743 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.743 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.743 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.744 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.744 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.744 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.744 237285 
DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.744 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.744 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.744 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.745 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.745 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.745 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.745 237285 DEBUG 
oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.745 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.745 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.745 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.745 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.746 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.746 237285 DEBUG oslo_service.service [None req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.746 237285 DEBUG oslo_service.service [None 
req-b39b1df3-f9fa-4b8d-ba96-c9511ee9bbd7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.746 237285 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.759 237285 INFO nova.virt.node [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Determined node identity db8b39ad-af52-43e3-99e2-f3c431f03241 from /var/lib/nova/compute_id#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.759 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.760 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.760 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.760 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.770 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Registering for lifecycle events 
_get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.772 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.773 237285 INFO nova.virt.libvirt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Connection event '1' reason 'None'#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.778 237285 INFO nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Libvirt host capabilities
[libvirt host capabilities XML elided: markup was stripped by the log capture. Recoverable values: host UUID 3134f11d-a070-482e-9899-7eb324eccfc9; arch x86_64; CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp and rdma; memory 16116612 KiB (4029153 pages); secmodel selinux, doi 0, baselabels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0; secmodel dac, doi 0, baselabels +107:+107; guest os_type hvm for i686 (wordsize 32) and x86_64 (wordsize 64) via emulator /usr/libexec/qemu-kvm, machine types pc-i440fx-rhel7.6.0 (canonical: pc) and pc-q35-rhel7.6.0 through pc-q35-rhel9.8.0 (canonical: q35)]
#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.783 237285 DEBUG nova.virt.libvirt.host [None
req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.784 237285 DEBUG nova.virt.libvirt.volume.mount [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.786 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
[libvirt domain capabilities XML elided: markup was stripped by the log capture. Recoverable values: path /usr/libexec/qemu-kvm; domain kvm; machine pc-q35-rhel9.8.0; arch i686; os loader /usr/share/OVMF/OVMF_CODE.secboot.fd with loader types rom and pflash, readonly yes/no, secure no; host CPU model EPYC-Rome, vendor AMD; listed CPU models include 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1 through Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1 through Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1 through Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2; model list truncated at end of capture]
Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v4 Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v1 Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v2 Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: GraniteRapids Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 
04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: GraniteRapids-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: GraniteRapids-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-noTSX Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 
localhost nova_compute[237281]: Haswell-noTSX-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 
04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-noTSX Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 
localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 
localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v5 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v6 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 
6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v7 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 
04:51:26 localhost nova_compute[237281]: IvyBridge Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: IvyBridge-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: IvyBridge-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: IvyBridge-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: KnightsMill Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: KnightsMill-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: 
Nehalem Dec 6 04:51:26 localhost nova_compute[237281]: Nehalem-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Nehalem-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Nehalem-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G1 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G1-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G2 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G2-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G3 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G3-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G4-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G5 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G5-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Penryn Dec 6 04:51:26 localhost nova_compute[237281]: Penryn-v1 Dec 6 04:51:26 localhost nova_compute[237281]: SandyBridge Dec 6 04:51:26 localhost nova_compute[237281]: SandyBridge-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: SandyBridge-v1 Dec 6 04:51:26 localhost nova_compute[237281]: SandyBridge-v2 Dec 6 04:51:26 
Dec 6 04:51:26 localhost nova_compute[237281]: [libvirt domain capabilities XML, continued; the XML markup was lost when the multi-line dump was split across syslog lines, leaving only element text between empty log prefixes. Recoverable values follow in original order; grouping is inferred from the libvirt domaincapabilities schema.]
Dec 6 04:51:26 localhost nova_compute[237281]: CPU models: SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Dec 6 04:51:26 localhost nova_compute[237281]: memory backing source types: file, anonymous, memfd
Dec 6 04:51:26 localhost nova_compute[237281]: disk device types: disk, cdrom, floppy, lun; buses: fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Dec 6 04:51:26 localhost nova_compute[237281]: graphics types: vnc, egl-headless, dbus
Dec 6 04:51:26 localhost nova_compute[237281]: hostdev mode: subsystem; startup policies: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi; models: virtio, virtio-transitional, virtio-non-transitional
Dec 6 04:51:26 localhost nova_compute[237281]: rng backend models: random, egd, builtin
Dec 6 04:51:26 localhost nova_compute[237281]: filesystem driver types: path, handle, virtiofs
Dec 6 04:51:26 localhost nova_compute[237281]: tpm models: tpm-tis, tpm-crb; backends: emulator, external; backend version: 2.0
Dec 6 04:51:26 localhost nova_compute[237281]: redirdev bus: usb; channel types: pty, unix
Dec 6 04:51:26 localhost nova_compute[237281]: crypto backends: qemu, builtin
Dec 6 04:51:26 localhost nova_compute[237281]: interface backends: default, passt
Dec 6 04:51:26 localhost nova_compute[237281]: panic models: isa, hyperv
Dec 6 04:51:26 localhost nova_compute[237281]: console/serial types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Dec 6 04:51:26 localhost nova_compute[237281]: hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset
Dec 6 04:51:26 localhost nova_compute[237281]: hyperv features (cont.): vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; associated values (context lost): 4095, on, off, off, Linux KVM Hv
Dec 6 04:51:26 localhost nova_compute[237281]: additional feature value: tdx
Dec 6 04:51:26 localhost nova_compute[237281]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.794 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 6 04:51:26 localhost nova_compute[237281]: [second domain capabilities XML, markup likewise lost; grouping inferred. Recoverable values: emulator path /usr/libexec/qemu-kvm; domain type kvm; machine pc-i440fx-rhel7.6.0; arch i686; loader /usr/share/OVMF/OVMF_CODE.secboot.fd, loader types rom, pflash; readonly: yes, no; secure: no; enum values on, off (twice); host CPU model EPYC-Rome, vendor AMD]
Dec 6 04:51:26 localhost nova_compute[237281]: CPU models (list continues past this excerpt): 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server
04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server-noTSX Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 
04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server-v5 Dec 6 
04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Conroe Dec 6 04:51:26 localhost nova_compute[237281]: Conroe-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Cooperlake Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cooperlake-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cooperlake-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Denverton Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Denverton-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Denverton-v2 Dec 6 04:51:26 localhost nova_compute[237281]: 
Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Denverton-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dhyana Dec 6 04:51:26 localhost nova_compute[237281]: Dhyana-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dhyana-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Genoa Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 
localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Genoa-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-IBPB Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Milan Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 
04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Milan-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Milan-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: 
Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v4 Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v1 Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v2 Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: GraniteRapids Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 
04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: GraniteRapids-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 
localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: GraniteRapids-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 
localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-noTSX Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-noTSX-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 
localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-noTSX Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 
Dec 6 04:51:26 localhost nova_compute[237281]: [libvirt domain-capabilities reply; XML markup lost in syslog forwarding. Recoverable values follow, in original order; field groupings inferred from the libvirt domcapabilities schema.]
Dec 6 04:51:26 localhost nova_compute[237281]: CPU models: Icelake-Server-v3, Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Dec 6 04:51:26 localhost nova_compute[237281]: memory backing source types: file, anonymous, memfd
Dec 6 04:51:26 localhost nova_compute[237281]: disk devices: disk, cdrom, floppy, lun; buses: ide, fdc, scsi, virtio, usb, sata; virtio models: virtio, virtio-transitional, virtio-non-transitional
Dec 6 04:51:26 localhost nova_compute[237281]: graphics types: vnc, egl-headless, dbus
Dec 6 04:51:26 localhost nova_compute[237281]: hostdev mode: subsystem; startupPolicy: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
Dec 6 04:51:26 localhost nova_compute[237281]: rng models: virtio, virtio-transitional, virtio-non-transitional; backends: random, egd, builtin
Dec 6 04:51:26 localhost nova_compute[237281]: filesystem driver types: path, handle, virtiofs
Dec 6 04:51:26 localhost nova_compute[237281]: tpm models: tpm-tis, tpm-crb; backends: emulator, external; backend version: 2.0
Dec 6 04:51:26 localhost nova_compute[237281]: redirdev bus: usb; channel types: pty, unix
Dec 6 04:51:26 localhost nova_compute[237281]: crypto: qemu, builtin; interface backends: default, passt; panic models: isa, hyperv
Dec 6 04:51:26 localhost nova_compute[237281]: console/serial types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: relaxed Dec 6 04:51:26 localhost nova_compute[237281]: vapic Dec 6 04:51:26 localhost nova_compute[237281]: spinlocks Dec 6 04:51:26 localhost nova_compute[237281]: vpindex Dec 6 04:51:26 localhost nova_compute[237281]: runtime Dec 6 04:51:26 localhost nova_compute[237281]: synic Dec 6 04:51:26 localhost nova_compute[237281]: stimer Dec 6 04:51:26 localhost nova_compute[237281]: reset Dec 6 04:51:26 localhost nova_compute[237281]: vendor_id Dec 6 04:51:26 localhost nova_compute[237281]: frequencies Dec 6 04:51:26 localhost nova_compute[237281]: reenlightenment Dec 6 04:51:26 localhost nova_compute[237281]: tlbflush Dec 6 04:51:26 localhost nova_compute[237281]: ipi Dec 6 04:51:26 localhost nova_compute[237281]: avic Dec 6 04:51:26 localhost nova_compute[237281]: emsr_bitmap Dec 6 04:51:26 localhost nova_compute[237281]: xmm_input Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: 4095 Dec 6 04:51:26 localhost nova_compute[237281]: on Dec 6 04:51:26 localhost nova_compute[237281]: off Dec 6 04:51:26 localhost nova_compute[237281]: off Dec 6 04:51:26 localhost nova_compute[237281]: Linux KVM Hv Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: tdx Dec 6 
04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.823 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.826 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: /usr/libexec/qemu-kvm Dec 6 04:51:26 localhost nova_compute[237281]: kvm Dec 6 04:51:26 localhost nova_compute[237281]: pc-q35-rhel9.8.0 Dec 6 04:51:26 localhost nova_compute[237281]: x86_64 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: efi Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd Dec 6 04:51:26 localhost nova_compute[237281]: /usr/share/edk2/ovmf/OVMF_CODE.fd Dec 6 04:51:26 localhost nova_compute[237281]: /usr/share/edk2/ovmf/OVMF.amdsev.fd Dec 6 04:51:26 localhost nova_compute[237281]: /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: rom Dec 6 04:51:26 localhost 
nova_compute[237281]: pflash Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: yes Dec 6 04:51:26 localhost nova_compute[237281]: no Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: yes Dec 6 04:51:26 localhost nova_compute[237281]: no Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: on Dec 6 04:51:26 localhost nova_compute[237281]: off Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: on Dec 6 04:51:26 localhost nova_compute[237281]: off Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome Dec 6 04:51:26 localhost nova_compute[237281]: AMD Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: 
Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: 486 Dec 6 04:51:26 localhost nova_compute[237281]: 486-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Broadwell Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Broadwell-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Broadwell-noTSX Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Broadwell-noTSX-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: 
Broadwell-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Broadwell-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Broadwell-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Broadwell-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: 
Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server-noTSX Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 
6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cascadelake-Server-v5 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 
04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Conroe Dec 6 04:51:26 localhost nova_compute[237281]: Conroe-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Cooperlake Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cooperlake-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Cooperlake-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Denverton Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Denverton-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Denverton-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Denverton-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dhyana Dec 6 04:51:26 localhost nova_compute[237281]: Dhyana-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dhyana-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Genoa Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Genoa-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-IBPB Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Milan Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Milan-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 
localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Milan-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v4 Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v1 Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v2 Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: GraniteRapids Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
Dec 6 04:51:26 localhost nova_compute[237281]: GraniteRapids-v1
Dec 6 04:51:26 localhost nova_compute[237281]: GraniteRapids-v2
Dec 6 04:51:26 localhost nova_compute[237281]: Haswell
Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-IBRS
Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-noTSX
Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-noTSX-IBRS
Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v1
Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v2
Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v3
Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v4
Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server
Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-noTSX
Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v1
Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v2
Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v3
Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v4
Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v5
Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v6
Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v7
Dec 6 04:51:26 localhost nova_compute[237281]: IvyBridge
Dec 6 04:51:26 localhost nova_compute[237281]: IvyBridge-IBRS
Dec 6 04:51:26 localhost nova_compute[237281]: IvyBridge-v1
Dec 6 04:51:26 localhost nova_compute[237281]: IvyBridge-v2
Dec 6 04:51:26 localhost nova_compute[237281]: KnightsMill
Dec 6 04:51:26 localhost nova_compute[237281]: KnightsMill-v1
Dec 6 04:51:26 localhost nova_compute[237281]: Nehalem
Dec 6 04:51:26 localhost nova_compute[237281]: Nehalem-IBRS
Dec 6 04:51:26 localhost nova_compute[237281]: Nehalem-v1
Dec 6 04:51:26 localhost nova_compute[237281]: Nehalem-v2
Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G1
Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G1-v1
Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G2
Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G2-v1
Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G3
Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G3-v1
Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G4
Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G4-v1
Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G5
Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G5-v1
Dec 6 04:51:26 localhost nova_compute[237281]: Penryn
Dec 6 04:51:26 localhost nova_compute[237281]: Penryn-v1
Dec 6 04:51:26 localhost nova_compute[237281]: SandyBridge
Dec 6 04:51:26 localhost nova_compute[237281]: SandyBridge-IBRS
Dec 6 04:51:26 localhost nova_compute[237281]: SandyBridge-v1
Dec 6 04:51:26 localhost nova_compute[237281]: SandyBridge-v2
Dec 6 04:51:26 localhost nova_compute[237281]: SapphireRapids
Dec 6 04:51:26 localhost nova_compute[237281]: SapphireRapids-v1
Dec 6 04:51:26 localhost nova_compute[237281]: SapphireRapids-v2
Dec 6 04:51:26 localhost nova_compute[237281]: SapphireRapids-v3
Dec 6 04:51:26 localhost nova_compute[237281]: SierraForest
Dec 6 04:51:26 localhost nova_compute[237281]: SierraForest-v1
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Client
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Client-IBRS
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Client-noTSX-IBRS
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Client-v1
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Client-v2
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Client-v3
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Client-v4
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Server
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Server-IBRS
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Server-noTSX-IBRS
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Server-v1
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Server-v2
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Server-v3
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Server-v4
Dec 6 04:51:26 localhost nova_compute[237281]: Skylake-Server-v5
Dec 6 04:51:26 localhost nova_compute[237281]: Snowridge
Dec 6 04:51:26 localhost nova_compute[237281]: Snowridge-v1
04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Snowridge-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Snowridge-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Snowridge-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Westmere Dec 6 04:51:26 localhost nova_compute[237281]: Westmere-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Westmere-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Westmere-v2 Dec 6 04:51:26 localhost nova_compute[237281]: athlon Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: athlon-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: core2duo Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: core2duo-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: coreduo Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: coreduo-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: kvm32 Dec 6 04:51:26 localhost nova_compute[237281]: kvm32-v1 Dec 6 04:51:26 localhost nova_compute[237281]: kvm64 Dec 6 04:51:26 localhost nova_compute[237281]: kvm64-v1 Dec 6 04:51:26 localhost nova_compute[237281]: n270 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: n270-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: pentium Dec 6 04:51:26 localhost nova_compute[237281]: pentium-v1 Dec 6 04:51:26 localhost nova_compute[237281]: pentium2 Dec 6 04:51:26 localhost nova_compute[237281]: pentium2-v1 Dec 6 04:51:26 localhost nova_compute[237281]: pentium3 Dec 6 04:51:26 localhost nova_compute[237281]: pentium3-v1 Dec 6 04:51:26 localhost 
nova_compute[237281]: phenom Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: phenom-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: qemu32 Dec 6 04:51:26 localhost nova_compute[237281]: qemu32-v1 Dec 6 04:51:26 localhost nova_compute[237281]: qemu64 Dec 6 04:51:26 localhost nova_compute[237281]: qemu64-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: file Dec 6 04:51:26 localhost nova_compute[237281]: anonymous Dec 6 04:51:26 localhost nova_compute[237281]: memfd Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: disk Dec 6 04:51:26 localhost nova_compute[237281]: cdrom Dec 6 04:51:26 localhost nova_compute[237281]: floppy Dec 6 04:51:26 localhost nova_compute[237281]: lun Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: fdc Dec 6 04:51:26 localhost nova_compute[237281]: scsi Dec 6 04:51:26 localhost nova_compute[237281]: virtio Dec 6 04:51:26 localhost nova_compute[237281]: usb Dec 6 04:51:26 localhost nova_compute[237281]: sata Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: virtio Dec 6 04:51:26 
localhost nova_compute[237281]: virtio-transitional Dec 6 04:51:26 localhost nova_compute[237281]: virtio-non-transitional Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: vnc Dec 6 04:51:26 localhost nova_compute[237281]: egl-headless Dec 6 04:51:26 localhost nova_compute[237281]: dbus Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: subsystem Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: default Dec 6 04:51:26 localhost nova_compute[237281]: mandatory Dec 6 04:51:26 localhost nova_compute[237281]: requisite Dec 6 04:51:26 localhost nova_compute[237281]: optional Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: usb Dec 6 04:51:26 localhost nova_compute[237281]: pci Dec 6 04:51:26 localhost nova_compute[237281]: scsi Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: virtio Dec 6 04:51:26 localhost nova_compute[237281]: virtio-transitional Dec 6 04:51:26 localhost nova_compute[237281]: virtio-non-transitional Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: random Dec 6 04:51:26 localhost 
nova_compute[237281]: egd Dec 6 04:51:26 localhost nova_compute[237281]: builtin Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: path Dec 6 04:51:26 localhost nova_compute[237281]: handle Dec 6 04:51:26 localhost nova_compute[237281]: virtiofs Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: tpm-tis Dec 6 04:51:26 localhost nova_compute[237281]: tpm-crb Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: emulator Dec 6 04:51:26 localhost nova_compute[237281]: external Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: 2.0 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: usb Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: pty Dec 6 04:51:26 localhost nova_compute[237281]: unix Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: qemu Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 
04:51:26 localhost nova_compute[237281]: builtin Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: default Dec 6 04:51:26 localhost nova_compute[237281]: passt Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: isa Dec 6 04:51:26 localhost nova_compute[237281]: hyperv Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: null Dec 6 04:51:26 localhost nova_compute[237281]: vc Dec 6 04:51:26 localhost nova_compute[237281]: pty Dec 6 04:51:26 localhost nova_compute[237281]: dev Dec 6 04:51:26 localhost nova_compute[237281]: file Dec 6 04:51:26 localhost nova_compute[237281]: pipe Dec 6 04:51:26 localhost nova_compute[237281]: stdio Dec 6 04:51:26 localhost nova_compute[237281]: udp Dec 6 04:51:26 localhost nova_compute[237281]: tcp Dec 6 04:51:26 localhost nova_compute[237281]: unix Dec 6 04:51:26 localhost nova_compute[237281]: qemu-vdagent Dec 6 04:51:26 localhost nova_compute[237281]: dbus Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: relaxed Dec 6 04:51:26 localhost nova_compute[237281]: vapic Dec 6 04:51:26 localhost nova_compute[237281]: spinlocks Dec 6 04:51:26 localhost nova_compute[237281]: vpindex Dec 6 04:51:26 localhost nova_compute[237281]: runtime Dec 6 04:51:26 localhost nova_compute[237281]: synic Dec 6 04:51:26 localhost nova_compute[237281]: stimer Dec 6 04:51:26 localhost nova_compute[237281]: reset Dec 6 04:51:26 localhost nova_compute[237281]: vendor_id Dec 6 04:51:26 localhost nova_compute[237281]: frequencies Dec 6 04:51:26 localhost nova_compute[237281]: reenlightenment Dec 6 04:51:26 localhost nova_compute[237281]: tlbflush Dec 6 04:51:26 localhost nova_compute[237281]: ipi Dec 6 04:51:26 localhost nova_compute[237281]: avic Dec 6 04:51:26 localhost nova_compute[237281]: emsr_bitmap Dec 6 04:51:26 localhost nova_compute[237281]: xmm_input Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: 4095 Dec 6 04:51:26 localhost nova_compute[237281]: on Dec 6 04:51:26 localhost nova_compute[237281]: off Dec 6 04:51:26 localhost nova_compute[237281]: off Dec 6 04:51:26 localhost nova_compute[237281]: Linux KVM Hv Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: tdx Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Dec 6 04:51:26 
Dec 6 04:51:26 localhost nova_compute[237281]: 2025-12-06 09:51:26.880 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 6 04:51:26 localhost nova_compute[237281]: (domain capabilities XML; element values only, markup lost in logging) emulator: /usr/libexec/qemu-kvm; domain type: kvm; machine: pc-i440fx-rhel7.6.0; arch: x86_64
Dec 6 04:51:26 localhost nova_compute[237281]: loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; types: rom pflash; readonly: yes no; secure: no; enum toggles: on off, on off
Dec 6 04:51:26 localhost nova_compute[237281]: host CPU model: EPYC-Rome, vendor: AMD
Dec 6 04:51:26 localhost nova_compute[237281]: supported CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3 Cascadelake-Server-v4 Cascadelake-Server-v5 Conroe Conroe-v1 Cooperlake Cooperlake-v1 Cooperlake-v2 Denverton Denverton-v1 Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Rome (list continues)
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-Rome-v4 Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v1 Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v2 Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: EPYC-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: GraniteRapids Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: 
Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: GraniteRapids-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 
04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: GraniteRapids-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 
localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-noTSX Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-noTSX-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Haswell-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: 
Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-noTSX Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v3 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 
localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v5 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 
6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v6 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 
04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Icelake-Server-v7 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: IvyBridge Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: IvyBridge-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: IvyBridge-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 
localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: IvyBridge-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: KnightsMill Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: KnightsMill-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Nehalem Dec 6 04:51:26 localhost nova_compute[237281]: Nehalem-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: Nehalem-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Nehalem-v2 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G1 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G1-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G2 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G2-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G3 Dec 6 04:51:26 localhost 
nova_compute[237281]: Opteron_G3-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G4 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G4-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G5 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Opteron_G5-v1 Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Penryn Dec 6 04:51:26 localhost nova_compute[237281]: Penryn-v1 Dec 6 04:51:26 localhost nova_compute[237281]: SandyBridge Dec 6 04:51:26 localhost nova_compute[237281]: SandyBridge-IBRS Dec 6 04:51:26 localhost nova_compute[237281]: SandyBridge-v1 Dec 6 04:51:26 localhost nova_compute[237281]: SandyBridge-v2 Dec 6 04:51:26 localhost nova_compute[237281]: SapphireRapids Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost nova_compute[237281]: Dec 6 04:51:26 localhost 
Dec 6 04:51:26 localhost nova_compute[237281]: [libvirt domain capabilities XML, continued; the XML tags were stripped during log capture, so only the text values survive — summarized below]
Dec 6 04:51:26 localhost nova_compute[237281]: CPU models (cont.): SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Dec 6 04:51:27 localhost nova_compute[237281]: memory backing source types: file, anonymous, memfd
Dec 6 04:51:27 localhost nova_compute[237281]: disk device types: disk, cdrom, floppy, lun; buses: ide, fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Dec 6 04:51:27 localhost nova_compute[237281]: graphics types: vnc, egl-headless, dbus
Dec 6 04:51:27 localhost nova_compute[237281]: hostdev mode: subsystem; startup policies: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
Dec 6 04:51:27 localhost nova_compute[237281]: interface models: virtio, virtio-transitional, virtio-non-transitional; rng models: random, egd, builtin
Dec 6 04:51:27 localhost nova_compute[237281]: filesystem driver types: path, handle, virtiofs
Dec 6 04:51:27 localhost nova_compute[237281]: tpm models: tpm-tis, tpm-crb; backends: emulator, external; version: 2.0
Dec 6 04:51:27 localhost nova_compute[237281]: redirdev bus: usb; channel types: pty, unix; crypto backends: qemu, builtin; network backends: default, passt; panic models: isa, hyperv
Dec 6 04:51:27 localhost nova_compute[237281]: console/serial types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Dec 6 04:51:27 localhost nova_compute[237281]: hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; associated values: 4095, on, off, off, Linux KVM Hv
Dec 6 04:51:27 localhost nova_compute[237281]: launchSecurity: tdx
Dec 6 04:51:27 localhost nova_compute[237281]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:26.937 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:26.938 237285 INFO nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Secure Boot support detected
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:26.939 237285 INFO nova.virt.libvirt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:26.947 237285 DEBUG nova.virt.libvirt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:26.975 237285 INFO nova.virt.node [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Determined node identity db8b39ad-af52-43e3-99e2-f3c431f03241 from /var/lib/nova/compute_id
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:26.987 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Verified node db8b39ad-af52-43e3-99e2-f3c431f03241 matches my host np0005548798.ooo.test _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.011 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.014 237285 DEBUG nova.virt.libvirt.vif [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T08:38:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005548798.ooo.test',hostname='test',id=2,image_ref='c6562616-bf77-48e6-bb05-431e64af083a',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-06T08:38:42Z,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005548798.ooo.test',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='47835b89168945138751a4b216280589',ramdisk_id='',reservation_id='r-h8mij0z5',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-12-06T08:38:43Z,user_data=None,user_id='5220ceda9e4145d395f52fc9fd0365c0',uuid=a5070ada-6b60-4992-a1bf-9e83aaccac93,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.014 237285 DEBUG nova.network.os_vif_util [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Converting VIF {"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.014 237285 DEBUG nova.network.os_vif_util [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.015 237285 DEBUG os_vif [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.039 237285 DEBUG ovsdbapp.backend.ovs_idl [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.039 237285 DEBUG ovsdbapp.backend.ovs_idl [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.039 237285 DEBUG ovsdbapp.backend.ovs_idl [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
2025-12-06 09:51:27.039 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.040 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.040 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.040 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.041 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.044 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.055 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.056 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, 
name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.056 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.057 237285 INFO oslo.privsep.daemon [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpkom167l7/privsep.sock']#033[00m Dec 6 04:51:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14389 DF PROTO=TCP SPT=50264 DPT=9102 SEQ=2109830551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC10D870000000001030307) Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.653 237285 INFO oslo.privsep.daemon [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.542 237554 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.547 237554 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.551 237554 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Dec 6 04:51:27 
localhost nova_compute[237281]: 2025-12-06 09:51:27.551 237554 INFO oslo.privsep.daemon [-] privsep daemon running as pid 237554#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.930 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.931 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap227fe5b2-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.931 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap227fe5b2-a5, col_values=(('external_ids', {'iface-id': '227fe5b2-a5a7-4043-b641-32b6e7c7a7c1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:02:64', 'vm-uuid': 'a5070ada-6b60-4992-a1bf-9e83aaccac93'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.932 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.933 237285 INFO os_vif [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5')#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.934 
237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.937 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Dec 6 04:51:27 localhost nova_compute[237281]: 2025-12-06 09:51:27.938 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.024 237285 DEBUG oslo_concurrency.lockutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.024 237285 DEBUG oslo_concurrency.lockutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.025 237285 DEBUG oslo_concurrency.lockutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:51:28 localhost 
nova_compute[237281]: 2025-12-06 09:51:28.025 237285 DEBUG nova.compute.resource_tracker [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:51:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17655 DF PROTO=TCP SPT=54084 DPT=9102 SEQ=3369364799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC111870000000001030307) Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.144 237285 DEBUG oslo_concurrency.processutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.215 237285 DEBUG oslo_concurrency.processutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.216 237285 DEBUG oslo_concurrency.processutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.290 237285 DEBUG oslo_concurrency.processutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.291 237285 DEBUG oslo_concurrency.processutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.360 237285 DEBUG oslo_concurrency.processutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.361 237285 DEBUG oslo_concurrency.processutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.431 237285 DEBUG oslo_concurrency.processutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:51:28 localhost podman[237570]: 2025-12-06 09:51:28.544883455 +0000 UTC m=+0.078865000 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:51:28 localhost podman[237570]: 2025-12-06 09:51:28.579202615 +0000 UTC m=+0.113184160 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:51:28 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.628 237285 WARNING nova.virt.libvirt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.630 237285 DEBUG nova.compute.resource_tracker [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12622MB free_disk=387.3064384460449GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.630 237285 DEBUG oslo_concurrency.lockutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.630 237285 DEBUG oslo_concurrency.lockutils [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.795 237285 DEBUG 
nova.compute.resource_tracker [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.795 237285 DEBUG nova.compute.resource_tracker [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.795 237285 DEBUG nova.compute.resource_tracker [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.905 237285 DEBUG nova.scheduler.client.report [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Refreshing inventories for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.928 237285 DEBUG nova.scheduler.client.report [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Updating ProviderTree inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.929 237285 DEBUG nova.compute.provider_tree [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.947 237285 DEBUG nova.scheduler.client.report [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Refreshing aggregate associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 6 04:51:28 localhost nova_compute[237281]: 2025-12-06 09:51:28.970 237285 DEBUG nova.scheduler.client.report [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Refreshing trait associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, traits: 
COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_ABM,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 6 04:51:29 localhost nova_compute[237281]: 2025-12-06 09:51:29.014 237285 DEBUG nova.virt.libvirt.host [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Dec 6 04:51:29 localhost nova_compute[237281]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Dec 6 04:51:29 localhost nova_compute[237281]: 2025-12-06 09:51:29.015 237285 INFO nova.virt.libvirt.host [None 
req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] kernel doesn't support AMD SEV#033[00m Dec 6 04:51:29 localhost nova_compute[237281]: 2025-12-06 09:51:29.016 237285 DEBUG nova.compute.provider_tree [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:51:29 localhost nova_compute[237281]: 2025-12-06 09:51:29.017 237285 DEBUG nova.virt.libvirt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Dec 6 04:51:29 localhost nova_compute[237281]: 2025-12-06 09:51:29.039 237285 DEBUG nova.scheduler.client.report [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:51:29 localhost nova_compute[237281]: 2025-12-06 09:51:29.063 237285 DEBUG nova.compute.resource_tracker [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:51:29 localhost nova_compute[237281]: 2025-12-06 09:51:29.064 237285 DEBUG oslo_concurrency.lockutils [None 
req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.433s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:51:29 localhost nova_compute[237281]: 2025-12-06 09:51:29.064 237285 DEBUG nova.service [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Dec 6 04:51:29 localhost nova_compute[237281]: 2025-12-06 09:51:29.115 237285 DEBUG nova.service [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Dec 6 04:51:29 localhost nova_compute[237281]: 2025-12-06 09:51:29.116 237285 DEBUG nova.servicegroup.drivers.db [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] DB_Driver: join new ServiceGroup member np0005548798.ooo.test to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Dec 6 04:51:30 localhost nova_compute[237281]: 2025-12-06 09:51:30.880 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14390 DF PROTO=TCP SPT=50264 DPT=9102 SEQ=2109830551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC11D470000000001030307) Dec 6 04:51:32 localhost nova_compute[237281]: 2025-12-06 09:51:32.044 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:35 localhost nova_compute[237281]: 2025-12-06 09:51:35.885 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:37 localhost nova_compute[237281]: 2025-12-06 09:51:37.047 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:51:38 localhost podman[237593]: 2025-12-06 09:51:38.543109898 +0000 UTC m=+0.078748976 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Dec 6 04:51:38 localhost podman[237593]: 2025-12-06 09:51:38.611352997 +0000 UTC m=+0.146992115 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:51:38 localhost systemd[1]: 
da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:51:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14391 DF PROTO=TCP SPT=50264 DPT=9102 SEQ=2109830551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC13D880000000001030307) Dec 6 04:51:40 localhost nova_compute[237281]: 2025-12-06 09:51:40.886 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:42 localhost nova_compute[237281]: 2025-12-06 09:51:42.050 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:51:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:51:42 localhost podman[237620]: 2025-12-06 09:51:42.550679651 +0000 UTC m=+0.080928875 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:51:42 localhost podman[237620]: 2025-12-06 09:51:42.562162298 +0000 UTC m=+0.092411502 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 04:51:42 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:51:42 localhost podman[237619]: 2025-12-06 09:51:42.65167949 +0000 UTC m=+0.184907517 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:51:42 localhost podman[237619]: 2025-12-06 09:51:42.661084212 +0000 UTC m=+0.194312259 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 
'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:51:42 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:51:45 localhost nova_compute[237281]: 2025-12-06 09:51:45.890 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:46 localhost openstack_network_exporter[199751]: ERROR 09:51:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:51:46 localhost openstack_network_exporter[199751]: ERROR 09:51:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:51:46 localhost openstack_network_exporter[199751]: ERROR 09:51:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:51:46 localhost openstack_network_exporter[199751]: ERROR 09:51:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:51:46 localhost openstack_network_exporter[199751]: Dec 6 04:51:46 localhost openstack_network_exporter[199751]: ERROR 09:51:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:51:46 localhost openstack_network_exporter[199751]: Dec 6 04:51:47 localhost nova_compute[237281]: 
2025-12-06 09:51:47.052 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:51:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:51:47 localhost nova_compute[237281]: 2025-12-06 09:51:47.489 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:47 localhost ovn_metadata_agent[137254]: 2025-12-06 09:51:47.490 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 04:51:47 localhost ovn_metadata_agent[137254]: 2025-12-06 09:51:47.492 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 04:51:47 localhost podman[237663]: 2025-12-06 09:51:47.56244591 +0000 UTC m=+0.085940591 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 04:51:47 localhost podman[237663]: 2025-12-06 09:51:47.597569914 +0000 UTC m=+0.121064575 container exec_died 
34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:51:47 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:51:47 localhost podman[237664]: 2025-12-06 09:51:47.615890546 +0000 UTC m=+0.135746154 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:51:47 localhost podman[237664]: 2025-12-06 09:51:47.626083043 +0000 UTC m=+0.145938651 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:51:47 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:51:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:51:49.495 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:51:50 localhost nova_compute[237281]: 2025-12-06 09:51:50.893 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:52 localhost nova_compute[237281]: 2025-12-06 09:51:52.054 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:53 localhost podman[197801]: time="2025-12-06T09:51:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:51:53 localhost podman[197801]: @ - - [06/Dec/2025:09:51:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 143413 "" "Go-http-client/1.1" Dec 6 04:51:53 localhost podman[197801]: @ - - [06/Dec/2025:09:51:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15443 "" "Go-http-client/1.1" Dec 6 04:51:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54461 DF PROTO=TCP SPT=39484 DPT=9102 SEQ=3030395745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC1769A0000000001030307) Dec 6 04:51:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54462 DF PROTO=TCP SPT=39484 DPT=9102 SEQ=3030395745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC17A870000000001030307) Dec 6 04:51:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14392 DF PROTO=TCP SPT=50264 DPT=9102 SEQ=2109830551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC17D870000000001030307) Dec 6 04:51:55 localhost nova_compute[237281]: 2025-12-06 09:51:55.896 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54463 DF PROTO=TCP SPT=39484 DPT=9102 SEQ=3030395745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC182870000000001030307) Dec 6 04:51:57 localhost nova_compute[237281]: 2025-12-06 09:51:57.056 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:51:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:51:57 localhost podman[237701]: 2025-12-06 09:51:57.554510255 +0000 UTC m=+0.083845457 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, architecture=x86_64, version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible) Dec 6 04:51:57 localhost podman[237701]: 2025-12-06 09:51:57.564936603 +0000 UTC m=+0.094271805 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 6 04:51:57 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:51:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10251 DF PROTO=TCP SPT=39214 DPT=9102 SEQ=54658559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC185880000000001030307) Dec 6 04:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:51:59 localhost podman[237723]: 2025-12-06 09:51:59.542265206 +0000 UTC m=+0.079892726 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 04:51:59 localhost podman[237723]: 2025-12-06 09:51:59.547833986 +0000 UTC m=+0.085461496 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 04:51:59 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:52:00 localhost nova_compute[237281]: 2025-12-06 09:52:00.899 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54464 DF PROTO=TCP SPT=39484 DPT=9102 SEQ=3030395745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC192470000000001030307) Dec 6 04:52:02 localhost nova_compute[237281]: 2025-12-06 09:52:02.058 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:02 localhost nova_compute[237281]: 2025-12-06 09:52:02.557 237285 DEBUG nova.compute.manager [None req-9b742586-e8c0-418a-985c-f0b984ab13ca 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:52:02 localhost nova_compute[237281]: 2025-12-06 09:52:02.562 237285 INFO nova.compute.manager [None req-9b742586-e8c0-418a-985c-f0b984ab13ca 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Retrieving diagnostics#033[00m Dec 6 04:52:05 localhost nova_compute[237281]: 2025-12-06 09:52:05.902 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:06 localhost nova_compute[237281]: 2025-12-06 09:52:06.119 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 
04:52:06 localhost nova_compute[237281]: 2025-12-06 09:52:06.142 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Triggering sync for uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Dec 6 04:52:06 localhost nova_compute[237281]: 2025-12-06 09:52:06.143 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:52:06 localhost nova_compute[237281]: 2025-12-06 09:52:06.144 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:52:06 localhost nova_compute[237281]: 2025-12-06 09:52:06.144 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:52:06 localhost nova_compute[237281]: 2025-12-06 09:52:06.178 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:52:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:06.672 137259 DEBUG 
oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:52:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:06.673 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:52:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:06.674 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:52:07 localhost nova_compute[237281]: 2025-12-06 09:52:07.060 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:52:09 localhost podman[237748]: 2025-12-06 09:52:09.548102005 +0000 UTC m=+0.079981588 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 6 04:52:09 localhost podman[237748]: 2025-12-06 09:52:09.588246463 +0000 UTC m=+0.120126036 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:52:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54465 DF PROTO=TCP SPT=39484 DPT=9102 SEQ=3030395745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC1B3880000000001030307) Dec 6 04:52:09 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:52:10 localhost nova_compute[237281]: 2025-12-06 09:52:10.750 237285 DEBUG oslo_concurrency.lockutils [None req-16275b01-25eb-4ec4-8847-997847dd26b8 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Acquiring lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:52:10 localhost nova_compute[237281]: 2025-12-06 09:52:10.751 237285 DEBUG oslo_concurrency.lockutils [None req-16275b01-25eb-4ec4-8847-997847dd26b8 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" acquired by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:52:10 localhost nova_compute[237281]: 2025-12-06 09:52:10.751 237285 DEBUG nova.compute.manager [None req-16275b01-25eb-4ec4-8847-997847dd26b8 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:52:10 localhost nova_compute[237281]: 2025-12-06 09:52:10.757 237285 DEBUG nova.compute.manager [None req-16275b01-25eb-4ec4-8847-997847dd26b8 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m Dec 6 04:52:10 localhost nova_compute[237281]: 2025-12-06 09:52:10.761 237285 DEBUG nova.objects.instance [None req-16275b01-25eb-4ec4-8847-997847dd26b8 
5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Lazy-loading 'flavor' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:52:10 localhost nova_compute[237281]: 2025-12-06 09:52:10.801 237285 DEBUG nova.virt.libvirt.driver [None req-16275b01-25eb-4ec4-8847-997847dd26b8 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m Dec 6 04:52:10 localhost nova_compute[237281]: 2025-12-06 09:52:10.904 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:12 localhost nova_compute[237281]: 2025-12-06 09:52:12.062 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:13 localhost kernel: device tap227fe5b2-a5 left promiscuous mode Dec 6 04:52:13 localhost NetworkManager[5965]: [1765014733.1960] device (tap227fe5b2-a5): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Dec 6 04:52:13 localhost ovn_controller[131684]: 2025-12-06T09:52:13Z|00048|binding|INFO|Releasing lport 227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 from this chassis (sb_readonly=0) Dec 6 04:52:13 localhost ovn_controller[131684]: 2025-12-06T09:52:13Z|00049|binding|INFO|Setting lport 227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 down in Southbound Dec 6 04:52:13 localhost ovn_controller[131684]: 2025-12-06T09:52:13Z|00050|binding|INFO|Removing iface tap227fe5b2-a5 ovn-installed in OVS Dec 6 04:52:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. 
Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.242 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:52:13 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully. Dec 6 04:52:13 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 3min 52.105s CPU time. Dec 6 04:52:13 localhost ovn_controller[131684]: 2025-12-06T09:52:13Z|00051|ovn_bfd|INFO|Disabled BFD on interface ovn-d3c7df-0 Dec 6 04:52:13 localhost ovn_controller[131684]: 2025-12-06T09:52:13Z|00052|ovn_bfd|INFO|Disabled BFD on interface ovn-ded858-0 Dec 6 04:52:13 localhost ovn_controller[131684]: 2025-12-06T09:52:13Z|00053|ovn_bfd|INFO|Disabled BFD on interface ovn-719bf6-0 Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.254 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:13 localhost ovn_controller[131684]: 2025-12-06T09:52:13Z|00054|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:52:13 localhost systemd-machined[68273]: Machine qemu-1-instance-00000002 terminated. 
Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.253 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:02:64 192.168.0.189'], port_security=['fa:16:3e:91:02:64 192.168.0.189'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.189/24', 'neutron:device_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005548798.ooo.test', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '47835b89168945138751a4b216280589', 'neutron:revision_number': '7', 'neutron:security_group_ids': '2bda9e92-c0a1-4c1d-90ae-f2e7495954f8 db4a6c1e-fda3-423f-866c-b4772bef83b6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=66aef1d5-ef14-49e3-b4b5-f1e89f0f9ee8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.257 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 in datapath 20509a6a-c438-4c5e-82a7-fe0ea272b309 unbound from our chassis#033[00m Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.261 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid 
VIF ports were found for network 20509a6a-c438-4c5e-82a7-fe0ea272b309, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.260 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.262 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.268 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[9ced868e-dba7-4ac0-9005-aecfeb89cb09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.270 137259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309 namespace which is not needed anymore#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.272 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:13 localhost ovn_controller[131684]: 2025-12-06T09:52:13Z|00055|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.301 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:13 localhost podman[237783]: 2025-12-06 09:52:13.344421765 +0000 UTC m=+0.091732668 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, 
health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 04:52:13 localhost podman[237782]: 2025-12-06 09:52:13.389657159 +0000 UTC m=+0.134683792 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, 
config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:52:13 localhost podman[237783]: 2025-12-06 09:52:13.423796814 +0000 UTC m=+0.171107687 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 04:52:13 localhost systemd[1]: libpod-750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565.scope: Deactivated successfully. Dec 6 04:52:13 localhost podman[237831]: 2025-12-06 09:52:13.432579484 +0000 UTC m=+0.059859344 container died 750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, 
maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 6 04:52:13 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:52:13 localhost podman[237782]: 2025-12-06 09:52:13.474215147 +0000 UTC m=+0.219241740 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.551 237285 DEBUG 
nova.compute.manager [req-c9a58364-1111-48b9-8ce2-f761045e945c req-90ee36cc-796a-4eb8-b4e7-5a4b051cfb14 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Received event network-vif-unplugged-227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.552 237285 DEBUG oslo_concurrency.lockutils [req-c9a58364-1111-48b9-8ce2-f761045e945c req-90ee36cc-796a-4eb8-b4e7-5a4b051cfb14 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "a5070ada-6b60-4992-a1bf-9e83aaccac93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.552 237285 DEBUG oslo_concurrency.lockutils [req-c9a58364-1111-48b9-8ce2-f761045e945c req-90ee36cc-796a-4eb8-b4e7-5a4b051cfb14 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.552 237285 DEBUG oslo_concurrency.lockutils [req-c9a58364-1111-48b9-8ce2-f761045e945c req-90ee36cc-796a-4eb8-b4e7-5a4b051cfb14 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.553 
237285 DEBUG nova.compute.manager [req-c9a58364-1111-48b9-8ce2-f761045e945c req-90ee36cc-796a-4eb8-b4e7-5a4b051cfb14 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] No waiting events found dispatching network-vif-unplugged-227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.553 237285 WARNING nova.compute.manager [req-c9a58364-1111-48b9-8ce2-f761045e945c req-90ee36cc-796a-4eb8-b4e7-5a4b051cfb14 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Received unexpected event network-vif-unplugged-227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 for instance with vm_state active and task_state powering-off.#033[00m Dec 6 04:52:13 localhost podman[237831]: 2025-12-06 09:52:13.585976086 +0000 UTC m=+0.213255936 container cleanup 750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=) Dec 6 04:52:13 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:52:13 localhost podman[237869]: 2025-12-06 09:52:13.606452354 +0000 UTC m=+0.159715358 container cleanup 750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, batch=17.1_20251118.1) Dec 6 04:52:13 localhost systemd[1]: libpod-conmon-750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565.scope: Deactivated successfully. Dec 6 04:52:13 localhost podman[237892]: 2025-12-06 09:52:13.67203972 +0000 UTC m=+0.066248988 container remove 750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, 
distribution-scope=public, tcib_managed=true, build-date=2025-11-19T00:14:25Z) Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.676 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[69c8d7b8-2e76-413f-b505-45763ea2662a]: (4, ('Sat Dec 6 09:52:13 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309 (750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565)\n750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565\nSat Dec 6 09:52:13 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309 (750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565)\n750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.678 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[915f062b-d04d-42eb-83d5-be172a93cb5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.680 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20509a6a-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.682 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:13 localhost kernel: device tap20509a6a-c0 left promiscuous mode Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.695 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.697 
137360 DEBUG oslo.privsep.daemon [-] privsep: reply[8807d3d8-0318-416f-8bbc-2aa8a87dce04]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.713 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c70cefa3-b7bc-4d84-8615-dac0a33d8f45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.714 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[23d6c805-cf6f-4c61-82f2-3f2cb35d60fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.727 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[ee23a342-c3e7-41a1-85fd-63940795d1b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 
0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 670913, 'reachable_time': 15516, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 
'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237913, 'error': None, 'target': 'ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.735 137391 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309 deleted. 
remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Dec 6 04:52:13 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:13.736 137391 DEBUG oslo.privsep.daemon [-] privsep: reply[9d67625d-9b56-45fa-a717-eb9240c63cbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.819 237285 INFO nova.virt.libvirt.driver [None req-16275b01-25eb-4ec4-8847-997847dd26b8 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Instance shutdown successfully after 3 seconds.#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.826 237285 INFO nova.virt.libvirt.driver [-] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Instance destroyed successfully.#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.827 237285 DEBUG nova.objects.instance [None req-16275b01-25eb-4ec4-8847-997847dd26b8 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Lazy-loading 'numa_topology' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.849 237285 DEBUG nova.compute.manager [None req-16275b01-25eb-4ec4-8847-997847dd26b8 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:52:13 localhost nova_compute[237281]: 2025-12-06 09:52:13.950 237285 DEBUG oslo_concurrency.lockutils [None req-16275b01-25eb-4ec4-8847-997847dd26b8 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" 
"released" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: held 3.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:52:14 localhost systemd[1]: var-lib-containers-storage-overlay-e74f8bca4fd35e1e90cba34b801511904a738e55eeed6e861124d8e25abc2790-merged.mount: Deactivated successfully. Dec 6 04:52:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-750555ee41f5e37dfbcb66722f031d507723d9ae0d58a137f4e4dc1b152fd565-userdata-shm.mount: Deactivated successfully. Dec 6 04:52:14 localhost systemd[1]: run-netns-ovnmeta\x2d20509a6a\x2dc438\x2d4c5e\x2d82a7\x2dfe0ea272b309.mount: Deactivated successfully. Dec 6 04:52:15 localhost nova_compute[237281]: 2025-12-06 09:52:15.639 237285 DEBUG nova.compute.manager [req-df8d8a35-398e-4bad-8a37-a44c5ecd6c9c req-a99829a0-459c-4860-927b-1c4102cca5e0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Received event network-vif-plugged-227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 04:52:15 localhost nova_compute[237281]: 2025-12-06 09:52:15.639 237285 DEBUG oslo_concurrency.lockutils [req-df8d8a35-398e-4bad-8a37-a44c5ecd6c9c req-a99829a0-459c-4860-927b-1c4102cca5e0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "a5070ada-6b60-4992-a1bf-9e83aaccac93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:52:15 localhost nova_compute[237281]: 2025-12-06 09:52:15.640 237285 DEBUG oslo_concurrency.lockutils [req-df8d8a35-398e-4bad-8a37-a44c5ecd6c9c req-a99829a0-459c-4860-927b-1c4102cca5e0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock 
"a5070ada-6b60-4992-a1bf-9e83aaccac93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:52:15 localhost nova_compute[237281]: 2025-12-06 09:52:15.640 237285 DEBUG oslo_concurrency.lockutils [req-df8d8a35-398e-4bad-8a37-a44c5ecd6c9c req-a99829a0-459c-4860-927b-1c4102cca5e0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:52:15 localhost nova_compute[237281]: 2025-12-06 09:52:15.640 237285 DEBUG nova.compute.manager [req-df8d8a35-398e-4bad-8a37-a44c5ecd6c9c req-a99829a0-459c-4860-927b-1c4102cca5e0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] No waiting events found dispatching network-vif-plugged-227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 04:52:15 localhost nova_compute[237281]: 2025-12-06 09:52:15.641 237285 WARNING nova.compute.manager [req-df8d8a35-398e-4bad-8a37-a44c5ecd6c9c req-a99829a0-459c-4860-927b-1c4102cca5e0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Received unexpected event network-vif-plugged-227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 for instance with vm_state stopped and task_state None.#033[00m Dec 6 04:52:15 localhost nova_compute[237281]: 2025-12-06 09:52:15.907 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:16 localhost openstack_network_exporter[199751]: ERROR 09:52:16 
appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:52:16 localhost openstack_network_exporter[199751]: ERROR 09:52:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:52:16 localhost openstack_network_exporter[199751]: ERROR 09:52:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:52:16 localhost openstack_network_exporter[199751]: ERROR 09:52:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:52:16 localhost openstack_network_exporter[199751]: Dec 6 04:52:16 localhost openstack_network_exporter[199751]: ERROR 09:52:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:52:16 localhost openstack_network_exporter[199751]: Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.064 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.871 237285 DEBUG nova.compute.manager [None req-de6ff09f-abfb-4f43-92db-d66587c29d31 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server [None req-de6ff09f-abfb-4f43-92db-d66587c29d31 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 in power state shutdown. Cannot get_diagnostics while the instance is in this state. 
Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server self.force_reraise() Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR 
oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server raise self.value Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server self.force_reraise() Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server raise self.value Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Dec 6 04:52:17 
localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Dec 6 04:52:17 localhost nova_compute[237281]: 2025-12-06 09:52:17.894 237285 ERROR oslo_messaging.rpc.server #033[00m Dec 6 04:52:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:52:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:52:18 localhost podman[237915]: 2025-12-06 09:52:18.550433089 +0000 UTC m=+0.082596809 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 04:52:18 localhost podman[237915]: 2025-12-06 09:52:18.559102594 +0000 UTC m=+0.091266354 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 04:52:18 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:52:18 localhost podman[237916]: 2025-12-06 09:52:18.658649171 +0000 UTC m=+0.187332374 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:52:18 localhost podman[237916]: 2025-12-06 09:52:18.674241958 +0000 UTC m=+0.202925131 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2) Dec 6 04:52:18 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:52:20 localhost nova_compute[237281]: 2025-12-06 09:52:20.908 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:22 localhost nova_compute[237281]: 2025-12-06 09:52:22.066 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.985 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling 
/usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.985 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.987 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.987 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.988 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.989 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.990 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.990 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.991 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.992 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.993 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.993 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.994 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.994 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.995 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.996 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.996 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.997 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.997 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.998 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of memory.usage: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.998 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:22.999 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of cpu: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.000 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.001 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.001 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.002 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.003 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.003 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.004 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.004 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.005 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.005 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.006 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.007 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.007 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.008 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.008 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.009 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.009 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 04:52:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:52:23.010 12 DEBUG ceilometer.compute.pollsters [-] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 6 04:52:23 localhost podman[197801]: time="2025-12-06T09:52:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:52:23 localhost podman[197801]: @ - - [06/Dec/2025:09:52:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 141137 "" "Go-http-client/1.1" Dec 6 04:52:23 localhost podman[197801]: @ - - [06/Dec/2025:09:52:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 14966 "" "Go-http-client/1.1" Dec 6 04:52:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60699 DF PROTO=TCP SPT=59832 DPT=9102 SEQ=1829657865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC1EBCA0000000001030307) Dec 6 04:52:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60700 DF PROTO=TCP SPT=59832 DPT=9102 SEQ=1829657865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC1EFC70000000001030307) Dec 6 04:52:25 localhost nova_compute[237281]: 2025-12-06 09:52:25.912 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:25 localhost nova_compute[237281]: 2025-12-06 09:52:25.966 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:52:25 localhost nova_compute[237281]: 2025-12-06 09:52:25.966 237285 DEBUG oslo_service.periodic_task [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:52:25 localhost nova_compute[237281]: 2025-12-06 09:52:25.967 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:52:25 localhost nova_compute[237281]: 2025-12-06 09:52:25.967 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:52:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54466 DF PROTO=TCP SPT=39484 DPT=9102 SEQ=3030395745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC1F3870000000001030307) Dec 6 04:52:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60701 DF PROTO=TCP SPT=59832 DPT=9102 SEQ=1829657865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC1F7C70000000001030307) Dec 6 04:52:27 localhost nova_compute[237281]: 2025-12-06 09:52:27.068 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:27 localhost nova_compute[237281]: 2025-12-06 09:52:27.529 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:52:27 localhost nova_compute[237281]: 2025-12-06 09:52:27.529 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:52:27 localhost nova_compute[237281]: 2025-12-06 09:52:27.530 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:52:27 localhost nova_compute[237281]: 2025-12-06 09:52:27.530 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:52:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14393 DF PROTO=TCP SPT=50264 DPT=9102 SEQ=2109830551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC1FB880000000001030307) Dec 6 04:52:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:52:28 localhost nova_compute[237281]: 2025-12-06 09:52:28.461 237285 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 6 04:52:28 localhost nova_compute[237281]: 2025-12-06 09:52:28.461 237285 INFO nova.compute.manager [-] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] VM Stopped (Lifecycle Event)#033[00m Dec 6 04:52:28 localhost nova_compute[237281]: 2025-12-06 09:52:28.510 237285 DEBUG nova.compute.manager [None req-bd3b3802-3b1e-44d0-91d3-d4e294afc1fc - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:52:28 localhost nova_compute[237281]: 2025-12-06 09:52:28.514 237285 DEBUG nova.compute.manager [None req-bd3b3802-3b1e-44d0-91d3-d4e294afc1fc - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 6 04:52:28 localhost podman[237953]: 2025-12-06 09:52:28.552351931 +0000 UTC m=+0.081065691 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 6 04:52:28 localhost 
podman[237953]: 2025-12-06 09:52:28.56408066 +0000 UTC m=+0.092794440 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container) Dec 6 04:52:28 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.118 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.142 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.143 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.144 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.144 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.145 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.145 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.146 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.146 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.147 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.147 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.163 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.163 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.164 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.164 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.220 
237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.270 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.271 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.321 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.321 237285 DEBUG oslo_concurrency.processutils 
[None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.363 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.363 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.420 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.598 237285 WARNING nova.virt.libvirt.driver [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.600 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12991MB free_disk=387.3130989074707GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.600 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.601 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.675 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.676 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.677 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.725 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.749 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:52:29 localhost 
nova_compute[237281]: 2025-12-06 09:52:29.771 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:52:29 localhost nova_compute[237281]: 2025-12-06 09:52:29.772 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:52:30 localhost podman[237984]: 2025-12-06 09:52:30.529669854 +0000 UTC m=+0.070926592 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', 
'--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:52:30 localhost podman[237984]: 2025-12-06 09:52:30.536118351 +0000 UTC m=+0.077375139 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:52:30 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:52:30 localhost nova_compute[237281]: 2025-12-06 09:52:30.913 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60702 DF PROTO=TCP SPT=59832 DPT=9102 SEQ=1829657865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC207870000000001030307) Dec 6 04:52:32 localhost nova_compute[237281]: 2025-12-06 09:52:32.083 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:33 localhost sshd[238007]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:52:35 localhost nova_compute[237281]: 2025-12-06 09:52:35.915 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:37 localhost nova_compute[237281]: 2025-12-06 09:52:37.086 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=60703 DF PROTO=TCP SPT=59832 DPT=9102 SEQ=1829657865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC227880000000001030307) Dec 6 04:52:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:52:40 localhost podman[238009]: 2025-12-06 09:52:40.548697376 +0000 UTC m=+0.081866995 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible) Dec 6 04:52:40 localhost podman[238009]: 2025-12-06 09:52:40.62036513 +0000 UTC m=+0.153534719 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 04:52:40 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:52:40 localhost nova_compute[237281]: 2025-12-06 09:52:40.917 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.763 237285 DEBUG nova.compute.manager [None req-6a33759c-5ff6-45b7-b7a3-71a2c53c87e6 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server [None req-6a33759c-5ff6-45b7-b7a3-71a2c53c87e6 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 in power state shutdown. Cannot get_diagnostics while the instance is in this state. 
Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server self.force_reraise() Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR 
oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server raise self.value Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server self.force_reraise() Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server raise self.value Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Dec 6 04:52:41 
localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Dec 6 04:52:41 localhost nova_compute[237281]: 2025-12-06 09:52:41.783 237285 ERROR oslo_messaging.rpc.server #033[00m Dec 6 04:52:42 localhost nova_compute[237281]: 2025-12-06 09:52:42.120 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:43 localhost ovn_controller[131684]: 2025-12-06T09:52:43Z|00056|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory Dec 6 04:52:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:52:43 localhost systemd[1]: tmp-crun.I0wcjW.mount: Deactivated successfully. 
Dec 6 04:52:43 localhost podman[238036]: 2025-12-06 09:52:43.541029516 +0000 UTC m=+0.078023809 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3) Dec 6 04:52:43 localhost podman[238036]: 2025-12-06 09:52:43.551683902 +0000 UTC m=+0.088678235 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 04:52:43 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:52:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. 
Dec 6 04:52:44 localhost systemd[1]: tmp-crun.UyirPu.mount: Deactivated successfully. Dec 6 04:52:44 localhost podman[238056]: 2025-12-06 09:52:44.561824251 +0000 UTC m=+0.092684018 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:52:44 localhost podman[238056]: 2025-12-06 09:52:44.572370643 +0000 UTC m=+0.103230440 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': 
True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 04:52:44 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:52:45 localhost nova_compute[237281]: 2025-12-06 09:52:45.920 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:46 localhost openstack_network_exporter[199751]: ERROR 09:52:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:52:46 localhost openstack_network_exporter[199751]: ERROR 09:52:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:52:46 localhost openstack_network_exporter[199751]: ERROR 09:52:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:52:46 localhost openstack_network_exporter[199751]: ERROR 09:52:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:52:46 localhost openstack_network_exporter[199751]: Dec 6 04:52:46 localhost openstack_network_exporter[199751]: ERROR 09:52:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:52:46 localhost 
openstack_network_exporter[199751]: Dec 6 04:52:47 localhost nova_compute[237281]: 2025-12-06 09:52:47.122 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:47 localhost nova_compute[237281]: 2025-12-06 09:52:47.772 237285 DEBUG nova.objects.instance [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Lazy-loading 'flavor' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:52:47 localhost nova_compute[237281]: 2025-12-06 09:52:47.789 237285 DEBUG oslo_concurrency.lockutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:52:47 localhost nova_compute[237281]: 2025-12-06 09:52:47.790 237285 DEBUG oslo_concurrency.lockutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:52:47 localhost nova_compute[237281]: 2025-12-06 09:52:47.790 237285 DEBUG nova.network.neutron [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Dec 6 04:52:47 localhost nova_compute[237281]: 2025-12-06 09:52:47.791 237285 DEBUG nova.objects.instance [None 
req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.752 237285 DEBUG nova.network.neutron [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.780 237285 DEBUG oslo_concurrency.lockutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 
5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.806 237285 INFO nova.virt.libvirt.driver [-] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Instance destroyed successfully.#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.807 237285 DEBUG nova.objects.instance [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Lazy-loading 'numa_topology' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.820 237285 DEBUG nova.objects.instance [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Lazy-loading 'resources' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.833 237285 DEBUG nova.virt.libvirt.vif [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T08:38:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005548798.ooo.test',hostname='test',id=2,image_ref='c6562616-bf77-48e6-bb05-431e64af083a',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-06T08:38:42Z,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005548798.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='47835b89168945138751a4b216280589',ramdisk_id='',reservation_id='r-h8mij0z5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='c6562616-bf77-48e6-bb05-431e64af083a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-12-06T09:52:13Z,user_data=None,user_id='5220ceda9e4145d395f52fc9fd0365c0',uuid=a5070ada-6b60-4992-a1bf-9e83aaccac93,vcpu_model=,vcpus=1,vm_mode=None,vm_state='stopped')
 vif={"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.834 237285 DEBUG nova.network.os_vif_util [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Converting VIF {"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": 
"192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.835 237285 DEBUG nova.network.os_vif_util [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.835 237285 DEBUG os_vif [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.839 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.839 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap227fe5b2-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.843 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.846 237285 INFO os_vif [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5')#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.849 237285 DEBUG nova.virt.libvirt.host [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.849 237285 INFO nova.virt.libvirt.host [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] UEFI support detected#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.858 237285 DEBUG nova.virt.libvirt.driver [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 
5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Start _get_guest_xml network_info=[{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=c6562616-bf77-48e6-bb05-431e64af083a,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 
'encryption_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': 'c6562616-bf77-48e6-bb05-431e64af083a'}], 'ephemerals': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'size': 1, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vdb', 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.862 237285 WARNING nova.virt.libvirt.driver [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.864 237285 DEBUG nova.virt.libvirt.host [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Searching host: 'np0005548798.ooo.test' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.865 237285 DEBUG nova.virt.libvirt.host [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] CPU controller missing on host. 
_has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.867 237285 DEBUG nova.virt.libvirt.host [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Searching host: 'np0005548798.ooo.test' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.868 237285 DEBUG nova.virt.libvirt.host [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.868 237285 DEBUG nova.virt.libvirt.driver [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.869 237285 DEBUG nova.virt.hardware [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-06T08:37:34Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='7a18e612-6562-4812-b07b-d906254f72f4',id=2,is_public=True,memory_mb=512,name='m1.small',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=c6562616-bf77-48e6-bb05-431e64af083a,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.870 237285 DEBUG nova.virt.hardware [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.870 237285 DEBUG nova.virt.hardware [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.870 237285 DEBUG nova.virt.hardware [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.871 237285 DEBUG nova.virt.hardware [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.871 237285 DEBUG nova.virt.hardware [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - 
- default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.872 237285 DEBUG nova.virt.hardware [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.872 237285 DEBUG nova.virt.hardware [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.873 237285 DEBUG nova.virt.hardware [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.873 237285 DEBUG nova.virt.hardware [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.874 237285 DEBUG nova.virt.hardware [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 
47835b89168945138751a4b216280589 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.874 237285 DEBUG nova.objects.instance [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Lazy-loading 'vcpu_model' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.985 237285 DEBUG nova.privsep.utils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.987 237285 DEBUG nova.virt.libvirt.vif [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T08:38:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005548798.ooo.test',hostname='test',id=2,image_ref='c6562616-bf77-48e6-bb05-431e64af083a',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-06T08:38:42Z,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005548798.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='47835b89168945138751a4b216280589',ramdisk_id='',reservation_id='r-h8mij0z5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='c6562616-bf77-48e6-bb05-431e64af083a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-12-06T09:52:13Z,user_data=None,user_id='5220ceda9e4145d395f52fc9fd0365c0',uuid=a5070ada-6b60-4992-a1bf-9e83aaccac93,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_stat
e='stopped') vif={"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.988 237285 DEBUG nova.network.os_vif_util [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Converting VIF {"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.989 237285 DEBUG nova.network.os_vif_util [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 6 04:52:48 localhost nova_compute[237281]: 2025-12-06 09:52:48.991 237285 DEBUG nova.objects.instance [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Lazy-loading 'pci_devices' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.008 237285 DEBUG nova.virt.libvirt.driver [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: 
a5070ada-6b60-4992-a1bf-9e83aaccac93] End _get_guest_xml xml= Dec 6 04:52:49 localhost nova_compute[237281]: a5070ada-6b60-4992-a1bf-9e83aaccac93 Dec 6 04:52:49 localhost nova_compute[237281]: instance-00000002 Dec 6 04:52:49 localhost nova_compute[237281]: 524288 Dec 6 04:52:49 localhost nova_compute[237281]: 1 Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: test Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:48 Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: 512 Dec 6 04:52:49 localhost nova_compute[237281]: 1 Dec 6 04:52:49 localhost nova_compute[237281]: 0 Dec 6 04:52:49 localhost nova_compute[237281]: 1 Dec 6 04:52:49 localhost nova_compute[237281]: 1 Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: admin Dec 6 04:52:49 localhost nova_compute[237281]: admin Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: RDO Dec 6 04:52:49 localhost nova_compute[237281]: OpenStack Compute Dec 6 04:52:49 localhost nova_compute[237281]: 27.5.2-0.20250829104910.6f8decf.el9 Dec 6 04:52:49 localhost nova_compute[237281]: a5070ada-6b60-4992-a1bf-9e83aaccac93 Dec 6 04:52:49 localhost nova_compute[237281]: a5070ada-6b60-4992-a1bf-9e83aaccac93 Dec 6 04:52:49 localhost nova_compute[237281]: 
Virtual Machine Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: hvm Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 
04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: /dev/urandom Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: Dec 6 04:52:49 localhost nova_compute[237281]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.010 237285 DEBUG oslo_concurrency.processutils 
[None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.080 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.082 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.156 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk 
--force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.158 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.216 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.218 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.262 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 
5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.263 237285 DEBUG nova.objects.instance [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Lazy-loading 'trusted_certs' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.281 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e070c3db7ba7309de3805d58aaf4369c4bd45c2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.326 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e070c3db7ba7309de3805d58aaf4369c4bd45c2 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 
2025-12-06 09:52:49.327 237285 DEBUG nova.virt.disk.api [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Checking if we can resize image /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.327 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.369 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.370 237285 DEBUG nova.virt.disk.api [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Cannot resize image /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk to a smaller size. 
can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.370 237285 DEBUG nova.objects.instance [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Lazy-loading 'migration_context' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.383 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.427 237285 DEBUG oslo_concurrency.processutils [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.430 237285 DEBUG nova.virt.libvirt.vif [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T08:38:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005548798.ooo.test',hostname='test',id=2,image_ref='c6562616-bf77-48e6-bb05-431e64af083a',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-06T08:38:42Z,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='np0005548798.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=,power_state=4,progress=0,project_id='47835b89168945138751a4b216280589',ramdisk_id='',reservation_id='r-h8mij0z5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='c6562616-bf77-48e6-bb05-431e64af083a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T09:52:13Z,user_data=None,user_id='5220ceda9e4145d395f52fc9fd0365c0',uuid=a5070ada-6b60-4992-a1bf-9e83aaccac93,vcpu_model=VirtCPUModel,vcpus=1
,vm_mode=None,vm_state='stopped') vif={"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.431 237285 DEBUG nova.network.os_vif_util [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Converting VIF {"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.433 237285 DEBUG nova.network.os_vif_util [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.433 237285 DEBUG os_vif [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.435 237285 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.436 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.436 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.440 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.440 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap227fe5b2-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.441 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap227fe5b2-a5, col_values=(('external_ids', {'iface-id': '227fe5b2-a5a7-4043-b641-32b6e7c7a7c1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:02:64', 'vm-uuid': 'a5070ada-6b60-4992-a1bf-9e83aaccac93'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.443 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 
localhost nova_compute[237281]: 2025-12-06 09:52:49.447 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.450 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.452 237285 INFO os_vif [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:02:64,bridge_name='br-int',has_traffic_filtering=True,id=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1,network=Network(20509a6a-c438-4c5e-82a7-fe0ea272b309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap227fe5b2-a5')#033[00m Dec 6 04:52:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:52:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:52:49 localhost kernel: device tap227fe5b2-a5 entered promiscuous mode Dec 6 04:52:49 localhost NetworkManager[5965]: [1765014769.5323] manager: (tap227fe5b2-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/15) Dec 6 04:52:49 localhost systemd-udevd[238136]: Network interface NamePolicy= disabled on kernel command line. Dec 6 04:52:49 localhost snmpd[56894]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB. 
Dec 6 04:52:49 localhost NetworkManager[5965]: [1765014769.5587] device (tap227fe5b2-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Dec 6 04:52:49 localhost NetworkManager[5965]: [1765014769.5591] device (tap227fe5b2-a5): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Dec 6 04:52:49 localhost ovn_controller[131684]: 2025-12-06T09:52:49Z|00057|binding|INFO|Claiming lport 227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 for this chassis. Dec 6 04:52:49 localhost ovn_controller[131684]: 2025-12-06T09:52:49Z|00058|binding|INFO|227fe5b2-a5a7-4043-b641-32b6e7c7a7c1: Claiming fa:16:3e:91:02:64 192.168.0.189 Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.561 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.568 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.576 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:02:64 192.168.0.189'], port_security=['fa:16:3e:91:02:64 192.168.0.189'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.189/24', 'neutron:device_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': 
'', 'neutron:project_id': '47835b89168945138751a4b216280589', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2bda9e92-c0a1-4c1d-90ae-f2e7495954f8 db4a6c1e-fda3-423f-866c-b4772bef83b6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=66aef1d5-ef14-49e3-b4b5-f1e89f0f9ee8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=227fe5b2-a5a7-4043-b641-32b6e7c7a7c1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.578 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 in datapath 20509a6a-c438-4c5e-82a7-fe0ea272b309 bound to our chassis#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.580 137259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20509a6a-c438-4c5e-82a7-fe0ea272b309#033[00m Dec 6 04:52:49 localhost ovn_controller[131684]: 2025-12-06T09:52:49Z|00059|ovn_bfd|INFO|Enabled BFD on interface ovn-d3c7df-0 Dec 6 04:52:49 localhost ovn_controller[131684]: 2025-12-06T09:52:49Z|00060|ovn_bfd|INFO|Enabled BFD on interface ovn-ded858-0 Dec 6 04:52:49 localhost ovn_controller[131684]: 2025-12-06T09:52:49Z|00061|ovn_bfd|INFO|Enabled BFD on interface ovn-719bf6-0 Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.583 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost systemd[1]: tmp-crun.4H8Bq6.mount: Deactivated successfully. 
Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.590 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[adaf0547-d707-40d3-bdeb-34790a120c8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.591 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap20509a6a-c1 in ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.595 137360 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap20509a6a-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.595 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c93edef3-d0d7-4e2c-b557-cdb895f45626]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.598 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a82f3e-ac77-46e7-adf4-72911c087281]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost podman[238103]: 2025-12-06 09:52:49.60689974 +0000 UTC m=+0.134532357 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.607 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.610 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.611 137391 DEBUG oslo.privsep.daemon [-] privsep: 
reply[371a49e6-3077-46cd-ae31-0628a554f004]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.621 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost ovn_controller[131684]: 2025-12-06T09:52:49Z|00062|binding|INFO|Setting lport 227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 ovn-installed in OVS Dec 6 04:52:49 localhost ovn_controller[131684]: 2025-12-06T09:52:49Z|00063|binding|INFO|Setting lport 227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 up in Southbound Dec 6 04:52:49 localhost systemd-machined[68273]: New machine qemu-2-instance-00000002. Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.628 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad75f87-6887-4564-981d-cd8ff57665fd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.636 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost systemd[1]: Started Virtual Machine qemu-2-instance-00000002. 
Dec 6 04:52:49 localhost podman[238102]: 2025-12-06 09:52:49.644820301 +0000 UTC m=+0.172175659 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.656 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[70f03d35-b540-452b-bee0-4f607338b1c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.659 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost NetworkManager[5965]: [1765014769.6633] manager: (tap20509a6a-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/16) Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.662 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[63f9aa59-46ba-454c-a0ca-76957c7d376e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost systemd-udevd[238140]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 04:52:49 localhost podman[238102]: 2025-12-06 09:52:49.678364857 +0000 UTC m=+0.205720175 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.683 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost podman[238103]: 2025-12-06 09:52:49.696392218 +0000 UTC m=+0.224024785 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.695 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[4ea5269d-ab27-4781-8711-c0b73e5a29c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.697 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[204638e3-a6a0-46da-9d4d-c9e8e4170b0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 04:52:49 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap20509a6a-c1: link becomes ready Dec 6 04:52:49 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap20509a6a-c0: link becomes ready Dec 6 04:52:49 localhost NetworkManager[5965]: [1765014769.7186] device (tap20509a6a-c0): carrier: link connected Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.723 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[6d63968a-42fb-4ede-be88-a88cc9b8ad6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.737 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[e585b19f-9c99-4759-aab1-77b8a0a93661]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20509a6a-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:3b:0a:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1115685, 'reachable_time': 43739, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238186, 'error': None, 'target': 'ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.750 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[405404c1-72be-477e-8f08-dbd88b5dee7f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3b:a81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1115685, 'tstamp': 1115685}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238187, 'error': None, 'target': 'ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'stats': (0, 0, 0)}, 'event': 
'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.764 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[5185271c-c87a-4a03-b031-0c0365f63d14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20509a6a-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:3b:0a:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1115685, 'reachable_time': 43739, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 
0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238188, 'error': None, 'target': 'ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.788 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[fee8ac54-db50-4987-bc4b-3f57b7287b80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.838 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[ca6339da-e590-40bd-8ea5-cc560fabd470]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.839 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20509a6a-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.839 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m 
Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.840 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20509a6a-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.841 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost kernel: device tap20509a6a-c0 entered promiscuous mode Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.850 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.851 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20509a6a-c0, col_values=(('external_ids', {'iface-id': 'dc760542-e03f-4d48-a573-fabb89636a57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.852 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost ovn_controller[131684]: 2025-12-06T09:52:49Z|00064|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.864 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.865 137259 DEBUG neutron.agent.linux.utils [-] Unable to access 
/var/lib/neutron/external/pids/20509a6a-c438-4c5e-82a7-fe0ea272b309.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/20509a6a-c438-4c5e-82a7-fe0ea272b309.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.865 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[4888bd06-70e8-4ae7-b09e-873a8066770c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.866 137259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: global Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: log /dev/log local0 debug Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: log-tag haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309 Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: user root Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: group root Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: maxconn 1024 Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: pidfile /var/lib/neutron/external/pids/20509a6a-c438-4c5e-82a7-fe0ea272b309.pid.haproxy Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: daemon Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: defaults Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: log global Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: mode http Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: option httplog Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: option dontlognull Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: option http-server-close Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: option forwardfor Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: retries 3 Dec 6 04:52:49 localhost 
ovn_metadata_agent[137254]: timeout http-request 30s Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: timeout connect 30s Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: timeout client 32s Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: timeout server 32s Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: timeout http-keep-alive 30s Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: listen listener Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: bind 169.254.169.254:80 Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: server metadata /var/lib/neutron/metadata_proxy Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: http-request add-header X-OVN-Network-ID 20509a6a-c438-4c5e-82a7-fe0ea272b309 Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Dec 6 04:52:49 localhost ovn_metadata_agent[137254]: 2025-12-06 09:52:49.867 137259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'env', 'PROCESS_TAG=haproxy-20509a6a-c438-4c5e-82a7-fe0ea272b309', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/20509a6a-c438-4c5e-82a7-fe0ea272b309.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.961 237285 DEBUG nova.virt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.963 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] VM Resumed 
(Lifecycle Event)#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.966 237285 DEBUG nova.compute.manager [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.971 237285 INFO nova.virt.libvirt.driver [-] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Instance rebooted successfully.#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.972 237285 DEBUG nova.compute.manager [None req-bcde3a13-f9e9-4e2f-9b0b-4c2c73a2ce3e 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.989 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:52:49 localhost nova_compute[237281]: 2025-12-06 09:52:49.992 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 6 04:52:50 localhost nova_compute[237281]: 2025-12-06 09:52:50.020 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - 
- -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m Dec 6 04:52:50 localhost nova_compute[237281]: 2025-12-06 09:52:50.020 237285 DEBUG nova.virt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 6 04:52:50 localhost nova_compute[237281]: 2025-12-06 09:52:50.021 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] VM Started (Lifecycle Event)#033[00m Dec 6 04:52:50 localhost nova_compute[237281]: 2025-12-06 09:52:50.039 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:52:50 localhost nova_compute[237281]: 2025-12-06 09:52:50.043 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 6 04:52:50 localhost podman[238229]: Dec 6 04:52:50 localhost podman[238229]: 2025-12-06 09:52:50.297058018 +0000 UTC m=+0.097387151 container create 928ad51028a3aaa205aa7587fe45496aac228c60b25b406815ec673b73cf678b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:52:50 localhost systemd[1]: Started libpod-conmon-928ad51028a3aaa205aa7587fe45496aac228c60b25b406815ec673b73cf678b.scope. Dec 6 04:52:50 localhost systemd[1]: Started libcrun container. Dec 6 04:52:50 localhost podman[238229]: 2025-12-06 09:52:50.250168513 +0000 UTC m=+0.050497696 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 6 04:52:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b47d2cc100102d98fd455e1011d5e4a7e62251ec5db708b774e2d67af12db0c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 04:52:50 localhost podman[238229]: 2025-12-06 09:52:50.368113792 +0000 UTC m=+0.168442925 container init 928ad51028a3aaa205aa7587fe45496aac228c60b25b406815ec673b73cf678b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:52:50 localhost podman[238229]: 2025-12-06 09:52:50.378336695 +0000 UTC m=+0.178665818 container start 928ad51028a3aaa205aa7587fe45496aac228c60b25b406815ec673b73cf678b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:52:50 localhost neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309[238243]: [NOTICE] (238247) : New worker (238249) forked Dec 6 04:52:50 localhost neutron-haproxy-ovnmeta-20509a6a-c438-4c5e-82a7-fe0ea272b309[238243]: [NOTICE] (238247) : Loading success. Dec 6 04:52:50 localhost ovn_controller[131684]: 2025-12-06T09:52:50Z|00065|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:52:50 localhost nova_compute[237281]: 2025-12-06 09:52:50.559 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:50 localhost ovn_controller[131684]: 2025-12-06T09:52:50Z|00066|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:52:50 localhost nova_compute[237281]: 2025-12-06 09:52:50.573 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:50 localhost systemd[1]: tmp-crun.Ta6XGT.mount: Deactivated successfully. 
Dec 6 04:52:50 localhost nova_compute[237281]: 2025-12-06 09:52:50.923 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:51 localhost nova_compute[237281]: 2025-12-06 09:52:51.434 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:51 localhost ovn_controller[131684]: 2025-12-06T09:52:51Z|00067|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:52:51 localhost nova_compute[237281]: 2025-12-06 09:52:51.693 237285 DEBUG nova.compute.manager [req-082ce3d6-c453-44e1-ae69-541e8ca1c115 req-843b85db-543e-4c61-b82f-292680b330a7 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Received event network-vif-plugged-227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 04:52:51 localhost nova_compute[237281]: 2025-12-06 09:52:51.694 237285 DEBUG oslo_concurrency.lockutils [req-082ce3d6-c453-44e1-ae69-541e8ca1c115 req-843b85db-543e-4c61-b82f-292680b330a7 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "a5070ada-6b60-4992-a1bf-9e83aaccac93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:52:51 localhost nova_compute[237281]: 2025-12-06 09:52:51.694 237285 DEBUG oslo_concurrency.lockutils [req-082ce3d6-c453-44e1-ae69-541e8ca1c115 req-843b85db-543e-4c61-b82f-292680b330a7 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" ::
waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:52:51 localhost nova_compute[237281]: 2025-12-06 09:52:51.694 237285 DEBUG oslo_concurrency.lockutils [req-082ce3d6-c453-44e1-ae69-541e8ca1c115 req-843b85db-543e-4c61-b82f-292680b330a7 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:52:51 localhost nova_compute[237281]: 2025-12-06 09:52:51.694 237285 DEBUG nova.compute.manager [req-082ce3d6-c453-44e1-ae69-541e8ca1c115 req-843b85db-543e-4c61-b82f-292680b330a7 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] No waiting events found dispatching network-vif-plugged-227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 04:52:51 localhost nova_compute[237281]: 2025-12-06 09:52:51.695 237285 WARNING nova.compute.manager [req-082ce3d6-c453-44e1-ae69-541e8ca1c115 req-843b85db-543e-4c61-b82f-292680b330a7 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Received unexpected event network-vif-plugged-227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 for instance with vm_state active and task_state None.#033[00m Dec 6 04:52:53 localhost podman[197801]: time="2025-12-06T09:52:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:52:53 localhost podman[197801]: @ - - [06/Dec/2025:09:52:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:52:53 localhost
podman[197801]: @ - - [06/Dec/2025:09:52:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15438 "" "Go-http-client/1.1" Dec 6 04:52:53 localhost nova_compute[237281]: 2025-12-06 09:52:53.766 237285 DEBUG nova.compute.manager [req-99f34d98-0ff7-490d-ab9e-7d56965b04c2 req-3af75b9b-a0e7-42f3-8af0-fd179877df6a 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Received event network-vif-plugged-227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 04:52:53 localhost nova_compute[237281]: 2025-12-06 09:52:53.767 237285 DEBUG oslo_concurrency.lockutils [req-99f34d98-0ff7-490d-ab9e-7d56965b04c2 req-3af75b9b-a0e7-42f3-8af0-fd179877df6a 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "a5070ada-6b60-4992-a1bf-9e83aaccac93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:52:53 localhost nova_compute[237281]: 2025-12-06 09:52:53.767 237285 DEBUG oslo_concurrency.lockutils [req-99f34d98-0ff7-490d-ab9e-7d56965b04c2 req-3af75b9b-a0e7-42f3-8af0-fd179877df6a 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:52:53 localhost nova_compute[237281]: 2025-12-06 09:52:53.767 237285 DEBUG oslo_concurrency.lockutils [req-99f34d98-0ff7-490d-ab9e-7d56965b04c2 req-3af75b9b-a0e7-42f3-8af0-fd179877df6a 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93-events"
"released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:52:53 localhost nova_compute[237281]: 2025-12-06 09:52:53.767 237285 DEBUG nova.compute.manager [req-99f34d98-0ff7-490d-ab9e-7d56965b04c2 req-3af75b9b-a0e7-42f3-8af0-fd179877df6a 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] No waiting events found dispatching network-vif-plugged-227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 04:52:53 localhost nova_compute[237281]: 2025-12-06 09:53:53.767 237285 WARNING nova.compute.manager [req-99f34d98-0ff7-490d-ab9e-7d56965b04c2 req-3af75b9b-a0e7-42f3-8af0-fd179877df6a 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Received unexpected event network-vif-plugged-227fe5b2-a5a7-4043-b641-32b6e7c7a7c1 for instance with vm_state active and task_state None.#033[00m Dec 6 04:52:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8229 DF PROTO=TCP SPT=58072 DPT=9102 SEQ=2373452244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC260FA0000000001030307) Dec 6 04:52:54 localhost nova_compute[237281]: 2025-12-06 09:52:54.446 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8230 DF PROTO=TCP SPT=58072 DPT=9102 SEQ=2373452244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT
(020405500402080A1DC265070000000001030307) Dec 6 04:52:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60704 DF PROTO=TCP SPT=59832 DPT=9102 SEQ=1829657865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC267870000000001030307) Dec 6 04:52:55 localhost nova_compute[237281]: 2025-12-06 09:52:55.959 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8231 DF PROTO=TCP SPT=58072 DPT=9102 SEQ=2373452244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC26D070000000001030307) Dec 6 04:52:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54467 DF PROTO=TCP SPT=39484 DPT=9102 SEQ=3030395745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC271870000000001030307) Dec 6 04:52:59 localhost nova_compute[237281]: 2025-12-06 09:52:59.451 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:52:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:52:59 localhost podman[238259]: 2025-12-06 09:52:59.528529772 +0000 UTC m=+0.061884584 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7) Dec 6 04:52:59 localhost podman[238259]: 2025-12-06 09:52:59.54250494 +0000 UTC m=+0.075859752 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, 
url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal) Dec 6 04:52:59 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:53:00 localhost nova_compute[237281]: 2025-12-06 09:53:00.960 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8232 DF PROTO=TCP SPT=58072 DPT=9102 SEQ=2373452244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC27CC70000000001030307) Dec 6 04:53:01 localhost ovn_controller[131684]: 2025-12-06T09:53:01Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:91:02:64 192.168.0.189 Dec 6 04:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:53:01 localhost podman[238291]: 2025-12-06 09:53:01.550750338 +0000 UTC m=+0.082300949 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 04:53:01 localhost podman[238291]: 2025-12-06 09:53:01.559081423 +0000 UTC m=+0.090632024 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 04:53:01 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:53:04 localhost nova_compute[237281]: 2025-12-06 09:53:04.455 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:05 localhost nova_compute[237281]: 2025-12-06 09:53:05.990 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:06.674 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:53:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:06.675 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:53:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:06.676 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:53:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:06.757 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:06.759 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Dec 6 04:53:06 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:06 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:06 localhost 
ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:06 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:06 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:06 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:06 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:07.941 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:07.941 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 1.1822925#033[00m Dec 6 04:53:07 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44830 [06/Dec/2025:09:53:06.756] listener listener/metadata 0/0/0/1185/1185 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Dec 6 04:53:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:07.955 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:07.956 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0#015 Dec 6 04:53:07 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:07 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:07 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:07 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:07 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 
6 04:53:07 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:07 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:07 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44832 [06/Dec/2025:09:53:07.955] listener listener/metadata 0/0/0/35/35 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Dec 6 04:53:07 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:07.990 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404 len: 297 time: 0.0337758#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.005 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.006 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.028 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.029 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 0.0231035#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44840 [06/Dec/2025:09:53:08.004] listener listener/metadata 0/0/0/24/24 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.035 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.036 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.068 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.069 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET 
/2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200 len: 136 time: 0.0326974#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44852 [06/Dec/2025:09:53:08.035] listener listener/metadata 0/0/0/34/34 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.076 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.077 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.098 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44856 [06/Dec/2025:09:53:08.076] listener listener/metadata 0/0/0/23/23 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.099 137355 
INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200 len: 143 time: 0.0215604#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.106 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.107 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.130 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44858 [06/Dec/2025:09:53:08.105] listener listener/metadata 0/0/0/25/25 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.130 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200 len: 149 time: 0.0236919#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.137 137355 
DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.138 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.158 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44874 [06/Dec/2025:09:53:08.137] listener listener/metadata 0/0/0/22/22 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.159 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200 len: 150 time: 0.0211029#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.165 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.166 137355 DEBUG 
neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.185 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44882 [06/Dec/2025:09:53:08.165] listener listener/metadata 0/0/0/20/20 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.186 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200 len: 139 time: 0.0196502#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.193 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.194 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: 
close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.215 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44892 [06/Dec/2025:09:53:08.192] listener listener/metadata 0/0/0/22/22 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.215 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200 len: 139 time: 0.0212474#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.223 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.224 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 
User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44908 [06/Dec/2025:09:53:08.223] listener listener/metadata 0/0/0/21/21 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.244 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/user-data HTTP/1.1" status: 404 len: 297 time: 0.0206232#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.259 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.260 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.287 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.288 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200 len: 155 time: 0.0278108#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44912 [06/Dec/2025:09:53:08.259] listener listener/metadata 0/0/0/29/29 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.293 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.294 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.312 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 
192.168.0.189:44918 [06/Dec/2025:09:53:08.292] listener listener/metadata 0/0/0/20/20 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.313 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200 len: 138 time: 0.0191369#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.318 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.319 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.344 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44932 [06/Dec/2025:09:53:08.318] listener listener/metadata 0/0/0/26/26 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 
HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.344 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200 len: 143 time: 0.0252984#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.351 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.352 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.371 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44936 [06/Dec/2025:09:53:08.350] listener listener/metadata 0/0/0/20/20 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.371 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET 
/2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200 len: 143 time: 0.0193627#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.378 137355 DEBUG eventlet.wsgi.server [-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.379 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.397 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.398 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200 len: 139 time: 0.0186648#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44948 [06/Dec/2025:09:53:08.378] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.404 137355 DEBUG eventlet.wsgi.server 
[-] (137355) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.405 137355 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Accept: */*#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Connection: close#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Content-Type: text/plain#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: Host: 169.254.169.254#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: User-Agent: curl/7.84.0#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Forwarded-For: 192.168.0.189#015 Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: X-Ovn-Network-Id: 20509a6a-c438-4c5e-82a7-fe0ea272b309 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.422 137355 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 6 04:53:08 localhost ovn_metadata_agent[137254]: 2025-12-06 09:53:08.423 137355 INFO eventlet.wsgi.server [-] 192.168.0.189, "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200 len: 139 time: 0.0177252#033[00m Dec 6 04:53:08 localhost haproxy-metadata-proxy-20509a6a-c438-4c5e-82a7-fe0ea272b309[238249]: 192.168.0.189:44964 [06/Dec/2025:09:53:08.404] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Dec 6 04:53:09 localhost nova_compute[237281]: 2025-12-06 09:53:09.458 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:09 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8233 DF PROTO=TCP SPT=58072 DPT=9102 SEQ=2373452244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC29D870000000001030307) Dec 6 04:53:11 localhost nova_compute[237281]: 2025-12-06 09:53:11.024 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:53:11 localhost podman[238314]: 2025-12-06 09:53:11.547599754 +0000 UTC m=+0.080959328 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:53:11 localhost podman[238314]: 2025-12-06 09:53:11.589274109 +0000 UTC m=+0.122633713 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:53:11 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:53:11 localhost nova_compute[237281]: 2025-12-06 09:53:11.622 237285 DEBUG nova.compute.manager [None req-869c6893-0587-4016-bd91-ccf9a69f517f 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 04:53:11 localhost nova_compute[237281]: 2025-12-06 09:53:11.628 237285 INFO nova.compute.manager [None req-869c6893-0587-4016-bd91-ccf9a69f517f 5220ceda9e4145d395f52fc9fd0365c0 47835b89168945138751a4b216280589 - - default default] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Retrieving diagnostics#033[00m Dec 6 04:53:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:53:14 localhost nova_compute[237281]: 2025-12-06 09:53:14.461 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:14 localhost podman[238339]: 2025-12-06 09:53:14.558680437 +0000 UTC m=+0.091917713 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 04:53:14 localhost podman[238339]: 2025-12-06 09:53:14.598217596 +0000 UTC m=+0.131454952 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2) Dec 6 04:53:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:53:14 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:53:14 localhost podman[238358]: 2025-12-06 09:53:14.714965639 +0000 UTC m=+0.080379231 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:53:14 localhost podman[238358]: 2025-12-06 09:53:14.751327742 +0000 UTC m=+0.116741294 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:53:14 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:53:16 localhost nova_compute[237281]: 2025-12-06 09:53:16.053 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:16 localhost openstack_network_exporter[199751]: ERROR 09:53:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:53:16 localhost openstack_network_exporter[199751]: ERROR 09:53:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:53:16 localhost openstack_network_exporter[199751]: ERROR 09:53:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:53:16 localhost openstack_network_exporter[199751]: ERROR 09:53:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:53:16 localhost openstack_network_exporter[199751]: Dec 6 04:53:16 localhost openstack_network_exporter[199751]: ERROR 09:53:16 
appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:53:16 localhost openstack_network_exporter[199751]: Dec 6 04:53:19 localhost nova_compute[237281]: 2025-12-06 09:53:19.465 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:19 localhost ovn_controller[131684]: 2025-12-06T09:53:19Z|00068|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory Dec 6 04:53:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:53:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:53:20 localhost podman[238381]: 2025-12-06 09:53:20.535362062 +0000 UTC m=+0.073534891 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 04:53:20 localhost podman[238382]: 2025-12-06 09:53:20.603147946 +0000 UTC m=+0.135886988 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 
'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 6 04:53:20 localhost podman[238382]: 2025-12-06 09:53:20.617225127 +0000 UTC m=+0.149964209 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 
'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 6 04:53:20 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 04:53:20 localhost podman[238381]: 2025-12-06 09:53:20.66960372 +0000 UTC m=+0.207776539 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 04:53:20 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:53:21 localhost nova_compute[237281]: 2025-12-06 09:53:21.056 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:23 localhost podman[197801]: time="2025-12-06T09:53:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:53:23 localhost podman[197801]: @ - - [06/Dec/2025:09:53:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:53:23 localhost podman[197801]: @ - - [06/Dec/2025:09:53:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15450 "" "Go-http-client/1.1" Dec 6 04:53:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65478 DF PROTO=TCP SPT=43366 DPT=9102 SEQ=2238741291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC2D62B0000000001030307) Dec 6 04:53:24 localhost nova_compute[237281]: 2025-12-06 09:53:24.469 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65479 DF PROTO=TCP SPT=43366 DPT=9102 SEQ=2238741291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC2DA470000000001030307) Dec 6 
04:53:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8234 DF PROTO=TCP SPT=58072 DPT=9102 SEQ=2373452244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC2DD870000000001030307) Dec 6 04:53:26 localhost nova_compute[237281]: 2025-12-06 09:53:26.092 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65480 DF PROTO=TCP SPT=43366 DPT=9102 SEQ=2238741291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC2E2470000000001030307) Dec 6 04:53:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60705 DF PROTO=TCP SPT=59832 DPT=9102 SEQ=1829657865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC2E5880000000001030307) Dec 6 04:53:29 localhost nova_compute[237281]: 2025-12-06 09:53:29.474 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:29 localhost nova_compute[237281]: 2025-12-06 09:53:29.687 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:53:29 localhost nova_compute[237281]: 2025-12-06 09:53:29.688 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:53:29 localhost nova_compute[237281]: 2025-12-06 09:53:29.727 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:53:29 localhost nova_compute[237281]: 2025-12-06 09:53:29.728 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:53:29 localhost nova_compute[237281]: 2025-12-06 09:53:29.728 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:53:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:53:30 localhost podman[238417]: 2025-12-06 09:53:30.545374099 +0000 UTC m=+0.081221876 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9) Dec 6 04:53:30 localhost nova_compute[237281]: 2025-12-06 09:53:30.560 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:53:30 localhost nova_compute[237281]: 2025-12-06 09:53:30.560 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:53:30 localhost nova_compute[237281]: 2025-12-06 09:53:30.560 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:53:30 localhost nova_compute[237281]: 2025-12-06 09:53:30.561 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:53:30 localhost podman[238417]: 2025-12-06 09:53:30.564316509 +0000 UTC m=+0.100164236 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 6 04:53:30 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:53:31 localhost nova_compute[237281]: 2025-12-06 09:53:31.125 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65481 DF PROTO=TCP SPT=43366 DPT=9102 SEQ=2238741291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC2F2070000000001030307) Dec 6 04:53:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:53:32 localhost podman[238437]: 2025-12-06 09:53:32.541930021 +0000 UTC m=+0.075275204 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:53:32 localhost podman[238437]: 2025-12-06 09:53:32.547340257 +0000 UTC m=+0.080685470 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:53:32 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.782 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.806 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.807 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] 
Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.807 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.808 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.808 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.808 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.809 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.809 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.809 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.810 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.829 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.830 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.830 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.831 237285 DEBUG nova.compute.resource_tracker [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.927 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:53:32 localhost nova_compute[237281]: 2025-12-06 09:53:32.999 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.001 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.058 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.060 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.113 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.114 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.188 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.362 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.363 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12749MB free_disk=387.3129005432129GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.363 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.364 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.443 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.444 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.444 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.494 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.509 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:53:33 localhost 
nova_compute[237281]: 2025-12-06 09:53:33.529 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:53:33 localhost nova_compute[237281]: 2025-12-06 09:53:33.529 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:53:34 localhost nova_compute[237281]: 2025-12-06 09:53:34.478 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:36 localhost nova_compute[237281]: 2025-12-06 09:53:36.128 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65482 DF PROTO=TCP SPT=43366 DPT=9102 SEQ=2238741291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC311870000000001030307) Dec 6 04:53:39 localhost nova_compute[237281]: 2025-12-06 09:53:39.482 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:41 localhost nova_compute[237281]: 2025-12-06 09:53:41.174 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:53:42 localhost podman[238474]: 2025-12-06 09:53:42.558703485 +0000 UTC m=+0.085022442 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 04:53:42 localhost podman[238474]: 2025-12-06 09:53:42.624308423 +0000 UTC m=+0.150627370 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 04:53:42 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:53:44 localhost nova_compute[237281]: 2025-12-06 09:53:44.486 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:53:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:53:45 localhost podman[238500]: 2025-12-06 09:53:45.534389176 +0000 UTC m=+0.066886898 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:53:45 localhost podman[238500]: 2025-12-06 09:53:45.565976062 +0000 UTC m=+0.098473834 container exec_died 
4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:53:45 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 04:53:45 localhost podman[238501]: 2025-12-06 09:53:45.655492671 +0000 UTC m=+0.185354162 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 04:53:45 localhost podman[238501]: 2025-12-06 09:53:45.69729955 +0000 UTC m=+0.227161061 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:53:45 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:53:45 localhost snmpd[56894]: empty variable list in _query Dec 6 04:53:45 localhost snmpd[56894]: empty variable list in _query Dec 6 04:53:46 localhost nova_compute[237281]: 2025-12-06 09:53:46.179 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:46 localhost openstack_network_exporter[199751]: ERROR 09:53:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:53:46 localhost openstack_network_exporter[199751]: ERROR 09:53:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:53:46 localhost openstack_network_exporter[199751]: ERROR 09:53:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:53:46 localhost openstack_network_exporter[199751]: ERROR 09:53:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:53:46 localhost openstack_network_exporter[199751]: Dec 6 04:53:46 localhost openstack_network_exporter[199751]: ERROR 09:53:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:53:46 localhost openstack_network_exporter[199751]: Dec 6 04:53:49 localhost nova_compute[237281]: 2025-12-06 09:53:49.488 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:51 localhost nova_compute[237281]: 2025-12-06 09:53:51.226 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:53:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 04:53:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:53:51 localhost podman[238545]: 2025-12-06 09:53:51.551075557 +0000 UTC m=+0.081911108 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125) Dec 6 04:53:51 localhost podman[238544]: 2025-12-06 09:53:51.585914733 +0000 UTC m=+0.123994155 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 04:53:51 localhost podman[238545]: 2025-12-06 09:53:51.615889231 +0000 UTC m=+0.146724792 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 04:53:51 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:53:51 localhost podman[238544]: 2025-12-06 09:53:51.669439609 +0000 UTC m=+0.207519081 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_metadata_agent) Dec 6 04:53:51 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 04:53:53 localhost podman[197801]: time="2025-12-06T09:53:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 6 04:53:53 localhost podman[197801]: @ - - [06/Dec/2025:09:53:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1"
Dec 6 04:53:53 localhost podman[197801]: @ - - [06/Dec/2025:09:53:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15445 "" "Go-http-client/1.1"
Dec 6 04:53:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56228 DF PROTO=TCP SPT=58456 DPT=9102 SEQ=3291388616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC34B590000000001030307)
Dec 6 04:53:54 localhost nova_compute[237281]: 2025-12-06 09:53:54.492 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:53:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56229 DF PROTO=TCP SPT=58456 DPT=9102 SEQ=3291388616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC34F470000000001030307)
Dec 6 04:53:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65483 DF PROTO=TCP SPT=43366 DPT=9102 SEQ=2238741291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC351870000000001030307)
Dec 6 04:53:56 localhost nova_compute[237281]: 2025-12-06 09:53:56.262 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:53:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56230 DF PROTO=TCP SPT=58456 DPT=9102 SEQ=3291388616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC357470000000001030307)
Dec 6 04:53:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8235 DF PROTO=TCP SPT=58072 DPT=9102 SEQ=2373452244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC35B870000000001030307)
Dec 6 04:53:59 localhost nova_compute[237281]: 2025-12-06 09:53:59.495 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:54:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56231 DF PROTO=TCP SPT=58456 DPT=9102 SEQ=3291388616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC367070000000001030307)
Dec 6 04:54:01 localhost nova_compute[237281]: 2025-12-06 09:54:01.287 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:54:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.
Dec 6 04:54:01 localhost podman[238580]: 2025-12-06 09:54:01.549820278 +0000 UTC m=+0.079807208 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, release=1755695350, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Dec 6 04:54:01 localhost podman[238580]: 2025-12-06 09:54:01.565319757 +0000 UTC m=+0.095306657 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 6 04:54:01 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully.
Dec 6 04:54:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.
Dec 6 04:54:03 localhost podman[238612]: 2025-12-06 09:54:03.242879572 +0000 UTC m=+0.083884681 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 6 04:54:03 localhost podman[238612]: 2025-12-06 09:54:03.249008817 +0000 UTC m=+0.090013906 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 6 04:54:03 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully.
Dec 6 04:54:04 localhost nova_compute[237281]: 2025-12-06 09:54:04.498 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:54:06 localhost nova_compute[237281]: 2025-12-06 09:54:06.288 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:54:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:54:06.675 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 6 04:54:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:54:06.675 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 6 04:54:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:54:06.676 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 6 04:54:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56232 DF PROTO=TCP SPT=58456 DPT=9102 SEQ=3291388616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC387870000000001030307)
Dec 6 04:54:09 localhost nova_compute[237281]: 2025-12-06 09:54:09.500 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:54:11 localhost nova_compute[237281]: 2025-12-06 09:54:11.325 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:54:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.
Dec 6 04:54:13 localhost podman[238635]: 2025-12-06 09:54:13.533435608 +0000 UTC m=+0.073859416 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 6 04:54:13 localhost podman[238635]: 2025-12-06 09:54:13.573256975 +0000 UTC m=+0.113680813 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 6 04:54:13 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully.
Dec 6 04:54:14 localhost nova_compute[237281]: 2025-12-06 09:54:14.503 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:54:16 localhost openstack_network_exporter[199751]: ERROR 09:54:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 6 04:54:16 localhost openstack_network_exporter[199751]: ERROR 09:54:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 6 04:54:16 localhost openstack_network_exporter[199751]: ERROR 09:54:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 6 04:54:16 localhost openstack_network_exporter[199751]: ERROR 09:54:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 6 04:54:16 localhost openstack_network_exporter[199751]:
Dec 6 04:54:16 localhost openstack_network_exporter[199751]: ERROR 09:54:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 6 04:54:16 localhost openstack_network_exporter[199751]:
Dec 6 04:54:16 localhost nova_compute[237281]: 2025-12-06 09:54:16.349 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:54:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.
Dec 6 04:54:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.
Dec 6 04:54:16 localhost podman[238660]: 2025-12-06 09:54:16.554025182 +0000 UTC m=+0.079023514 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Dec 6 04:54:16 localhost podman[238660]: 2025-12-06 09:54:16.563965092 +0000 UTC m=+0.088963414 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 6 04:54:16 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully.
Dec 6 04:54:16 localhost podman[238661]: 2025-12-06 09:54:16.614240244 +0000 UTC m=+0.136329298 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 6 04:54:16 localhost podman[238661]: 2025-12-06 09:54:16.625018731 +0000 UTC m=+0.147107785 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 04:54:16 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully.
Dec 6 04:54:19 localhost nova_compute[237281]: 2025-12-06 09:54:19.507 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:54:20 localhost sshd[238702]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 04:54:21 localhost systemd-logind[760]: New session 47 of user zuul.
Dec 6 04:54:21 localhost systemd[1]: Started Session 47 of User zuul.
Dec 6 04:54:21 localhost python3[238724]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 6 04:54:21 localhost nova_compute[237281]: 2025-12-06 09:54:21.379 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:54:21 localhost subscription-manager[238725]: Unregistered machine with identity: 13ad661d-6c74-404f-ae81-2b24cdeb8ca4
Dec 6 04:54:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.
Dec 6 04:54:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.
Dec 6 04:54:22 localhost podman[238728]: 2025-12-06 09:54:22.552626678 +0000 UTC m=+0.083402466 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 6 04:54:22 localhost podman[238728]: 2025-12-06 09:54:22.592193196 +0000 UTC m=+0.122969004 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 6 04:54:22 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully.
Dec 6 04:54:22 localhost podman[238727]: 2025-12-06 09:54:22.609960123 +0000 UTC m=+0.141900156 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 6 04:54:22 localhost podman[238727]: 2025-12-06 09:54:22.644227892 +0000 UTC m=+0.176167885 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 04:54:22 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully.
Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.986 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.992 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '58a9f425-8aaf-4b93-8cd1-aed9e52a964e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:54:22.988015', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '8ad32b8c-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.19812618, 'message_signature': '468dcdc4bddc0871dbb7c5522059f45d07486186c9787c2efa2c29f491d3843a'}]}, 'timestamp': '2025-12-06 09:54:22.993598', '_unique_id': 'd7d1520ae5bc43ddb7bcdc0692601897'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.995 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:22.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.997 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e79a6e38-9445-42a1-9d23-26a35243f3c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:54:22.997048', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '8ad3c966-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.19812618, 'message_signature': '9d37fdff5e23ccb6ad7f2c000c59368b5c66ffb1cd611c38a578fc1ddf1b0f3a'}]}, 'timestamp': '2025-12-06 09:54:22.997580', '_unique_id': 'b4542fa0fee04433a0b96b82affd92dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.998 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.999 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:22.999 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '48583415-8ccd-4f31-b3de-e9d3ef01fdec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:54:22.999797', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '8ad4359a-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.19812618, 'message_signature': 'ab3fe01a4a468914a450e950d69a3a7d5ff9382fd03568c7af0ec30c26be7f5a'}]}, 'timestamp': '2025-12-06 09:54:23.000347', '_unique_id': '5333db3f683f4b58b112684a32e252f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.001 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.002 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.020 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.021 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02e7b79b-7abe-4463-a0d2-3a116037e9cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:54:23.002535', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ad776ba-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.212647709, 'message_signature': 'a451d77b91d3ae0291720f7bc67cfbc9c501b9776e5535c573557e8bcf3670a8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:54:23.002535', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ad78a74-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.212647709, 'message_signature': '351fbb946c24316d942688ed9d14818e7765ec8716adb00f1569f016f76c6645'}]}, 'timestamp': '2025-12-06 09:54:23.022152', '_unique_id': 'a27e8fbda7774405a33b073cd1d564a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.023 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.024 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.045 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 10550000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40431671-7504-42b1-8a79-54fe368203cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10550000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:54:23.024633', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8adb332c-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.255569639, 'message_signature': 
'6fdb01f41dab426cf7b8d66518e6dc9a16c7fc49faaa9b374235d94760843a53'}]}, 'timestamp': '2025-12-06 09:54:23.046207', '_unique_id': '54c5dd237d5d458eba47e3518b4cf763'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR 
oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.047 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.048 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.091 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.091 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b7788aca-1931-49ab-8240-b21de5a3dc15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:54:23.048692', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ae2220e-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.258818797, 'message_signature': '6700c3da8be6bbf89dbb2bf3b01eedb57889868d4c933de3ae022be2eedf1dbb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:54:23.048692', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ae2341a-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.258818797, 'message_signature': 'e4add073d3ccb7aff75c3c2f7d03d094e1139f73d27b6861fee263d95962891f'}]}, 'timestamp': '2025-12-06 09:54:23.092106', '_unique_id': 'd4a5dd0b4e6d4193a6cfec87f7739a0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.093 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.094 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.095 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5045b5a9-279d-4a25-abe8-df0254d8ad71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:54:23.095098', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '8ae2be1c-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.19812618, 'message_signature': 'bc1c20642216f320e161bb8a5e0ccf05f2439c9dd74567c9b8dcb52716f6f8dd'}]}, 'timestamp': '2025-12-06 09:54:23.095590', '_unique_id': '9b92c3baa51642a69537fb875441439e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.096 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.097 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6809df9-e77e-455f-a6af-f0ef5a1abe6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:54:23.097819', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '8ae32938-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.19812618, 'message_signature': '5d00c0410f041b78b86ccdbaf5c2f3f70104aca48e42493fb0057e0a96c12f01'}]}, 'timestamp': '2025-12-06 09:54:23.098328', '_unique_id': 'a8a6c437383c4e51b31507d67120f9e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.099 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '38291302-527a-4f56-bffd-622f898dc93b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:54:23.100527', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '8ae391ac-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.19812618, 'message_signature': 'd8b51e62b6f93a4703e30ecb5e3410333fbc221300e5729ee62ccb27269fe2a2'}]}, 'timestamp': '2025-12-06 09:54:23.101038', '_unique_id': '47292b9c94a844318c4d54ab9dbb9c48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.101 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.103 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcb3f16c-9637-4bf3-b11a-625f0174d498', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:54:23.103198', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': 
None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '8ae3f9f8-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.19812618, 'message_signature': '0316d0cd3f4b78872f67e82649e0accff8b10fa3045ae8afd6555ef19b037682'}]}, 'timestamp': '2025-12-06 09:54:23.103670', '_unique_id': '9108834371734bc4865797545973cbbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.106 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.106 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e305b056-d89d-4147-a105-6960028b55a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:54:23.105968', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ae465e6-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.258818797, 'message_signature': '61efd5ada44d21c5858ba695cf53b75c09bc787b3072713a137e6829aa614ad0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:54:23.105968', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ae47658-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.258818797, 'message_signature': '4c381f69d6726160bb2284229e141ddb7f06dc1e69fdb660451b51a2c95b2356'}]}, 'timestamp': '2025-12-06 09:54:23.106828', '_unique_id': '5aa3d753ef4543a29a3ea0450eef5d0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'afc85a69-3ea4-425e-adc7-10a8364a1bf5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:54:23.109151', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '8ae4e296-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.19812618, 'message_signature': 'a0cb2e1f311e2f6dfdaaa60e36ee2b747707cdbb2d96ee5e4ff1b5a312d58274'}]}, 'timestamp': '2025-12-06 09:54:23.109627', '_unique_id': '0de554f6cd1a4305be33f1dec88dfb46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.111 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.112 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '144f8835-83ec-4718-a020-df3c938b4b1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:54:23.111938', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ae54f38-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.212647709, 'message_signature': '924e2edfbc726c272ec630f654e309f9a813a3073b8127eec7bd8174b995535c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:54:23.111938', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 
'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ae567d4-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.212647709, 'message_signature': 'b62a974042a61a9ae9f6286c2e3a300363eb787be1c09dbe44180f2e87a8c271'}]}, 'timestamp': '2025-12-06 09:54:23.113038', '_unique_id': '9a6866bea5144985bd09e4d4db3f6f31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.113 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.115 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.115 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7024f927-b47f-49a2-9d08-57b6cdb29ee8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:54:23.115380', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ae5d5a2-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.258818797, 'message_signature': '79cfe14a377ca70d459c708b9876aa9b82a2aa2348b8a4b93f5a2cdd0db3f874'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:54:23.115380', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ae5e72c-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.258818797, 'message_signature': '89cce288af6f1ce96d64e5d515f0fb07c1054929a3efb45355eb8f094a3c9e84'}]}, 'timestamp': '2025-12-06 09:54:23.116288', '_unique_id': '38a19d5e705c42eb90a21ef97334bd5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.117 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.118 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.118 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41458cc0-81b9-465c-ba99-e181831c72db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:54:23.118462', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ae64dc0-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.258818797, 'message_signature': '325a79aa233b00d675c5547f427b102bbe42ba746a2b0f1e3ca2505d53a4c727'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:54:23.118462', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ae65f68-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.258818797, 'message_signature': '6d02376856b3209b20034cbab83817681373e4932e26d7fb60545a6c9be4fc82'}]}, 'timestamp': '2025-12-06 09:54:23.119345', '_unique_id': 'c106dd71ce2240d495993b7ba8caa5f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.120 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.121 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6
04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.122 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f854353-cc73-492b-842b-066924ce0dc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:54:23.121535', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ae6c5b6-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.258818797, 'message_signature': '6b8c3a4c8def31388f09767a0086019f1fd9f40b3124a21e27db7c79f3fc18f7'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:54:23.121535', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ae6d75e-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.258818797, 'message_signature': 'd59bfa0bdf11cebe2443fb2c39b67e8d73914c76492e0b1b3b54bd38bc2809e2'}]}, 'timestamp': '2025-12-06 09:54:23.122414', '_unique_id': '68676d517b6d43438677bf03b51b8760'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 
423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.123 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.124 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71afcb92-fa84-4cf8-bd5f-154902e6e786', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:54:23.124588', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '8ae73d48-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.19812618, 'message_signature': 'b67698248ade1fc997c5585dad70032c139fd2a2faee21e5e47e45f25055f25d'}]}, 'timestamp': '2025-12-06 09:54:23.125084', '_unique_id': '9347b04934004c8985dc835cd7a4bb70'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:54:23.126 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging The above 
exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] 
Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.126 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.127 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.127 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cb43370-ce22-435e-ac29-e01ca3737e58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:54:23.127211', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8ae7a382-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.255569639, 'message_signature': '63d2f18d04391fc888b3503d1cbae7b9cd3145e6fe10cdac46c56a924777c251'}]}, 'timestamp': '2025-12-06 09:54:23.127651', '_unique_id': '5339459b1ca642ffba1aeb434bcbb004'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.128 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.129 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.129 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.130 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e1a9cafc-7173-49b2-b48a-9dd27de20111', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:54:23.129781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ae8091c-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.258818797, 'message_signature': '1dc2a7d76e123da94d58db2857e0ef715af31087db07d211592b9ee889cd5db5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:54:23.129781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ae81966-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.258818797, 'message_signature': '64c1af7d8770491dc916ff1934d3d2ce6de37791de33fa64263adf672fbfad11'}]}, 'timestamp': '2025-12-06 09:54:23.130657', '_unique_id': '93aa6454e5b644099a73ca6489210143'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.132 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.132 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.133 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '36e2c357-1d09-4e11-8cc8-84ee69648644', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:54:23.132889', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ae8819e-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.212647709, 'message_signature': '66ff6bd9410d59d4fb47fcf3b386eac473a9f7a13f299e7e0eab86d649a21304'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:54:23.132889', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ae891ac-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.212647709, 'message_signature': '79698b5eb72400263064db03619c4c40d6f5af05eea4bc0305d076b2be621c81'}]}, 'timestamp': '2025-12-06 09:54:23.133737', '_unique_id': 'e1dabcbad9ce4273acf1931c5f28dab8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.134 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.135 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.135 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1aa43b0a-1a08-4e2f-96dd-43ebbf7e7711', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:54:23.135933', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '8ae8f868-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11250.19812618, 'message_signature': 'e16ae670d7f1b81cb44e1ae88185ec494996f8fa44f8ba71e68e301306a6328a'}]}, 'timestamp': '2025-12-06 09:54:23.136437', '_unique_id': 'e9dfd706fd314a56b9c7a0dc65355837'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:54:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.137 12 ERROR oslo_messaging.notify.messaging Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:54:23.138 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:54:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:54:23.138 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:54:23 localhost podman[197801]: time="2025-12-06T09:54:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:54:23 localhost podman[197801]: @ - - [06/Dec/2025:09:54:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:54:23 localhost podman[197801]: @ - - [06/Dec/2025:09:54:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15447 "" "Go-http-client/1.1" Dec 6 04:54:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46921 DF PROTO=TCP SPT=37570 DPT=9102 SEQ=2783222079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC3C08A0000000001030307) Dec 6 04:54:24 localhost nova_compute[237281]: 2025-12-06 09:54:24.511 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46922 DF PROTO=TCP SPT=37570 DPT=9102 SEQ=2783222079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC3C4880000000001030307) Dec 6 04:54:25 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56233 DF PROTO=TCP SPT=58456 DPT=9102 SEQ=3291388616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC3C7880000000001030307) Dec 6 04:54:26 localhost nova_compute[237281]: 2025-12-06 09:54:26.437 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46923 DF PROTO=TCP SPT=37570 DPT=9102 SEQ=2783222079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC3CC870000000001030307) Dec 6 04:54:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65484 DF PROTO=TCP SPT=43366 DPT=9102 SEQ=2238741291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC3CF870000000001030307) Dec 6 04:54:29 localhost nova_compute[237281]: 2025-12-06 09:54:29.514 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46924 DF PROTO=TCP SPT=37570 DPT=9102 SEQ=2783222079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC3DC470000000001030307) Dec 6 04:54:31 localhost nova_compute[237281]: 2025-12-06 09:54:31.473 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:32 localhost systemd[1]: Started /usr/bin/podman healthcheck 
run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:54:32 localhost podman[238763]: 2025-12-06 09:54:32.94270863 +0000 UTC m=+0.079900761 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Dec 6 04:54:32 localhost podman[238763]: 2025-12-06 09:54:32.95530272 +0000 UTC m=+0.092494851 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 6 04:54:32 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:54:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:54:33 localhost nova_compute[237281]: 2025-12-06 09:54:33.531 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:54:33 localhost nova_compute[237281]: 2025-12-06 09:54:33.532 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:54:33 localhost nova_compute[237281]: 2025-12-06 09:54:33.532 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:54:33 localhost nova_compute[237281]: 2025-12-06 09:54:33.532 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:54:33 localhost podman[238784]: 2025-12-06 09:54:33.532912747 +0000 UTC m=+0.070135695 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:54:33 localhost podman[238784]: 2025-12-06 09:54:33.568292768 +0000 UTC m=+0.105515706 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 04:54:33 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:54:34 localhost nova_compute[237281]: 2025-12-06 09:54:34.517 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:34 localhost nova_compute[237281]: 2025-12-06 09:54:34.629 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:54:34 localhost nova_compute[237281]: 2025-12-06 09:54:34.630 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:54:34 localhost nova_compute[237281]: 2025-12-06 09:54:34.630 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:54:34 localhost nova_compute[237281]: 2025-12-06 09:54:34.630 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:54:36 localhost nova_compute[237281]: 2025-12-06 09:54:36.476 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:36 localhost nova_compute[237281]: 2025-12-06 09:54:36.988 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with 
network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.027 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.027 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.028 237285 DEBUG oslo_service.periodic_task 
[None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.028 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.029 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.029 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.030 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.030 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.030 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - 
- - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.031 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.048 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.048 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.049 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.049 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:54:37 
localhost nova_compute[237281]: 2025-12-06 09:54:37.110 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.194 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.195 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.252 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 
09:54:37.253 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.312 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.313 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.365 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.586 237285 WARNING 
nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.588 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12765MB free_disk=387.31443786621094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", 
"numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.589 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.589 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.674 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.675 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.675 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.722 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.738 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:54:37 localhost 
nova_compute[237281]: 2025-12-06 09:54:37.741 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:54:37 localhost nova_compute[237281]: 2025-12-06 09:54:37.741 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:54:38 localhost sshd[238819]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:54:39 localhost nova_compute[237281]: 2025-12-06 09:54:39.521 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46925 DF PROTO=TCP SPT=37570 DPT=9102 SEQ=2783222079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC3FD870000000001030307) Dec 6 04:54:41 localhost nova_compute[237281]: 2025-12-06 09:54:41.520 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:54:44 localhost nova_compute[237281]: 2025-12-06 09:54:44.525 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:44 localhost podman[238821]: 2025-12-06 09:54:44.546670049 +0000 UTC m=+0.077435856 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 6 04:54:44 localhost podman[238821]: 2025-12-06 
09:54:44.579104731 +0000 UTC m=+0.109870468 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:54:44 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:54:46 localhost openstack_network_exporter[199751]: ERROR 09:54:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:54:46 localhost openstack_network_exporter[199751]: ERROR 09:54:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:54:46 localhost openstack_network_exporter[199751]: ERROR 09:54:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:54:46 localhost openstack_network_exporter[199751]: ERROR 09:54:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:54:46 localhost openstack_network_exporter[199751]: Dec 6 04:54:46 localhost openstack_network_exporter[199751]: ERROR 09:54:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:54:46 localhost openstack_network_exporter[199751]: Dec 6 04:54:46 localhost nova_compute[237281]: 2025-12-06 09:54:46.523 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:54:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:54:47 localhost podman[238846]: 2025-12-06 09:54:47.552198196 +0000 UTC m=+0.082615863 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 04:54:47 localhost podman[238846]: 2025-12-06 09:54:47.562129196 +0000 UTC m=+0.092546853 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 04:54:47 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:54:47 localhost podman[238847]: 2025-12-06 09:54:47.607466159 +0000 UTC m=+0.135717060 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 04:54:47 localhost podman[238847]: 2025-12-06 09:54:47.617367288 +0000 UTC m=+0.145618189 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd) Dec 6 04:54:47 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:54:49 localhost nova_compute[237281]: 2025-12-06 09:54:49.529 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:51 localhost nova_compute[237281]: 2025-12-06 09:54:51.526 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:53 localhost podman[197801]: time="2025-12-06T09:54:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:54:53 localhost podman[197801]: @ - - [06/Dec/2025:09:54:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:54:53 localhost podman[197801]: @ - - [06/Dec/2025:09:54:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15452 "" "Go-http-client/1.1" Dec 6 04:54:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:54:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:54:53 localhost podman[238890]: 2025-12-06 09:54:53.536065526 +0000 UTC m=+0.074844118 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 6 04:54:53 localhost podman[238891]: 2025-12-06 09:54:53.604115955 +0000 UTC m=+0.136194373 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 04:54:53 localhost podman[238891]: 2025-12-06 09:54:53.619204493 +0000 UTC m=+0.151282911 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 04:54:53 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:54:53 localhost podman[238890]: 2025-12-06 09:54:53.671827215 +0000 UTC m=+0.210605877 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 04:54:53 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:54:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58885 DF PROTO=TCP SPT=57384 DPT=9102 SEQ=828243296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC435BA0000000001030307) Dec 6 04:54:54 localhost nova_compute[237281]: 2025-12-06 09:54:54.532 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58886 DF PROTO=TCP SPT=57384 DPT=9102 SEQ=828243296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC439C70000000001030307) Dec 6 04:54:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46926 DF PROTO=TCP SPT=37570 DPT=9102 SEQ=2783222079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1DC43D870000000001030307) Dec 6 04:54:56 localhost nova_compute[237281]: 2025-12-06 09:54:56.528 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:54:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58887 DF PROTO=TCP SPT=57384 DPT=9102 SEQ=828243296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC441C80000000001030307) Dec 6 04:54:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56234 DF PROTO=TCP SPT=58456 DPT=9102 SEQ=3291388616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC4458C0000000001030307) Dec 6 04:54:59 localhost nova_compute[237281]: 2025-12-06 09:54:59.535 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58888 DF PROTO=TCP SPT=57384 DPT=9102 SEQ=828243296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC451880000000001030307) Dec 6 04:55:01 localhost nova_compute[237281]: 2025-12-06 09:55:01.529 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:55:03 localhost podman[238927]: 2025-12-06 09:55:03.272761817 +0000 UTC m=+0.084064316 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, version=9.6, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 6 04:55:03 localhost podman[238927]: 2025-12-06 09:55:03.313459649 +0000 UTC m=+0.124762108 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 6 04:55:03 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:55:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:55:04 localhost nova_compute[237281]: 2025-12-06 09:55:04.538 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:04 localhost systemd[1]: tmp-crun.rvnlPl.mount: Deactivated successfully. Dec 6 04:55:04 localhost podman[238948]: 2025-12-06 09:55:04.548449986 +0000 UTC m=+0.080735776 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', 
'--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 04:55:04 localhost podman[238948]: 2025-12-06 09:55:04.585224969 +0000 UTC m=+0.117510739 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 04:55:04 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:55:06 localhost nova_compute[237281]: 2025-12-06 09:55:06.531 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:55:06.675 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:55:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:55:06.676 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:55:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:55:06.677 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:55:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58889 DF PROTO=TCP SPT=57384 DPT=9102 SEQ=828243296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 
OPT (020405500402080A1DC471880000000001030307) Dec 6 04:55:09 localhost nova_compute[237281]: 2025-12-06 09:55:09.541 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:11 localhost nova_compute[237281]: 2025-12-06 09:55:11.534 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:14 localhost nova_compute[237281]: 2025-12-06 09:55:14.544 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:55:15 localhost podman[238972]: 2025-12-06 09:55:15.553179473 +0000 UTC m=+0.085237491 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3) Dec 6 04:55:15 localhost podman[238972]: 2025-12-06 09:55:15.632347599 +0000 UTC m=+0.164405577 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller) Dec 6 04:55:15 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:55:16 localhost openstack_network_exporter[199751]: ERROR 09:55:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:55:16 localhost openstack_network_exporter[199751]: ERROR 09:55:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:55:16 localhost openstack_network_exporter[199751]: ERROR 09:55:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:55:16 localhost openstack_network_exporter[199751]: ERROR 09:55:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:55:16 localhost openstack_network_exporter[199751]: Dec 6 04:55:16 localhost openstack_network_exporter[199751]: ERROR 09:55:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:55:16 localhost openstack_network_exporter[199751]: Dec 6 04:55:16 localhost ovn_controller[131684]: 2025-12-06T09:55:16Z|00069|ovsdb_idl|WARN|transaction error: {"details":"Transaction causes multiple rows in \"MAC_Binding\" table to have identical values (lrp-3bd3d50c-53bb-4be2-9aa7-2224a869356a and \"192.168.122.80\") for index on columns \"logical_port\" and \"ip\". First row, with UUID fff716b6-ff16-4a86-90ee-a79b35eb5436, was inserted by this transaction. 
Second row, with UUID a4e5bf1c-42ac-4d19-9eae-92375f50a21a, existed in the database before this transaction and was not modified by the transaction.","error":"constraint violation"} Dec 6 04:55:16 localhost ovn_controller[131684]: 2025-12-06T09:55:16Z|00070|main|INFO|OVNSB commit failed, force recompute next time. Dec 6 04:55:16 localhost ovn_controller[131684]: 2025-12-06T09:55:16Z|00071|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 04:55:16 localhost nova_compute[237281]: 2025-12-06 09:55:16.537 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:55:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:55:18 localhost podman[238998]: 2025-12-06 09:55:18.545548741 +0000 UTC m=+0.080923331 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 
'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 04:55:18 localhost podman[238998]: 2025-12-06 09:55:18.556288576 +0000 UTC m=+0.091663176 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:55:18 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 04:55:18 localhost podman[238999]: 2025-12-06 09:55:18.648718204 +0000 UTC m=+0.179897926 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0) Dec 6 04:55:18 localhost podman[238999]: 2025-12-06 09:55:18.664207914 +0000 UTC m=+0.195387666 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, tcib_managed=true) Dec 6 04:55:18 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:55:19 localhost nova_compute[237281]: 2025-12-06 09:55:19.585 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:21 localhost nova_compute[237281]: 2025-12-06 09:55:21.540 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:21 localhost systemd[1]: session-47.scope: Deactivated successfully. Dec 6 04:55:21 localhost systemd-logind[760]: Session 47 logged out. Waiting for processes to exit. Dec 6 04:55:21 localhost systemd-logind[760]: Removed session 47. Dec 6 04:55:23 localhost podman[197801]: time="2025-12-06T09:55:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:55:23 localhost podman[197801]: @ - - [06/Dec/2025:09:55:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:55:23 localhost podman[197801]: @ - - [06/Dec/2025:09:55:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15449 "" "Go-http-client/1.1" Dec 6 04:55:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15116 DF PROTO=TCP SPT=35476 DPT=9102 SEQ=616039884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC4AAE90000000001030307) Dec 6 04:55:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:55:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:55:24 localhost podman[239040]: 2025-12-06 09:55:24.555987597 +0000 UTC m=+0.087697446 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:55:24 localhost podman[239040]: 2025-12-06 09:55:24.56305848 +0000 UTC m=+0.094768349 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 04:55:24 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:55:24 localhost nova_compute[237281]: 2025-12-06 09:55:24.612 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:24 localhost systemd[1]: tmp-crun.eyR3JV.mount: Deactivated successfully. Dec 6 04:55:24 localhost podman[239041]: 2025-12-06 09:55:24.631899875 +0000 UTC m=+0.159543031 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:55:24 localhost podman[239041]: 2025-12-06 09:55:24.646310301 +0000 UTC m=+0.173953467 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:55:24 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 04:55:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15117 DF PROTO=TCP SPT=35476 DPT=9102 SEQ=616039884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC4AF070000000001030307) Dec 6 04:55:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58890 DF PROTO=TCP SPT=57384 DPT=9102 SEQ=828243296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC4B1870000000001030307) Dec 6 04:55:26 localhost nova_compute[237281]: 2025-12-06 09:55:26.543 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15118 DF PROTO=TCP SPT=35476 DPT=9102 SEQ=616039884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC4B7080000000001030307) Dec 6 04:55:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46927 DF PROTO=TCP SPT=37570 DPT=9102 SEQ=2783222079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC4BB870000000001030307) Dec 6 04:55:29 localhost nova_compute[237281]: 2025-12-06 09:55:29.649 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15119 DF PROTO=TCP SPT=35476 DPT=9102 
SEQ=616039884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC4C6C70000000001030307) Dec 6 04:55:31 localhost nova_compute[237281]: 2025-12-06 09:55:31.546 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:32 localhost nova_compute[237281]: 2025-12-06 09:55:32.091 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:55:32 localhost nova_compute[237281]: 2025-12-06 09:55:32.091 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:55:32 localhost nova_compute[237281]: 2025-12-06 09:55:32.109 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:55:32 localhost nova_compute[237281]: 2025-12-06 09:55:32.109 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:55:32 localhost nova_compute[237281]: 2025-12-06 09:55:32.110 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:55:32 localhost nova_compute[237281]: 2025-12-06 09:55:32.709 237285 DEBUG 
oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:55:32 localhost nova_compute[237281]: 2025-12-06 09:55:32.710 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:55:32 localhost nova_compute[237281]: 2025-12-06 09:55:32.710 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:55:32 localhost nova_compute[237281]: 2025-12-06 09:55:32.711 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:55:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:55:34 localhost podman[239077]: 2025-12-06 09:55:34.396732608 +0000 UTC m=+0.067253647 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, managed_by=edpm_ansible, version=9.6, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9) Dec 6 04:55:34 localhost podman[239077]: 2025-12-06 09:55:34.413246028 +0000 UTC m=+0.083767077 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, vcs-type=git) Dec 6 04:55:34 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:55:34 localhost nova_compute[237281]: 2025-12-06 09:55:34.690 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:34 localhost nova_compute[237281]: 2025-12-06 09:55:34.940 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] 
update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:55:34 localhost nova_compute[237281]: 2025-12-06 09:55:34.976 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:55:34 localhost nova_compute[237281]: 2025-12-06 09:55:34.976 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:55:34 localhost nova_compute[237281]: 2025-12-06 09:55:34.977 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:55:34 localhost nova_compute[237281]: 2025-12-06 09:55:34.977 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:55:34 localhost nova_compute[237281]: 2025-12-06 09:55:34.978 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:55:34 localhost nova_compute[237281]: 2025-12-06 09:55:34.978 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:55:34 localhost nova_compute[237281]: 2025-12-06 09:55:34.978 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:55:34 localhost nova_compute[237281]: 2025-12-06 09:55:34.979 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:55:34 localhost nova_compute[237281]: 2025-12-06 09:55:34.979 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:55:34 localhost nova_compute[237281]: 2025-12-06 09:55:34.980 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.003 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.004 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.004 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.005 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.070 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.146 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.147 237285 DEBUG 
oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.218 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.219 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.271 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.272 237285 DEBUG oslo_concurrency.processutils [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.321 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:55:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.518 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.520 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12761MB free_disk=387.31443786621094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.520 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.521 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:55:35 localhost podman[239109]: 2025-12-06 09:55:35.553123734 +0000 UTC m=+0.084099216 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', 
'--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 04:55:35 localhost podman[239109]: 2025-12-06 09:55:35.563260962 +0000 UTC m=+0.094236534 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', 
'--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:55:35 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.606 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.606 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.606 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.652 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.668 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:55:35 localhost 
nova_compute[237281]: 2025-12-06 09:55:35.670 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:55:35 localhost nova_compute[237281]: 2025-12-06 09:55:35.671 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:55:36 localhost nova_compute[237281]: 2025-12-06 09:55:36.548 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15120 DF PROTO=TCP SPT=35476 DPT=9102 SEQ=616039884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC4E7870000000001030307) Dec 6 04:55:39 localhost nova_compute[237281]: 2025-12-06 09:55:39.727 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:41 localhost nova_compute[237281]: 2025-12-06 09:55:41.553 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:44 localhost nova_compute[237281]: 2025-12-06 09:55:44.772 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:46 localhost openstack_network_exporter[199751]: ERROR 09:55:46 appctl.go:131: Failed to prepare 
call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:55:46 localhost openstack_network_exporter[199751]: ERROR 09:55:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:55:46 localhost openstack_network_exporter[199751]: ERROR 09:55:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:55:46 localhost openstack_network_exporter[199751]: ERROR 09:55:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:55:46 localhost openstack_network_exporter[199751]: Dec 6 04:55:46 localhost openstack_network_exporter[199751]: ERROR 09:55:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:55:46 localhost openstack_network_exporter[199751]: Dec 6 04:55:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:55:46 localhost podman[239132]: 2025-12-06 09:55:46.550358137 +0000 UTC m=+0.083341525 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:55:46 localhost nova_compute[237281]: 2025-12-06 09:55:46.554 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:46 localhost podman[239132]: 2025-12-06 09:55:46.614338884 +0000 UTC m=+0.147322222 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 04:55:46 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:55:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:55:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:55:49 localhost podman[239159]: 2025-12-06 09:55:49.55343387 +0000 UTC m=+0.081226590 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:55:49 localhost podman[239159]: 2025-12-06 09:55:49.590195802 +0000 UTC m=+0.117988502 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 04:55:49 localhost podman[239160]: 2025-12-06 09:55:49.612421245 +0000 UTC m=+0.136595716 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 04:55:49 localhost podman[239160]: 2025-12-06 09:55:49.628205833 +0000 UTC m=+0.152380254 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible) Dec 6 04:55:49 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:55:49 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:55:49 localhost nova_compute[237281]: 2025-12-06 09:55:49.817 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:51 localhost nova_compute[237281]: 2025-12-06 09:55:51.556 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:53 localhost podman[197801]: time="2025-12-06T09:55:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:55:53 localhost podman[197801]: @ - - [06/Dec/2025:09:55:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:55:53 localhost podman[197801]: @ - - [06/Dec/2025:09:55:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15457 "" "Go-http-client/1.1" Dec 6 04:55:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23038 DF PROTO=TCP SPT=58112 DPT=9102 SEQ=3458578661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC520190000000001030307) Dec 6 04:55:54 localhost nova_compute[237281]: 2025-12-06 
09:55:54.846 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23039 DF PROTO=TCP SPT=58112 DPT=9102 SEQ=3458578661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC524070000000001030307) Dec 6 04:55:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:55:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:55:55 localhost podman[239201]: 2025-12-06 09:55:55.548789318 +0000 UTC m=+0.081325743 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 04:55:55 localhost podman[239202]: 2025-12-06 09:55:55.618070786 +0000 UTC m=+0.145316301 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 04:55:55 localhost podman[239202]: 2025-12-06 09:55:55.631175163 +0000 UTC m=+0.158420638 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3) Dec 6 04:55:55 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 04:55:55 localhost podman[239201]: 2025-12-06 09:55:55.683479545 +0000 UTC m=+0.216016010 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 04:55:55 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:55:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15121 DF PROTO=TCP SPT=35476 DPT=9102 SEQ=616039884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC527880000000001030307) Dec 6 04:55:56 localhost nova_compute[237281]: 2025-12-06 09:55:56.559 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:55:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23040 DF PROTO=TCP SPT=58112 DPT=9102 SEQ=3458578661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC52C070000000001030307) Dec 6 04:55:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58891 DF PROTO=TCP SPT=57384 DPT=9102 SEQ=828243296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC52F870000000001030307) Dec 6 04:55:59 localhost nova_compute[237281]: 2025-12-06 09:55:59.880 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23041 DF PROTO=TCP SPT=58112 DPT=9102 
SEQ=3458578661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC53BC70000000001030307) Dec 6 04:56:01 localhost nova_compute[237281]: 2025-12-06 09:56:01.563 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:56:04 localhost podman[239238]: 2025-12-06 09:56:04.542786085 +0000 UTC m=+0.079734075 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Dec 6 04:56:04 localhost podman[239238]: 2025-12-06 09:56:04.562309957 +0000 UTC m=+0.099257967 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Dec 6 04:56:04 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:56:04 localhost nova_compute[237281]: 2025-12-06 09:56:04.882 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:56:06 localhost podman[239258]: 2025-12-06 09:56:06.547611237 +0000 UTC m=+0.076485246 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:56:06 localhost podman[239258]: 2025-12-06 09:56:06.553247018 +0000 UTC m=+0.082121027 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:56:06 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:56:06 localhost nova_compute[237281]: 2025-12-06 09:56:06.566 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:56:06.676 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:56:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:56:06.677 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:56:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:56:06.677 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:56:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23042 DF PROTO=TCP SPT=58112 DPT=9102 SEQ=3458578661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC55B870000000001030307) Dec 6 04:56:09 localhost nova_compute[237281]: 2025-12-06 09:56:09.924 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:11 localhost nova_compute[237281]: 2025-12-06 09:56:11.568 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:14 localhost 
nova_compute[237281]: 2025-12-06 09:56:14.926 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:16 localhost openstack_network_exporter[199751]: ERROR 09:56:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:56:16 localhost openstack_network_exporter[199751]: ERROR 09:56:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:56:16 localhost openstack_network_exporter[199751]: ERROR 09:56:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:56:16 localhost openstack_network_exporter[199751]: ERROR 09:56:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:56:16 localhost openstack_network_exporter[199751]: Dec 6 04:56:16 localhost openstack_network_exporter[199751]: ERROR 09:56:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:56:16 localhost openstack_network_exporter[199751]: Dec 6 04:56:16 localhost nova_compute[237281]: 2025-12-06 09:56:16.570 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:56:17 localhost podman[239281]: 2025-12-06 09:56:17.560422797 +0000 UTC m=+0.081660812 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:56:17 localhost podman[239281]: 2025-12-06 09:56:17.594308828 +0000 UTC m=+0.115546773 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.build-date=20251125) Dec 6 04:56:17 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:56:19 localhost nova_compute[237281]: 2025-12-06 09:56:19.965 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. 
Dec 6 04:56:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:56:20 localhost podman[239306]: 2025-12-06 09:56:20.548792294 +0000 UTC m=+0.079861056 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:56:20 localhost podman[239306]: 2025-12-06 09:56:20.554621169 +0000 UTC m=+0.085689911 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 
'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 04:56:20 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:56:20 localhost podman[239307]: 2025-12-06 09:56:20.59544937 +0000 UTC m=+0.122170482 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 6 04:56:20 localhost podman[239307]: 2025-12-06 09:56:20.608418961 +0000 UTC m=+0.135140113 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:56:20 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:56:21 localhost nova_compute[237281]: 2025-12-06 09:56:21.573 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:22.987 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 04:56:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:22.987 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the 
context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.030 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.031 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e325a76-72a3-4402-96c6-2055eb1c84d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:56:22.988069', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd25f831a-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.198185543, 'message_signature': '26972706157baa6b64474fd7401f4033ea277493ff9ebc123e5360457ed50de3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:56:22.988069', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd25f97ec-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.198185543, 'message_signature': '4dfcf699cae0721bdcc7156c2e6d83c560a59711bb826fe1baa78a033768b3d3'}]}, 'timestamp': '2025-12-06 09:56:23.032089', '_unique_id': '63535858273c43b580b6b9aa8b9f8bcf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR 
oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR 
oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.033 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.034 12 INFO ceilometer.polling.manager [-] Polling 
pollster network.outgoing.packets.drop in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.039 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0928f508-f825-4279-a790-5fbcc2fcece0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:56:23.035155', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 
'd260e368-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.245288652, 'message_signature': '428fec53b07acbfd76ae1374b24f61f7e122e585c123b2780e618d15f0c0a092'}]}, 'timestamp': '2025-12-06 09:56:23.040575', '_unique_id': '95704ce6f05745de97b436096583ae1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused 
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.041 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.042 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.063 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '941aba82-51a6-4bab-bc4c-3e62645a0aa0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:56:23.043149', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd264751e-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.27341184, 'message_signature': '50661f6349cdb0a43b7a7c5aab53c1323213dddf01a8a9c9c59f128164006d9d'}]}, 'timestamp': '2025-12-06 09:56:23.063957', '_unique_id': 'dffddbf1c5aa4a92a4fd6baba5270bb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors 
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging 
conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.064 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.066 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.066 12 DEBUG 
ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 11160000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11d35583-96f4-4b90-baf6-e48c8b6741c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11160000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:56:23.066294', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd264e486-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.27341184, 'message_signature': '41a0e9800bbe07d74965bddb40568cfd5b355c7f925eb12672d233463aa97915'}]}, 'timestamp': '2025-12-06 09:56:23.066775', '_unique_id': '5d0db590d1164295b01f21e7606dc6ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 
04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 
04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.067 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.068 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.085 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.086 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd9351887-0ec6-4883-8f0a-fa46ac2fbcfb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:56:23.069122', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd267f0e0-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.279232915, 'message_signature': 'a4482e89c01a36a3414225e62684371b5ce90ba7b84ed3894d558a10f0333ce5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:56:23.069122', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2680ddc-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.279232915, 'message_signature': '8afa08f7af5ee31b2e69bd0ba7160c75381cb8f5f5ee66942fcddbed72cbcbf1'}]}, 'timestamp': '2025-12-06 09:56:23.087570', '_unique_id': 'd519f53969184107abe949a845798635'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.089 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.090 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.090 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.091 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
notification to notifications. Payload={'message_id': '4127a96e-5946-457f-9af3-1b78600561b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:56:23.090785', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd268a40e-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.279232915, 'message_signature': '882f83461dd4c4662ba8845b01857a12bffe6aaa4cb2adee78512c42d4bacb44'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:56:23.090785', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd268b502-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.279232915, 'message_signature': '0ceed1918cbd6f539a0278e08648dc3548cfd35d342b513a91baa66fe14c203b'}]}, 'timestamp': '2025-12-06 09:56:23.091743', '_unique_id': '6be7c024492444c4a9c9f06b830de8c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.092 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.094 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'de40a451-26ae-4cad-92f7-5e35319da451', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:56:23.094017', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'd2691fba-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.245288652, 'message_signature': 'a2c906400f0f15f7ea130564b889c029ef653db557b7e6613632f2ccde792103'}]}, 'timestamp': '2025-12-06 09:56:23.094506', '_unique_id': 'a77d6478b4174ef6a3ae5f70170401c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:56:23.096 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.096 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.097 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66660c77-7938-4677-a7c5-6249867e36f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:56:23.096742', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2698b44-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.198185543, 'message_signature': '5139534f11594e0baaf3b73342835b5c73b8a7e970ffa84f983ada1ce053ad20'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:56:23.096742', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2699c38-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.198185543, 'message_signature': '9438580ca5914675fbbe26e3915115292565194e05e6f802dac600ce565f940a'}]}, 'timestamp': '2025-12-06 09:56:23.097663', '_unique_id': '782b562f8a44493e83ae9db4c4b015cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR 
oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.099 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0de8a3aa-834c-48bf-b856-923021cf3d61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:56:23.099971', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 
1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'd26a08da-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.245288652, 'message_signature': 'ab2c41daf0e4106345c14d3ee334076448dab0668c39a0234984fc3b1a3100c9'}]}, 'timestamp': '2025-12-06 09:56:23.100475', '_unique_id': '60dc4f3f04f4465cb38b50703db10f25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.101 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.102 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.103 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4c251c36-ed0a-46d1-b027-bb6bdbcfa285', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:56:23.102677', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd26a7400-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.279232915, 'message_signature': '031d69653a6c7d1db123395a1e2398160841714d1d8dc86a66d02ddac2544bd0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:56:23.102677', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd26a8580-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.279232915, 'message_signature': '40ca911afe32a6b36a6d0042a5bb2d32f8708233d6fe79a750fa05053274dcab'}]}, 'timestamp': '2025-12-06 09:56:23.103636', '_unique_id': '1e51fa3228274a64b408d6e0d2a82bce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.106 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1f00eb6c-bacf-44e8-a4a7-f8884e61cfa1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:56:23.106169', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'd26afa1a-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.245288652, 'message_signature': 'adeca1927b94f795143de94e8346e1d3f51298f63aa421e671624b9d0f6af267'}]}, 'timestamp': '2025-12-06 09:56:23.106652', '_unique_id': 'd74c6e887df04c75ba3cc3e52d7138c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:56:23.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '541da461-c99a-49b7-ad37-711d06bc1357', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:56:23.109045', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd26b6a68-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.198185543, 'message_signature': '590e6f8ca522b01e891e47afd51eb577e199bbc54e65f9e27d4d446f180072af'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:56:23.109045', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd26b7b8e-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.198185543, 'message_signature': '0eb128e31bbbbaefc563b86e26526c77ee9e226b8269c1f3996cf149696d1c22'}]}, 'timestamp': '2025-12-06 09:56:23.109968', '_unique_id': '688a603aec984344a8040a087a5d78c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.112 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '14ffe9da-0d0b-46e1-bc87-02292b248c3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:56:23.112248', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'd26be77c-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.245288652, 'message_signature': '6f1cd594f326b0333adffd4178508ea4a94d21aac2e336929cc007b43a941d0e'}]}, 'timestamp': '2025-12-06 09:56:23.112725', '_unique_id': '180708b05071484cb849261ef5ce0c7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.113 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.114 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c459cc42-79ce-4159-be76-4bc0fb814be6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:56:23.114943', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'd26c51b2-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.245288652, 'message_signature': '11e9ddc0bbda8feef8f0e86e47dbb040e1f728ade3d333094f8e9cf342e07972'}]}, 'timestamp': '2025-12-06 09:56:23.115479', '_unique_id': '30a43db7ef8f4254bc531ffb47847b83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.116 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e46f9ae-cd75-4287-b18e-3de94ac63f11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:56:23.117823', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'd26cc2f0-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.245288652, 'message_signature': '4a926fed5ca0930871c2d1bb77e7d2e6cb1ac194071c6437eaf25f83cd62e63c'}]}, 'timestamp': '2025-12-06 09:56:23.118342', '_unique_id': '3479c3ed21d94c6bbfc821f022daa6ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.120 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.121 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35baf724-e829-40a2-be11-46fe48f60eab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:56:23.120531', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd26d2b46-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.198185543, 'message_signature': '28aaa30be54a7810f86bb8a21ec33d6598db9965e71f344b521093de515f9c78'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:56:23.120531', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd26d3d98-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.198185543, 'message_signature': '7c9611bed55fd41dcd93556d20a926f32e7c2d12e386132aefbb561d171d74ab'}]}, 'timestamp': '2025-12-06 09:56:23.121454', '_unique_id': 'fd028f3f445341b6b1a82a9ae3d2662b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.122 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:56:23.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.123 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.124 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '616fce2c-1ed0-476a-88d9-8495d6542d8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:56:23.123688', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd26da882-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.198185543, 'message_signature': '8d6cffe10496ea64743ae59bef2f951ae41c4de8564c8cc3a2d4b8deb3596d6a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:56:23.123688', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd26db962-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.198185543, 'message_signature': '9c6477df9a0872d0c6a9cad574b096f676888ef9e4db1798a37ebc6c28cdbbb8'}]}, 'timestamp': '2025-12-06 09:56:23.124623', '_unique_id': '7ae63a8481c449b6a5349ce6aafa3449'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:56:23.126 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.126 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.127 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14cb3eac-f07f-4bb4-a442-b85c1cb36cf9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:56:23.126803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd26e2316-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.198185543, 'message_signature': '1ef9f21ba43d1053c09c8356ba7e52e98505795537b81a3554a9669f935e9e6f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:56:23.126803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd26e2f96-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.198185543, 'message_signature': 'd48abcbdae3ef74a2a17ca3198078e27fb99695ee055160d62f3928aeffa688a'}]}, 'timestamp': '2025-12-06 09:56:23.127567', '_unique_id': '32bfcd581da44fd984c6c9eec3fbae92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:56:23.128 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '188c1def-c1cd-4a9d-bd8b-b71409b2a5db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:56:23.129054', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 
1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'd26e7442-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.245288652, 'message_signature': '8de2b26a4fef38b5e9e00fc278868db189cd8319659ca2ccb6f849c97bcf1f18'}]}, 'timestamp': '2025-12-06 09:56:23.129353', '_unique_id': '4de1523cbda74d329a0462fc49af27e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.129 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.130 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '375b81d8-0b9c-48af-9263-dcb805f38fb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:56:23.130740', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'd26eb718-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.245288652, 'message_signature': '7ef1bb5b61c5dce382a96ff4530b536c52f2e5c3e346858e92dd7d56545fdbe1'}]}, 'timestamp': '2025-12-06 09:56:23.131056', '_unique_id': 'd57938dfd0cd4998977c1376b38ac732'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:56:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:56:23.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.132 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a3932eb-529c-487e-8af9-21e26a07ec83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:56:23.132475', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'd26ef9e4-d289-11f0-8fed-fa163edf398d', 'monotonic_time': 11370.245288652, 'message_signature': 'ef2947119a73817726ab127933ef5f9bf559078e79fadf54f9e1f71fae8802ca'}]}, 'timestamp': '2025-12-06 09:56:23.132781', '_unique_id': '92b4bf88d7744546a022815b39c8946a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:56:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:56:23.133 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:56:23 localhost podman[197801]: time="2025-12-06T09:56:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 6 04:56:23 localhost podman[197801]: @ - - [06/Dec/2025:09:56:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1"
Dec 6 04:56:23 localhost podman[197801]: @ - - [06/Dec/2025:09:56:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15450 "" "Go-http-client/1.1"
Dec 6 04:56:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7451 DF PROTO=TCP SPT=56280 DPT=9102 SEQ=570017731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC595490000000001030307)
Dec 6 04:56:25 localhost nova_compute[237281]: 2025-12-06 09:56:25.002 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:56:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7452 DF PROTO=TCP SPT=56280 DPT=9102 SEQ=570017731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC599470000000001030307)
Dec 6 04:56:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23043 DF PROTO=TCP SPT=58112 DPT=9102 SEQ=3458578661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC59B870000000001030307)
Dec 6 04:56:25 localhost nova_compute[237281]: 2025-12-06 09:56:25.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:56:25 localhost nova_compute[237281]: 2025-12-06 09:56:25.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec 6 04:56:25 localhost nova_compute[237281]: 2025-12-06 09:56:25.908 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec 6 04:56:25 localhost nova_compute[237281]: 2025-12-06 09:56:25.908 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:56:25 localhost nova_compute[237281]: 2025-12-06 09:56:25.909 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec 6 04:56:25 localhost nova_compute[237281]: 2025-12-06 09:56:25.926 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:56:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.
Dec 6 04:56:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.
Dec 6 04:56:26 localhost podman[239348]: 2025-12-06 09:56:26.550314038 +0000 UTC m=+0.082027163 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes':
['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Dec 6 04:56:26 localhost nova_compute[237281]: 2025-12-06 09:56:26.576 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:26 localhost podman[239349]: 2025-12-06 09:56:26.617467621 +0000 UTC m=+0.145805723 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 
'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 04:56:26 localhost podman[239349]: 2025-12-06 09:56:26.62707238 +0000 UTC m=+0.155410682 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 
'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 04:56:26 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 04:56:26 localhost podman[239348]: 2025-12-06 09:56:26.681651264 +0000 UTC m=+0.213364389 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 6 04:56:26 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully.
Dec 6 04:56:26 localhost nova_compute[237281]: 2025-12-06 09:56:26.941 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:56:26 localhost nova_compute[237281]: 2025-12-06 09:56:26.941 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 6 04:56:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7453 DF PROTO=TCP SPT=56280 DPT=9102 SEQ=570017731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC5A1470000000001030307)
Dec 6 04:56:27 localhost nova_compute[237281]: 2025-12-06 09:56:27.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:56:27 localhost nova_compute[237281]: 2025-12-06 09:56:27.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:56:27 localhost nova_compute[237281]: 2025-12-06 09:56:27.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:56:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15122 DF PROTO=TCP SPT=35476 DPT=9102 SEQ=616039884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC5A5870000000001030307)
Dec 6 04:56:28 localhost nova_compute[237281]: 2025-12-06 09:56:28.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:56:28 localhost nova_compute[237281]: 2025-12-06 09:56:28.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 6 04:56:28 localhost nova_compute[237281]: 2025-12-06 09:56:28.888 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 6 04:56:29 localhost nova_compute[237281]: 2025-12-06 09:56:29.812 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 6 04:56:29 localhost nova_compute[237281]: 2025-12-06 09:56:29.813 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 6 04:56:29 localhost nova_compute[237281]: 2025-12-06 09:56:29.813 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 6 04:56:29 localhost nova_compute[237281]: 2025-12-06 09:56:29.814 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 6 04:56:30 localhost nova_compute[237281]: 2025-12-06 09:56:30.043 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:56:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7454 DF PROTO=TCP SPT=56280 DPT=9102 SEQ=570017731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC5B1070000000001030307)
Dec 6 04:56:31 localhost nova_compute[237281]: 2025-12-06 09:56:31.579 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 04:56:31 localhost nova_compute[237281]: 2025-12-06 09:56:31.987 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network":
{"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.010 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.011 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.011 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task 
ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.012 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.012 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.037 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.037 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.038 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.038 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.110 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.182 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.184 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.255 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.257 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.326 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.327 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.382 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.585 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.587 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12714MB free_disk=387.31049728393555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label":
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.587 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.588 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.711 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.712 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.712 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.793 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing inventories for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.859 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating ProviderTree inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 6 04:56:32 
localhost nova_compute[237281]: 2025-12-06 09:56:32.860 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.884 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing aggregate associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.914 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing trait associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, traits: 
COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_ABM,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.954 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.967 237285 DEBUG nova.scheduler.client.report [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.970 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:56:32 localhost nova_compute[237281]: 2025-12-06 09:56:32.971 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:56:33 localhost nova_compute[237281]: 2025-12-06 09:56:33.966 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:56:35 localhost nova_compute[237281]: 2025-12-06 09:56:35.073 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:56:35 localhost podman[239399]: 2025-12-06 09:56:35.549352077 +0000 UTC m=+0.077796605 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, config_id=edpm, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=minimal rhel9) Dec 6 04:56:35 localhost podman[239399]: 2025-12-06 09:56:35.586880247 +0000 UTC m=+0.115324815 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public) Dec 6 04:56:35 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:56:36 localhost nova_compute[237281]: 2025-12-06 09:56:36.582 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:56:37 localhost podman[239419]: 2025-12-06 09:56:37.546873239 +0000 UTC m=+0.079567637 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', 
'--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 04:56:37 localhost podman[239419]: 2025-12-06 09:56:37.552793397 +0000 UTC m=+0.085487755 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': 
{'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:56:37 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:56:39 localhost sshd[239442]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:56:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7455 DF PROTO=TCP SPT=56280 DPT=9102 SEQ=570017731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC5D1870000000001030307) Dec 6 04:56:40 localhost nova_compute[237281]: 2025-12-06 09:56:40.114 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:41 localhost nova_compute[237281]: 2025-12-06 09:56:41.586 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:45 localhost nova_compute[237281]: 2025-12-06 09:56:45.160 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:46 localhost openstack_network_exporter[199751]: ERROR 09:56:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:56:46 localhost openstack_network_exporter[199751]: ERROR 09:56:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket 
files found for ovn-northd Dec 6 04:56:46 localhost openstack_network_exporter[199751]: ERROR 09:56:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:56:46 localhost openstack_network_exporter[199751]: ERROR 09:56:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:56:46 localhost openstack_network_exporter[199751]: Dec 6 04:56:46 localhost openstack_network_exporter[199751]: ERROR 09:56:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:56:46 localhost openstack_network_exporter[199751]: Dec 6 04:56:46 localhost nova_compute[237281]: 2025-12-06 09:56:46.589 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:56:48 localhost podman[239444]: 2025-12-06 09:56:48.551543144 +0000 UTC m=+0.079806934 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 04:56:48 localhost podman[239444]: 2025-12-06 09:56:48.631566575 +0000 UTC m=+0.159830415 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 04:56:48 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:56:50 localhost nova_compute[237281]: 2025-12-06 09:56:50.199 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:56:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:56:51 localhost podman[239470]: 2025-12-06 09:56:51.553910736 +0000 UTC m=+0.084664781 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:56:51 localhost podman[239470]: 2025-12-06 09:56:51.565314869 +0000 UTC m=+0.096068994 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 
'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:56:51 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:56:51 localhost nova_compute[237281]: 2025-12-06 09:56:51.591 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:51 localhost podman[239471]: 2025-12-06 09:56:51.663636511 +0000 UTC m=+0.191566241 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125) Dec 6 04:56:51 localhost podman[239471]: 2025-12-06 09:56:51.678191259 +0000 UTC m=+0.206120959 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:56:51 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:56:53 localhost podman[197801]: time="2025-12-06T09:56:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:56:53 localhost podman[197801]: @ - - [06/Dec/2025:09:56:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:56:53 localhost podman[197801]: @ - - [06/Dec/2025:09:56:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15454 "" "Go-http-client/1.1" Dec 6 04:56:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=773 DF PROTO=TCP SPT=35080 DPT=9102 SEQ=309003586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC60A7A0000000001030307) Dec 6 04:56:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=774 DF PROTO=TCP SPT=35080 DPT=9102 
SEQ=309003586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC60E880000000001030307) Dec 6 04:56:55 localhost nova_compute[237281]: 2025-12-06 09:56:55.230 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7456 DF PROTO=TCP SPT=56280 DPT=9102 SEQ=570017731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC611870000000001030307) Dec 6 04:56:56 localhost nova_compute[237281]: 2025-12-06 09:56:56.594 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:56:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=775 DF PROTO=TCP SPT=35080 DPT=9102 SEQ=309003586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC616870000000001030307) Dec 6 04:56:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:56:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:56:57 localhost podman[239512]: 2025-12-06 09:56:57.546273504 +0000 UTC m=+0.080547347 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2) Dec 6 04:56:57 localhost podman[239513]: 2025-12-06 09:56:57.603400166 +0000 UTC m=+0.133328728 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm) Dec 6 04:56:57 localhost podman[239513]: 2025-12-06 09:56:57.619361716 +0000 UTC m=+0.149290318 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=edpm, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Dec 6 04:56:57 localhost podman[239512]: 2025-12-06 09:56:57.628034167 +0000 UTC m=+0.162308000 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 04:56:57 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:56:57 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:56:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23044 DF PROTO=TCP SPT=58112 DPT=9102 SEQ=3458578661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC619870000000001030307) Dec 6 04:57:00 localhost nova_compute[237281]: 2025-12-06 09:57:00.310 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=776 DF PROTO=TCP SPT=35080 DPT=9102 SEQ=309003586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC626470000000001030307) Dec 6 04:57:01 localhost nova_compute[237281]: 2025-12-06 09:57:01.597 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:05 localhost nova_compute[237281]: 
2025-12-06 09:57:05.354 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:57:06 localhost podman[239550]: 2025-12-06 09:57:06.54195391 +0000 UTC m=+0.077903567 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible) Dec 6 04:57:06 localhost podman[239550]: 2025-12-06 09:57:06.553294022 +0000 UTC m=+0.089243679 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.) Dec 6 04:57:06 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 04:57:06 localhost nova_compute[237281]: 2025-12-06 09:57:06.600 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:57:06.677 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:57:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:57:06.678 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:57:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:57:06.679 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:57:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:57:08 localhost systemd[1]: tmp-crun.D8JIL1.mount: Deactivated successfully. 
Dec 6 04:57:08 localhost podman[239570]: 2025-12-06 09:57:08.557504336 +0000 UTC m=+0.093875670 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:57:08 localhost podman[239570]: 2025-12-06 09:57:08.564810345 +0000 UTC m=+0.101181699 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 04:57:08 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:57:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=777 DF PROTO=TCP SPT=35080 DPT=9102 SEQ=309003586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC647890000000001030307) Dec 6 04:57:10 localhost nova_compute[237281]: 2025-12-06 09:57:10.357 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:11 localhost nova_compute[237281]: 2025-12-06 09:57:11.603 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:15 localhost nova_compute[237281]: 2025-12-06 09:57:15.359 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:16 localhost openstack_network_exporter[199751]: ERROR 09:57:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:57:16 localhost openstack_network_exporter[199751]: ERROR 09:57:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:57:16 localhost openstack_network_exporter[199751]: ERROR 09:57:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:57:16 localhost openstack_network_exporter[199751]: ERROR 09:57:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:57:16 localhost openstack_network_exporter[199751]: Dec 6 04:57:16 localhost openstack_network_exporter[199751]: ERROR 09:57:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:57:16 localhost openstack_network_exporter[199751]: Dec 6 04:57:16 localhost nova_compute[237281]: 2025-12-06 09:57:16.605 
237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:57:19 localhost podman[239594]: 2025-12-06 09:57:19.542286891 +0000 UTC m=+0.078832586 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible) Dec 6 04:57:19 localhost podman[239594]: 2025-12-06 09:57:19.583232865 +0000 UTC m=+0.119778560 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller) Dec 6 04:57:19 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:57:20 localhost nova_compute[237281]: 2025-12-06 09:57:20.361 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:21 localhost nova_compute[237281]: 2025-12-06 09:57:21.607 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:57:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:57:22 localhost podman[239620]: 2025-12-06 09:57:22.546108896 +0000 UTC m=+0.079368921 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 04:57:22 localhost podman[239621]: 2025-12-06 09:57:22.606458575 +0000 UTC m=+0.134568045 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:57:22 localhost podman[239621]: 2025-12-06 09:57:22.616140126 +0000 UTC 
m=+0.144249626 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 6 04:57:22 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:57:22 localhost podman[239620]: 2025-12-06 09:57:22.629403766 +0000 UTC m=+0.162663821 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:57:22 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 04:57:23 localhost podman[197801]: time="2025-12-06T09:57:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:57:23 localhost podman[197801]: @ - - [06/Dec/2025:09:57:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:57:23 localhost podman[197801]: @ - - [06/Dec/2025:09:57:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15454 "" "Go-http-client/1.1" Dec 6 04:57:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22727 DF PROTO=TCP SPT=58810 DPT=9102 SEQ=1960051304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC67FA90000000001030307) Dec 6 04:57:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22728 DF PROTO=TCP SPT=58810 DPT=9102 SEQ=1960051304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC683C70000000001030307) Dec 6 04:57:25 localhost nova_compute[237281]: 2025-12-06 09:57:25.394 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=778 DF PROTO=TCP SPT=35080 DPT=9102 SEQ=309003586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC687870000000001030307) Dec 6 04:57:26 localhost nova_compute[237281]: 2025-12-06 09:57:26.610 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 
04:57:26 localhost nova_compute[237281]: 2025-12-06 09:57:26.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:57:26 localhost nova_compute[237281]: 2025-12-06 09:57:26.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:57:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22729 DF PROTO=TCP SPT=58810 DPT=9102 SEQ=1960051304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC68BC70000000001030307) Dec 6 04:57:27 localhost nova_compute[237281]: 2025-12-06 09:57:27.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:57:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7457 DF PROTO=TCP SPT=56280 DPT=9102 SEQ=570017731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC68F880000000001030307) Dec 6 04:57:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:57:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:57:28 localhost podman[239661]: 2025-12-06 09:57:28.554505541 +0000 UTC m=+0.087430844 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Dec 6 04:57:28 localhost podman[239661]: 2025-12-06 09:57:28.588355041 +0000 UTC m=+0.121280394 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 04:57:28 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:57:28 localhost podman[239662]: 2025-12-06 09:57:28.604950521 +0000 UTC m=+0.133407590 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:57:28 localhost podman[239662]: 2025-12-06 09:57:28.619301213 +0000 UTC m=+0.147758372 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Dec 6 04:57:28 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:57:28 localhost nova_compute[237281]: 2025-12-06 09:57:28.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:57:28 localhost nova_compute[237281]: 2025-12-06 09:57:28.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:57:29 localhost nova_compute[237281]: 2025-12-06 09:57:29.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:57:30 localhost nova_compute[237281]: 2025-12-06 
09:57:30.397 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:30 localhost nova_compute[237281]: 2025-12-06 09:57:30.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:57:30 localhost nova_compute[237281]: 2025-12-06 09:57:30.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:57:30 localhost nova_compute[237281]: 2025-12-06 09:57:30.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:57:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22730 DF PROTO=TCP SPT=58810 DPT=9102 SEQ=1960051304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC69B880000000001030307) Dec 6 04:57:31 localhost nova_compute[237281]: 2025-12-06 09:57:31.520 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:57:31 localhost nova_compute[237281]: 2025-12-06 09:57:31.520 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:57:31 localhost nova_compute[237281]: 2025-12-06 09:57:31.520 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:57:31 localhost nova_compute[237281]: 2025-12-06 09:57:31.521 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:57:31 localhost nova_compute[237281]: 2025-12-06 09:57:31.612 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.570 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": 
true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.585 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.585 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.586 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.586 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.587 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task 
ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.606 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.607 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.607 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.608 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.671 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.724 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.726 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.801 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.803 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.862 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.864 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:57:33 localhost nova_compute[237281]: 2025-12-06 09:57:33.918 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:57:34 localhost nova_compute[237281]: 2025-12-06 09:57:34.101 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:57:34 localhost nova_compute[237281]: 2025-12-06 09:57:34.102 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12687MB free_disk=387.31053161621094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:57:34 localhost nova_compute[237281]: 2025-12-06 09:57:34.103 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:57:34 localhost nova_compute[237281]: 2025-12-06 09:57:34.103 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:57:34 localhost nova_compute[237281]: 2025-12-06 09:57:34.181 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:57:34 localhost nova_compute[237281]: 2025-12-06 09:57:34.182 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:57:34 localhost nova_compute[237281]: 2025-12-06 09:57:34.183 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:57:34 localhost nova_compute[237281]: 2025-12-06 09:57:34.249 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:57:34 localhost nova_compute[237281]: 2025-12-06 09:57:34.269 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:57:34 localhost 
nova_compute[237281]: 2025-12-06 09:57:34.271 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:57:34 localhost nova_compute[237281]: 2025-12-06 09:57:34.272 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:57:35 localhost nova_compute[237281]: 2025-12-06 09:57:35.267 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:57:35 localhost nova_compute[237281]: 2025-12-06 09:57:35.399 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:36 localhost nova_compute[237281]: 2025-12-06 09:57:36.615 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:57:37 localhost systemd[1]: tmp-crun.Fj8Kh5.mount: Deactivated successfully. 
Dec 6 04:57:37 localhost podman[239711]: 2025-12-06 09:57:37.548962142 +0000 UTC m=+0.079588959 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, config_id=edpm, name=ubi9-minimal, architecture=x86_64) Dec 6 04:57:37 localhost podman[239711]: 2025-12-06 09:57:37.589208724 +0000 UTC m=+0.119835551 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=edpm, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, vendor=Red Hat, Inc., config_data={'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 6 04:57:37 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 04:57:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22731 DF PROTO=TCP SPT=58810 DPT=9102 SEQ=1960051304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC6BB870000000001030307) Dec 6 04:57:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:57:39 localhost podman[239731]: 2025-12-06 09:57:39.544211465 +0000 UTC m=+0.080918608 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', 
'/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:57:39 localhost podman[239731]: 2025-12-06 09:57:39.550056921 +0000 UTC m=+0.086764094 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:57:39 localhost 
systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:57:40 localhost nova_compute[237281]: 2025-12-06 09:57:40.444 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:41 localhost nova_compute[237281]: 2025-12-06 09:57:41.617 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:45 localhost nova_compute[237281]: 2025-12-06 09:57:45.446 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:46 localhost openstack_network_exporter[199751]: ERROR 09:57:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:57:46 localhost openstack_network_exporter[199751]: ERROR 09:57:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:57:46 localhost openstack_network_exporter[199751]: ERROR 09:57:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:57:46 localhost openstack_network_exporter[199751]: ERROR 09:57:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:57:46 localhost openstack_network_exporter[199751]: Dec 6 04:57:46 localhost openstack_network_exporter[199751]: ERROR 09:57:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:57:46 localhost openstack_network_exporter[199751]: Dec 6 04:57:46 localhost nova_compute[237281]: 2025-12-06 09:57:46.620 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:50 localhost nova_compute[237281]: 2025-12-06 09:57:50.469 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:57:50 localhost podman[239754]: 2025-12-06 09:57:50.564531682 +0000 UTC m=+0.081208907 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Dec 
6 04:57:50 localhost podman[239754]: 2025-12-06 09:57:50.603499056 +0000 UTC m=+0.120176231 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 6 04:57:50 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:57:51 localhost nova_compute[237281]: 2025-12-06 09:57:51.623 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:53 localhost podman[197801]: time="2025-12-06T09:57:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:57:53 localhost podman[197801]: @ - - [06/Dec/2025:09:57:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:57:53 localhost podman[197801]: @ - - [06/Dec/2025:09:57:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15456 "" "Go-http-client/1.1" Dec 6 04:57:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:57:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:57:53 localhost podman[239780]: 2025-12-06 09:57:53.561665804 +0000 UTC m=+0.093794977 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 04:57:53 localhost podman[239780]: 2025-12-06 09:57:53.57115498 +0000 UTC m=+0.103284163 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:57:53 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:57:53 localhost podman[239781]: 2025-12-06 09:57:53.65081014 +0000 UTC m=+0.180345684 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 04:57:53 localhost podman[239781]: 2025-12-06 09:57:53.665203433 +0000 UTC m=+0.194738957 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Dec 6 04:57:53 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:57:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35574 DF PROTO=TCP SPT=44618 DPT=9102 SEQ=2967852962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC6F4DA0000000001030307) Dec 6 04:57:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35575 DF PROTO=TCP SPT=44618 DPT=9102 SEQ=2967852962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC6F8E50000000001030307) Dec 6 04:57:55 localhost nova_compute[237281]: 2025-12-06 09:57:55.471 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22732 DF PROTO=TCP SPT=58810 DPT=9102 SEQ=1960051304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC6FB870000000001030307) Dec 6 04:57:56 localhost nova_compute[237281]: 2025-12-06 09:57:56.625 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:57:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35576 DF PROTO=TCP SPT=44618 DPT=9102 SEQ=2967852962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC700C70000000001030307) Dec 6 04:57:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=779 DF PROTO=TCP SPT=35080 DPT=9102 SEQ=309003586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC705870000000001030307) Dec 6 04:57:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:57:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:57:59 localhost systemd[1]: tmp-crun.umbHrh.mount: Deactivated successfully. 
Dec 6 04:57:59 localhost podman[239821]: 2025-12-06 09:57:59.581357977 +0000 UTC m=+0.114472809 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator 
team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 04:57:59 localhost podman[239822]: 2025-12-06 09:57:59.616762943 +0000 UTC m=+0.146319709 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:57:59 localhost podman[239822]: 2025-12-06 09:57:59.631224638 +0000 UTC m=+0.160781444 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Dec 6 04:57:59 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:57:59 localhost podman[239821]: 2025-12-06 09:57:59.715449815 +0000 UTC m=+0.248564697 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent) Dec 6 04:57:59 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:58:00 localhost nova_compute[237281]: 2025-12-06 09:58:00.517 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35577 DF PROTO=TCP SPT=44618 DPT=9102 SEQ=2967852962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC710870000000001030307) Dec 6 04:58:01 localhost nova_compute[237281]: 2025-12-06 09:58:01.627 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:05 localhost nova_compute[237281]: 2025-12-06 09:58:05.521 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:06 localhost nova_compute[237281]: 2025-12-06 09:58:06.632 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:06 localhost 
ovn_metadata_agent[137254]: 2025-12-06 09:58:06.678 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:58:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:58:06.679 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:58:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:58:06.680 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:58:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:58:08 localhost podman[239858]: 2025-12-06 09:58:08.552706453 +0000 UTC m=+0.085036282 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 6 04:58:08 localhost podman[239858]: 2025-12-06 09:58:08.568232392 +0000 UTC m=+0.100562201 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 6 04:58:08 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:58:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35578 DF PROTO=TCP SPT=44618 DPT=9102 SEQ=2967852962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC731870000000001030307) Dec 6 04:58:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:58:10 localhost nova_compute[237281]: 2025-12-06 09:58:10.557 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:10 localhost podman[239879]: 2025-12-06 09:58:10.570766423 +0000 UTC m=+0.103715195 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 04:58:10 localhost 
podman[239879]: 2025-12-06 09:58:10.576028792 +0000 UTC m=+0.108977604 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 04:58:10 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:58:11 localhost nova_compute[237281]: 2025-12-06 09:58:11.650 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:15 localhost nova_compute[237281]: 2025-12-06 09:58:15.559 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:16 localhost openstack_network_exporter[199751]: ERROR 09:58:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:58:16 localhost openstack_network_exporter[199751]: ERROR 09:58:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:58:16 localhost openstack_network_exporter[199751]: ERROR 09:58:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:58:16 localhost openstack_network_exporter[199751]: ERROR 09:58:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:58:16 localhost openstack_network_exporter[199751]: Dec 6 04:58:16 localhost openstack_network_exporter[199751]: ERROR 09:58:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:58:16 localhost openstack_network_exporter[199751]: Dec 6 04:58:16 localhost nova_compute[237281]: 2025-12-06 09:58:16.653 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:20 localhost nova_compute[237281]: 2025-12-06 09:58:20.592 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 04:58:21 localhost podman[239903]: 2025-12-06 09:58:21.543694531 +0000 UTC m=+0.079416653 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Dec 6 04:58:21 localhost podman[239903]: 2025-12-06 09:58:21.611471844 +0000 UTC m=+0.147194006 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3) Dec 6 04:58:21 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:58:21 localhost nova_compute[237281]: 2025-12-06 09:58:21.677 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.986 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.989 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '70b73e65-11ac-4466-a6a9-a46d7838d85d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:58:22.987360', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '19dfbe94-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.19744486, 'message_signature': '4c596a2c807f949b8ef14c047692abcfffb60919faa63dd888ed1af933a7db79'}]}, 'timestamp': '2025-12-06 09:58:22.990092', '_unique_id': '8a136cac6da64121b4b333c95bdee904'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:22 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:22.991 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:22 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:22.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.023 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.023 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '188a3033-38c4-4ced-8b49-25a11112d1b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:58:22.992197', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19e4d7bc-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.202276485, 'message_signature': 'f9d75fa5375c04bae7f7f3916adc0ed21d119e20d58392a71fecadc082544687'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:58:22.992197', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19e4dfdc-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.202276485, 'message_signature': '6e713ccfe8b2d4cb37ba61cad6fe87223270b1ceb6895c04559cae0236eebe77'}]}, 'timestamp': '2025-12-06 09:58:23.023605', '_unique_id': '59d4677522da4dda98f1fbf6f49612da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR 
oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.024 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '98aaf51a-cd2b-4f4e-afab-309c60d2ae4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:58:23.024720', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19e51330-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.202276485, 'message_signature': '6e6bd115aed4dc36871a0c52e55839af171e65d5f913254fc7f0a8fa98ce4f49'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:58:23.024720', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19e51b46-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.202276485, 'message_signature': '77bdc5d0cd766dee02e9f092f232f854d4d05832191a5235b33a7863b46cd8d8'}]}, 'timestamp': '2025-12-06 09:58:23.025123', '_unique_id': '44a0c17f16a34620a48dd8a99d6fb880'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.025 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.026 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.043 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '376ddca8-f93d-40ec-a4a8-267b00f0b53a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:58:23.026154', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '19e802d4-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.253965013, 'message_signature': '5d6b6d1af5ef08368cda7a11c64bbb3ec0d2a09685f97371d6b52fdebbea670a'}]}, 'timestamp': '2025-12-06 09:58:23.044169', '_unique_id': 'aab541659b0f47268b7ed0859be068af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors 
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging 
conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.044 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:58:23.045 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08a7c935-ffe8-4c57-9b8a-cc71e754b097', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:58:23.045139', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 
'vda'}, 'message_id': '19e830b0-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.202276485, 'message_signature': 'f7066750dd432b21bcfacd2fadb6a8e338d70cc9dc1d32761b2caf8e3b193b6d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:58:23.045139', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19e837b8-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.202276485, 'message_signature': 'be5a1dfaeb6eda9879cb20496e2d33e9695e7a606254cb2eb2fb6d4f7798aeb6'}]}, 'timestamp': '2025-12-06 09:58:23.045510', '_unique_id': '4bb33410f8234355a75c77814f172d3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
09:58:23.046 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e285d416-b1df-4f3e-9181-d5bd0d9f05a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:58:23.046456', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '19e86436-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.19744486, 'message_signature': 
'ba7d55d3a58b06a7be2e699ce603a28b2c15a4eaa2186457bda34a372faad740'}]}, 'timestamp': '2025-12-06 09:58:23.046662', '_unique_id': '29475418d64241748da9bce90277628d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR 
oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.047 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66b3e93f-62a9-457e-b335-e1fbc1616bc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:58:23.047945', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '19e89fdc-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.19744486, 'message_signature': '481699394e873d279f8ff773d2ff7b68bc8914efdfebc95f7b0c7c912e6e0dcb'}]}, 'timestamp': '2025-12-06 09:58:23.048233', '_unique_id': '92d7b598f8f444048c660f5eefe71f83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.048 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.049 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.049 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.062 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a6cf650-5ecf-492b-8c3c-cfece674d447', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:58:23.049715', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19eae396-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.259799748, 'message_signature': 'ae1810258c35eb087bf1d80f9f11704a2221c612ba72a599c6ef4dc8a0e8b83b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:58:23.049715', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19eaef94-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.259799748, 'message_signature': 'defd41a9c56ddb6c6bff844b4badfd179920809d4ac14507d1f9c1a6da614b37'}]}, 'timestamp': '2025-12-06 09:58:23.063363', '_unique_id': '659131cf1b02401aab425d748cdd9b32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.063 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.064 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.064 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98c69199-d8e9-4f1e-b59c-0e03f4a960a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:58:23.064718', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '19eb312a-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.19744486, 'message_signature': '1bb0a32ad2d1897d446c7a8db77ed49d6ae6bfb6aeb803882088e6227fbde58d'}]}, 'timestamp': '2025-12-06 09:58:23.065059', '_unique_id': 'bd1c8b21651f4ac2be9b0fc7d328ea46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.065 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:23.066 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.066 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.066 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '261d11f1-296b-458c-ba29-4f7d0e69db90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:58:23.066367', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19eb6f78-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.202276485, 'message_signature': 'da480e2a4969740f0f913102767741a4f8c9159b50feb91c8d3596d4e8d17df9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:58:23.066367', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19eb7996-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.202276485, 'message_signature': 'ead410f6b19518f1511a4fadd4d46e44b9666203d04fca9038f73265adc1926f'}]}, 'timestamp': '2025-12-06 09:58:23.066912', '_unique_id': '19fc30fbe3d04f7f87ac6c191e4a462f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.067 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:23.068 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.068 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 11770000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e964b3c-f951-44c9-b958-f5d565ffdedb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11770000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T09:58:23.068264', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '19ebb9a6-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.253965013, 'message_signature': 
'6dfe22b81c6fe30ed0228a034e4fb79f0e28da4b6e6d230338dae505cb22f894'}]}, 'timestamp': '2025-12-06 09:58:23.068542', '_unique_id': '385f7e2fb46748c5bb88355ac80cd62f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR 
oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc))
from exc
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.069 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '956bd02d-d916-43f4-81e6-3cf572459912', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:58:23.069916', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status':
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '19ebfa74-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.19744486, 'message_signature': '47e531e8919ef14b98ae39f1bfcedc46c12e6e0960d790ecc298eb9b885175da'}]}, 'timestamp': '2025-12-06 09:58:23.070217', '_unique_id': '506ac4bfac4b455a9c537d72c10cd486'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging return
fun(*args, **kwargs)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.070 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.071 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.071 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.071 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'd9fa2940-189e-44ec-9fda-b2619f810d11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:58:23.071608', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19ec3c50-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.259799748, 'message_signature': 'eabf9b9c8485492eefc240f44165f119d8e7270200435d04e0e390f16cbd82b7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:58:23.071608', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19ec4786-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.259799748, 'message_signature': 'f49643da5dbf1a61edb3f5807041a1e56a629013ff848d5d374785c7964238fc'}]}, 'timestamp': '2025-12-06 09:58:23.072165', '_unique_id': '96abdb89c2f941cc8c78dffb24b7c581'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging
return retry_over_time(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.072 12 ERROR oslo_messaging.notify.messaging
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.073 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.073 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'a944a202-c124-42d1-a34e-ece63e549720', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:58:23.073511', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '19ec871e-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.19744486, 'message_signature': 'aa1526004595e347e2fdd89f0375be7b6d4bf0ce9fdf19acd89216fbfbf567ff'}]}, 'timestamp': '2025-12-06 09:58:23.073811', '_unique_id': 'afeda8b14ac94001aae66c4db9b4ad47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 04:58:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.074 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:23.075 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.075 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2616841-48f1-4f41-95f8-5a36052568aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:58:23.075169', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': 
None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '19ecc76a-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.19744486, 'message_signature': 'afef6c98280faa2c38bb0f4cc90fd4ea0c669daeb7f77cf43183d255e05ce93f'}]}, 'timestamp': '2025-12-06 09:58:23.075457', '_unique_id': 'fb8ab2a23b6a4cb28c4100f50f11bf06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.076 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b995a0d7-038e-4de7-878e-6e5b77d64c8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:58:23.076786', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '19ed07ac-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.19744486, 'message_signature': '67d1038ea10ab9a6878f9e456fecf1e453deacbeca5a5144d56c743b77ff2fa3'}]}, 'timestamp': '2025-12-06 09:58:23.077108', '_unique_id': '848e3a3f2ee547ed8008a4894df811ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.077 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:23.078 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.078 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97f91728-1614-44fd-87fe-d2803090bf36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:58:23.078481', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': 
None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '19ed4a3c-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.19744486, 'message_signature': 'c444543c4666c186b62f52aa2af560f8566ed5152bc90c5926b5f593d568d78a'}]}, 'timestamp': '2025-12-06 09:58:23.078895', '_unique_id': '8200397f8f544045a63607781d8b7963'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.079 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.080 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.080 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.080 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bb16bde3-0ad1-4c4f-b3c4-bea34492a8c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:58:23.080343', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19ed9140-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.202276485, 'message_signature': 'fdd74b086768cf0e5d7a4c9844867d1c5d69be5f629561d8372f855f212ec5e3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:58:23.080343', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19ed9c3a-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.202276485, 'message_signature': '40e924644950239e5c03a18e8d43192f5a83e7f66a65f8d0cd0fda7dc7b30ef2'}]}, 'timestamp': '2025-12-06 09:58:23.080924', '_unique_id': '64e77701a4f94f4c93241f2d7df30b34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.082 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.082 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.082 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
notification to notifications. Payload={'message_id': '10c43dcd-f8f3-4306-b801-a78d16df8dbe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:58:23.082422', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19ede29e-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.259799748, 'message_signature': '6cf197e4f8909758906bfb96f9271adef6fe126266fa86b754e0be02da51b383'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:58:23.082422', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19edf19e-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.259799748, 'message_signature': '6b7664d296d6b5b427079e41115335e60c09462d7c1d070177539ba9bca90b45'}]}, 'timestamp': '2025-12-06 09:58:23.083099', '_unique_id': 'e8525eb8a1c7418fa73708aefcf5f041'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.084 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0b76f2f0-e6a4-48b9-9fcf-ed74319dcd7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T09:58:23.084442', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '19ee31ae-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.19744486, 'message_signature': 'ff3b3710062faba52696cffec4db7659edf750e98c12ed4272e121ef73b6af0f'}]}, 'timestamp': '2025-12-06 09:58:23.084790', '_unique_id': 'd913b708506e4428b00faf59eabec1d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.085 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:23.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.086 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.086 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.086 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '51dcc9b8-8c38-404a-9529-b086e0bf2700', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T09:58:23.086360', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19ee7d3a-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.202276485, 'message_signature': '6613c788bc8a3c2fa841babf246fe7a428fdfffb354e9a18ce2c06647e7224ca'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T09:58:23.086360', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19ee8708-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11490.202276485, 'message_signature': '48f83622c38449b6eadd30334604897f957dbe80138eef991d5eef772f253369'}]}, 'timestamp': '2025-12-06 09:58:23.086918', '_unique_id': '98bd840cd66c412bbcfed12c8fe2097f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging yield Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 04:58:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 04:58:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 09:58:23.087 12 ERROR oslo_messaging.notify.messaging Dec 6 04:58:23 localhost podman[197801]: time="2025-12-06T09:58:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:58:23 localhost podman[197801]: @ - - [06/Dec/2025:09:58:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:58:23 localhost podman[197801]: @ - - [06/Dec/2025:09:58:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15457 "" "Go-http-client/1.1" Dec 6 04:58:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17348 DF PROTO=TCP SPT=50868 DPT=9102 SEQ=1246624732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC76A090000000001030307) Dec 6 04:58:24 
localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:58:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:58:24 localhost podman[239929]: 2025-12-06 09:58:24.551331571 +0000 UTC m=+0.084545827 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 04:58:24 localhost podman[239929]: 2025-12-06 09:58:24.562272831 +0000 UTC m=+0.095487097 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 04:58:24 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 04:58:24 localhost podman[239930]: 2025-12-06 09:58:24.616394141 +0000 UTC m=+0.142624407 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Dec 6 04:58:24 localhost podman[239930]: 2025-12-06 09:58:24.631515007 +0000 UTC m=+0.157745243 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125) Dec 6 04:58:24 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 04:58:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17349 DF PROTO=TCP SPT=50868 DPT=9102 SEQ=1246624732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC76E080000000001030307) Dec 6 04:58:25 localhost nova_compute[237281]: 2025-12-06 09:58:25.594 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35579 DF PROTO=TCP SPT=44618 DPT=9102 SEQ=2967852962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC771880000000001030307) Dec 6 04:58:26 localhost nova_compute[237281]: 2025-12-06 09:58:26.680 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17350 DF PROTO=TCP SPT=50868 DPT=9102 SEQ=1246624732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC776070000000001030307) Dec 6 04:58:27 localhost nova_compute[237281]: 2025-12-06 09:58:27.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:58:27 localhost nova_compute[237281]: 2025-12-06 09:58:27.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:58:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22733 DF PROTO=TCP SPT=58810 DPT=9102 SEQ=1960051304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC779870000000001030307) Dec 6 04:58:29 localhost nova_compute[237281]: 2025-12-06 09:58:29.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:58:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:58:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:58:30 localhost podman[239971]: 2025-12-06 09:58:30.555938871 +0000 UTC m=+0.087491677 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible) Dec 6 04:58:30 localhost podman[239971]: 2025-12-06 09:58:30.565744977 +0000 UTC m=+0.097297793 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 6 04:58:30 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:58:30 localhost nova_compute[237281]: 2025-12-06 09:58:30.638 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:30 localhost podman[239972]: 2025-12-06 09:58:30.67243647 +0000 UTC m=+0.200413708 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:58:30 localhost podman[239972]: 2025-12-06 09:58:30.708458236 +0000 UTC m=+0.236435514 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm) Dec 6 04:58:30 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 04:58:30 localhost nova_compute[237281]: 2025-12-06 09:58:30.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:58:30 localhost nova_compute[237281]: 2025-12-06 09:58:30.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:58:30 localhost nova_compute[237281]: 2025-12-06 09:58:30.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:58:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17351 DF PROTO=TCP SPT=50868 DPT=9102 SEQ=1246624732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC785C70000000001030307) Dec 6 04:58:31 localhost nova_compute[237281]: 2025-12-06 09:58:31.575 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:58:31 localhost nova_compute[237281]: 2025-12-06 09:58:31.576 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:58:31 localhost nova_compute[237281]: 2025-12-06 09:58:31.576 237285 DEBUG 
nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:58:31 localhost nova_compute[237281]: 2025-12-06 09:58:31.577 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:58:31 localhost nova_compute[237281]: 2025-12-06 09:58:31.714 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:33 localhost nova_compute[237281]: 2025-12-06 09:58:33.090 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": 
"227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:58:33 localhost nova_compute[237281]: 2025-12-06 09:58:33.106 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:58:33 localhost nova_compute[237281]: 2025-12-06 09:58:33.107 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:58:33 localhost nova_compute[237281]: 2025-12-06 09:58:33.108 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:58:33 localhost nova_compute[237281]: 2025-12-06 09:58:33.108 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:58:33 localhost nova_compute[237281]: 2025-12-06 09:58:33.109 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:58:33 localhost 
nova_compute[237281]: 2025-12-06 09:58:33.110 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:58:34 localhost nova_compute[237281]: 2025-12-06 09:58:34.104 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:58:34 localhost nova_compute[237281]: 2025-12-06 09:58:34.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:58:34 localhost nova_compute[237281]: 2025-12-06 09:58:34.906 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:58:34 localhost nova_compute[237281]: 2025-12-06 09:58:34.906 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:58:34 localhost nova_compute[237281]: 2025-12-06 09:58:34.907 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:58:34 localhost nova_compute[237281]: 2025-12-06 09:58:34.907 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.029 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.106 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.107 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.182 237285 
DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.183 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.237 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.239 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.279 237285 DEBUG 
oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.473 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.475 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12674MB free_disk=387.31069564819336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.475 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.475 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.545 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 
actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.546 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.546 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.594 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.609 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} 
set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.611 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.611 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:58:35 localhost nova_compute[237281]: 2025-12-06 09:58:35.641 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:36 localhost nova_compute[237281]: 2025-12-06 09:58:36.717 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17352 DF PROTO=TCP SPT=50868 DPT=9102 SEQ=1246624732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC7A5870000000001030307) Dec 6 04:58:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:58:39 localhost podman[240019]: 2025-12-06 09:58:39.542427333 +0000 UTC m=+0.079343071 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 6 04:58:39 localhost podman[240019]: 2025-12-06 09:58:39.554558269 +0000 UTC m=+0.091473957 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': 
{'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 04:58:39 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 04:58:40 localhost nova_compute[237281]: 2025-12-06 09:58:40.643 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:58:41 localhost podman[240042]: 2025-12-06 09:58:41.565166885 +0000 UTC m=+0.092469957 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:58:41 localhost podman[240042]: 2025-12-06 09:58:41.5762907 +0000 UTC m=+0.103593752 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 04:58:41 localhost systemd[1]: 
979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 04:58:41 localhost nova_compute[237281]: 2025-12-06 09:58:41.720 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:45 localhost nova_compute[237281]: 2025-12-06 09:58:45.678 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:46 localhost sshd[240065]: main: sshd: ssh-rsa algorithm is disabled Dec 6 04:58:46 localhost openstack_network_exporter[199751]: ERROR 09:58:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:58:46 localhost openstack_network_exporter[199751]: ERROR 09:58:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:58:46 localhost openstack_network_exporter[199751]: ERROR 09:58:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:58:46 localhost openstack_network_exporter[199751]: ERROR 09:58:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:58:46 localhost openstack_network_exporter[199751]: Dec 6 04:58:46 localhost openstack_network_exporter[199751]: ERROR 09:58:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:58:46 localhost openstack_network_exporter[199751]: Dec 6 04:58:46 localhost nova_compute[237281]: 2025-12-06 09:58:46.759 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:50 localhost nova_compute[237281]: 2025-12-06 09:58:50.680 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:51 
localhost nova_compute[237281]: 2025-12-06 09:58:51.763 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:58:52 localhost podman[240067]: 2025-12-06 09:58:52.574385747 +0000 UTC m=+0.109890371 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible) Dec 6 04:58:52 localhost podman[240067]: 2025-12-06 09:58:52.615285789 +0000 UTC m=+0.150790423 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 6 04:58:52 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 04:58:53 localhost podman[197801]: time="2025-12-06T09:58:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:58:53 localhost podman[197801]: @ - - [06/Dec/2025:09:58:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:58:53 localhost podman[197801]: @ - - [06/Dec/2025:09:58:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15455 "" "Go-http-client/1.1" Dec 6 04:58:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57567 DF PROTO=TCP SPT=35776 DPT=9102 SEQ=3029299357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC7DF390000000001030307) Dec 6 04:58:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57568 DF PROTO=TCP SPT=35776 DPT=9102 SEQ=3029299357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC7E3480000000001030307) Dec 6 04:58:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:58:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:58:55 localhost podman[240092]: 2025-12-06 09:58:55.55793312 +0000 UTC m=+0.086666012 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 04:58:55 localhost podman[240092]: 2025-12-06 09:58:55.564926631 +0000 UTC m=+0.093659523 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 04:58:55 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:58:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17353 DF PROTO=TCP SPT=50868 DPT=9102 SEQ=1246624732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC7E5870000000001030307) Dec 6 04:58:55 localhost podman[240093]: 2025-12-06 09:58:55.61704093 +0000 UTC m=+0.142745261 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd) Dec 6 04:58:55 localhost podman[240093]: 2025-12-06 09:58:55.625774283 +0000 UTC m=+0.151478644 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.license=GPLv2) Dec 6 04:58:55 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:58:55 localhost nova_compute[237281]: 2025-12-06 09:58:55.707 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:56 localhost nova_compute[237281]: 2025-12-06 09:58:56.791 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:58:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57569 DF PROTO=TCP SPT=35776 DPT=9102 SEQ=3029299357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC7EB480000000001030307) Dec 6 04:58:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35580 DF PROTO=TCP SPT=44618 DPT=9102 SEQ=2967852962 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC7EF880000000001030307) Dec 6 04:59:00 localhost nova_compute[237281]: 2025-12-06 09:59:00.709 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57570 DF PROTO=TCP SPT=35776 DPT=9102 SEQ=3029299357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC7FB070000000001030307) Dec 6 04:59:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:59:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 04:59:01 localhost podman[240137]: 2025-12-06 09:59:01.551586117 +0000 UTC m=+0.080896538 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute) Dec 6 04:59:01 localhost podman[240136]: 2025-12-06 09:59:01.597884931 +0000 UTC m=+0.130051108 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 04:59:01 localhost podman[240137]: 2025-12-06 09:59:01.615986027 +0000 UTC m=+0.145296498 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 04:59:01 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 04:59:01 localhost podman[240136]: 2025-12-06 09:59:01.630917727 +0000 UTC m=+0.163083944 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 04:59:01 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:59:01 localhost nova_compute[237281]: 2025-12-06 09:59:01.794 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:05 localhost nova_compute[237281]: 2025-12-06 09:59:05.755 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:59:06.679 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:59:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:59:06.679 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:59:06 localhost ovn_metadata_agent[137254]: 2025-12-06 09:59:06.680 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:59:06 localhost nova_compute[237281]: 2025-12-06 09:59:06.824 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:09 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57571 DF PROTO=TCP SPT=35776 DPT=9102 SEQ=3029299357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC81B870000000001030307) Dec 6 04:59:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 04:59:10 localhost systemd[1]: tmp-crun.glibmJ.mount: Deactivated successfully. Dec 6 04:59:10 localhost podman[240170]: 2025-12-06 09:59:10.563333611 +0000 UTC m=+0.097523749 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm) Dec 6 04:59:10 localhost podman[240170]: 2025-12-06 09:59:10.576501607 +0000 UTC m=+0.110691745 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, architecture=x86_64, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Dec 6 04:59:10 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 04:59:10 localhost nova_compute[237281]: 2025-12-06 09:59:10.758 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:11 localhost nova_compute[237281]: 2025-12-06 09:59:11.826 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 04:59:12 localhost podman[240190]: 2025-12-06 09:59:12.55448069 +0000 UTC m=+0.085485856 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:59:12 localhost podman[240190]: 2025-12-06 09:59:12.567323687 +0000 UTC m=+0.098328893 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 04:59:12 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 04:59:15 localhost nova_compute[237281]: 2025-12-06 09:59:15.800 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:16 localhost openstack_network_exporter[199751]: ERROR 09:59:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:59:16 localhost openstack_network_exporter[199751]: ERROR 09:59:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:59:16 localhost openstack_network_exporter[199751]: ERROR 09:59:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:59:16 localhost openstack_network_exporter[199751]: ERROR 09:59:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:59:16 localhost openstack_network_exporter[199751]: Dec 6 04:59:16 localhost openstack_network_exporter[199751]: ERROR 09:59:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:59:16 localhost openstack_network_exporter[199751]: Dec 6 04:59:16 localhost nova_compute[237281]: 2025-12-06 09:59:16.829 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:20 localhost nova_compute[237281]: 2025-12-06 09:59:20.802 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:21 localhost nova_compute[237281]: 2025-12-06 09:59:21.832 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:23 localhost podman[197801]: time="2025-12-06T09:59:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:59:23 localhost podman[197801]: @ - 
- [06/Dec/2025:09:59:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:59:23 localhost podman[197801]: @ - - [06/Dec/2025:09:59:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15461 "" "Go-http-client/1.1" Dec 6 04:59:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:59:23 localhost podman[240214]: 2025-12-06 09:59:23.548756332 +0000 UTC m=+0.081694952 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, 
maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible) Dec 6 04:59:23 localhost podman[240214]: 2025-12-06 09:59:23.592408877 +0000 UTC m=+0.125347477 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 6 04:59:23 localhost systemd[1]: 
da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:59:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61895 DF PROTO=TCP SPT=59774 DPT=9102 SEQ=3130240063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC8546A0000000001030307) Dec 6 04:59:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61896 DF PROTO=TCP SPT=59774 DPT=9102 SEQ=3130240063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC858870000000001030307) Dec 6 04:59:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57572 DF PROTO=TCP SPT=35776 DPT=9102 SEQ=3029299357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC85B880000000001030307) Dec 6 04:59:25 localhost nova_compute[237281]: 2025-12-06 09:59:25.836 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:59:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 04:59:26 localhost podman[240241]: 2025-12-06 09:59:26.553881306 +0000 UTC m=+0.089117136 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 04:59:26 localhost podman[240241]: 2025-12-06 09:59:26.563189656 +0000 UTC m=+0.098425456 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 
'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:59:26 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:59:26 localhost podman[240242]: 2025-12-06 09:59:26.60550326 +0000 UTC m=+0.136301357 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 04:59:26 localhost podman[240242]: 2025-12-06 09:59:26.621267826 +0000 UTC m=+0.152066003 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0) Dec 6 04:59:26 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 04:59:26 localhost nova_compute[237281]: 2025-12-06 09:59:26.834 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61897 DF PROTO=TCP SPT=59774 DPT=9102 SEQ=3130240063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC860870000000001030307) Dec 6 04:59:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17354 DF PROTO=TCP SPT=50868 DPT=9102 SEQ=1246624732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC863880000000001030307) Dec 6 04:59:28 localhost nova_compute[237281]: 2025-12-06 09:59:28.611 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:59:28 localhost nova_compute[237281]: 2025-12-06 09:59:28.612 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - 
- - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 04:59:28 localhost nova_compute[237281]: 2025-12-06 09:59:28.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:59:29 localhost nova_compute[237281]: 2025-12-06 09:59:29.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:59:30 localhost nova_compute[237281]: 2025-12-06 09:59:30.840 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:30 localhost nova_compute[237281]: 2025-12-06 09:59:30.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:59:30 localhost nova_compute[237281]: 2025-12-06 09:59:30.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 04:59:30 localhost nova_compute[237281]: 2025-12-06 09:59:30.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 04:59:31 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61898 DF PROTO=TCP SPT=59774 DPT=9102 SEQ=3130240063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC870480000000001030307) Dec 6 04:59:31 localhost nova_compute[237281]: 2025-12-06 09:59:31.526 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 04:59:31 localhost nova_compute[237281]: 2025-12-06 09:59:31.527 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 04:59:31 localhost nova_compute[237281]: 2025-12-06 09:59:31.527 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 04:59:31 localhost nova_compute[237281]: 2025-12-06 09:59:31.528 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 04:59:31 localhost nova_compute[237281]: 2025-12-06 09:59:31.836 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:33 localhost nova_compute[237281]: 2025-12-06 09:59:33.233 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: 
a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 04:59:33 localhost nova_compute[237281]: 2025-12-06 09:59:33.250 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 04:59:33 localhost nova_compute[237281]: 2025-12-06 09:59:33.250 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 04:59:33 localhost 
nova_compute[237281]: 2025-12-06 09:59:33.251 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:59:33 localhost nova_compute[237281]: 2025-12-06 09:59:33.251 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:59:33 localhost nova_compute[237281]: 2025-12-06 09:59:33.252 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:59:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 04:59:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 04:59:33 localhost podman[240282]: 2025-12-06 09:59:33.695872024 +0000 UTC m=+0.073795414 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125) Dec 6 04:59:33 localhost podman[240283]: 2025-12-06 09:59:33.766978636 +0000 UTC m=+0.140419891 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 04:59:33 localhost podman[240283]: 2025-12-06 09:59:33.778155923 +0000 UTC m=+0.151597108 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 04:59:33 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 04:59:33 localhost podman[240282]: 2025-12-06 09:59:33.789561057 +0000 UTC m=+0.167484527 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 04:59:33 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 04:59:34 localhost nova_compute[237281]: 2025-12-06 09:59:34.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:59:34 localhost nova_compute[237281]: 2025-12-06 09:59:34.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:59:35 localhost nova_compute[237281]: 2025-12-06 09:59:35.880 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:36 localhost nova_compute[237281]: 2025-12-06 09:59:36.839 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:36 localhost nova_compute[237281]: 2025-12-06 09:59:36.885 237285 DEBUG 
oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 04:59:36 localhost nova_compute[237281]: 2025-12-06 09:59:36.912 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:59:36 localhost nova_compute[237281]: 2025-12-06 09:59:36.913 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:59:36 localhost nova_compute[237281]: 2025-12-06 09:59:36.913 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:59:36 localhost nova_compute[237281]: 2025-12-06 09:59:36.914 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 04:59:36 localhost nova_compute[237281]: 2025-12-06 09:59:36.975 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.049 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.051 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.125 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.126 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.200 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.202 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.274 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.517 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.519 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12685MB free_disk=387.31069564819336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.519 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.520 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.604 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.604 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.605 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.643 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.660 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 04:59:37 localhost 
nova_compute[237281]: 2025-12-06 09:59:37.662 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 04:59:37 localhost nova_compute[237281]: 2025-12-06 09:59:37.663 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 04:59:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61899 DF PROTO=TCP SPT=59774 DPT=9102 SEQ=3130240063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC891870000000001030307) Dec 6 04:59:40 localhost nova_compute[237281]: 2025-12-06 09:59:40.884 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 04:59:41 localhost podman[240333]: 2025-12-06 09:59:41.556053238 +0000 UTC m=+0.084693372 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is 
a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Dec 6 04:59:41 localhost podman[240333]: 2025-12-06 09:59:41.571743161 +0000 UTC m=+0.100383305 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, maintainer=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 6 04:59:41 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 04:59:41 localhost nova_compute[237281]: 2025-12-06 09:59:41.876 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 04:59:43 localhost podman[240354]: 2025-12-06 09:59:43.554738494 +0000 UTC m=+0.085638550 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', 
'/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:59:43 localhost podman[240354]: 2025-12-06 09:59:43.561973663 +0000 UTC m=+0.092873709 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 04:59:43 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: 
Deactivated successfully. Dec 6 04:59:45 localhost nova_compute[237281]: 2025-12-06 09:59:45.886 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:46 localhost openstack_network_exporter[199751]: ERROR 09:59:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 04:59:46 localhost openstack_network_exporter[199751]: ERROR 09:59:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:59:46 localhost openstack_network_exporter[199751]: ERROR 09:59:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 04:59:46 localhost openstack_network_exporter[199751]: ERROR 09:59:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 04:59:46 localhost openstack_network_exporter[199751]: Dec 6 04:59:46 localhost openstack_network_exporter[199751]: ERROR 09:59:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 04:59:46 localhost openstack_network_exporter[199751]: Dec 6 04:59:46 localhost nova_compute[237281]: 2025-12-06 09:59:46.917 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:50 localhost nova_compute[237281]: 2025-12-06 09:59:50.888 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:51 localhost nova_compute[237281]: 2025-12-06 09:59:51.948 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:53 localhost podman[197801]: time="2025-12-06T09:59:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 04:59:53 
localhost podman[197801]: @ - - [06/Dec/2025:09:59:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 04:59:53 localhost podman[197801]: @ - - [06/Dec/2025:09:59:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15458 "" "Go-http-client/1.1" Dec 6 04:59:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46939 DF PROTO=TCP SPT=52846 DPT=9102 SEQ=677795422 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC8C9990000000001030307) Dec 6 04:59:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 04:59:54 localhost podman[240377]: 2025-12-06 09:59:54.553327546 +0000 UTC m=+0.084364783 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Dec 6 04:59:54 localhost podman[240377]: 2025-12-06 09:59:54.591418383 +0000 UTC m=+0.122455630 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:59:54 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 04:59:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46940 DF PROTO=TCP SPT=52846 DPT=9102 SEQ=677795422 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC8CD870000000001030307) Dec 6 04:59:55 localhost nova_compute[237281]: 2025-12-06 09:59:55.890 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61900 DF PROTO=TCP SPT=59774 DPT=9102 SEQ=3130240063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC8D1870000000001030307) Dec 6 04:59:56 localhost nova_compute[237281]: 2025-12-06 09:59:56.978 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 04:59:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46941 DF PROTO=TCP SPT=52846 DPT=9102 SEQ=677795422 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC8D5870000000001030307) Dec 6 04:59:57 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 04:59:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 04:59:57 localhost podman[240403]: 2025-12-06 09:59:57.553230172 +0000 UTC m=+0.081984670 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 04:59:57 localhost podman[240403]: 2025-12-06 09:59:57.594393843 +0000 UTC m=+0.123148381 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 04:59:57 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: 
Deactivated successfully. Dec 6 04:59:57 localhost podman[240402]: 2025-12-06 09:59:57.606474636 +0000 UTC m=+0.139530023 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 04:59:57 localhost podman[240402]: 2025-12-06 09:59:57.612225219 +0000 UTC m=+0.145280596 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': 
['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 04:59:57 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 04:59:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57573 DF PROTO=TCP SPT=35776 DPT=9102 SEQ=3029299357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC8D9870000000001030307) Dec 6 05:00:00 localhost nova_compute[237281]: 2025-12-06 10:00:00.892 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46942 DF PROTO=TCP SPT=52846 DPT=9102 SEQ=677795422 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC8E5470000000001030307) Dec 6 05:00:02 localhost nova_compute[237281]: 2025-12-06 10:00:02.004 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:00:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:00:04 localhost systemd[1]: tmp-crun.7GPFn1.mount: Deactivated successfully. Dec 6 05:00:04 localhost podman[240445]: 2025-12-06 10:00:04.550418368 +0000 UTC m=+0.079719972 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Dec 6 05:00:04 localhost podman[240445]: 2025-12-06 10:00:04.560220364 +0000 UTC m=+0.089521948 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Dec 6 05:00:04 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:00:04 localhost podman[240444]: 2025-12-06 10:00:04.655944247 +0000 UTC m=+0.187023345 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:00:04 localhost podman[240444]: 2025-12-06 10:00:04.666213726 +0000 UTC m=+0.197292864 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:00:04 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:00:05 localhost nova_compute[237281]: 2025-12-06 10:00:05.894 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:00:06.680 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:00:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:00:06.680 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:00:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:00:06.681 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:00:07 localhost nova_compute[237281]: 2025-12-06 10:00:07.054 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46943 DF PROTO=TCP SPT=52846 DPT=9102 SEQ=677795422 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC905880000000001030307) Dec 6 05:00:10 localhost nova_compute[237281]: 2025-12-06 10:00:10.896 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:12 localhost 
nova_compute[237281]: 2025-12-06 10:00:12.090 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:00:12 localhost podman[240481]: 2025-12-06 10:00:12.562964461 +0000 UTC m=+0.094009442 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Dec 6 05:00:12 localhost podman[240481]: 2025-12-06 10:00:12.575638653 +0000 UTC m=+0.106683644 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, release=1755695350, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 05:00:12 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:00:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:00:14 localhost podman[240502]: 2025-12-06 10:00:14.559659057 +0000 UTC m=+0.086145176 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:00:14 localhost podman[240502]: 2025-12-06 10:00:14.566667749 +0000 UTC m=+0.093153868 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:00:14 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:00:15 localhost nova_compute[237281]: 2025-12-06 10:00:15.949 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:16 localhost openstack_network_exporter[199751]: ERROR 10:00:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:00:16 localhost openstack_network_exporter[199751]: ERROR 10:00:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:00:16 localhost openstack_network_exporter[199751]: ERROR 10:00:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:00:16 localhost openstack_network_exporter[199751]: ERROR 10:00:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:00:16 localhost openstack_network_exporter[199751]: Dec 6 05:00:16 localhost openstack_network_exporter[199751]: ERROR 10:00:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:00:16 localhost openstack_network_exporter[199751]: Dec 6 05:00:17 localhost nova_compute[237281]: 2025-12-06 10:00:17.126 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:20 localhost nova_compute[237281]: 2025-12-06 10:00:20.989 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:22 localhost nova_compute[237281]: 2025-12-06 10:00:22.150 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:22.987 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 
'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:00:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:22.987 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.025 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.026 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd499791b-6c6d-488d-8982-a3f59df36933', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:00:22.988221', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '616bd072-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.198341825, 'message_signature': '979e54b6bd780386fa865f5b39092f692b26d4159fc90e7d7f3f101274b7d69f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:00:22.988221', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '616be53a-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.198341825, 'message_signature': '266e8a2218d705a94096fe86e47b816f94d88c8ef7d138cecae8255a977a8051'}]}, 'timestamp': '2025-12-06 10:00:23.026885', '_unique_id': 'b2bf3b872772441c91d5565f0a7ebed2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.028 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.029 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.033 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84df3c1e-9003-428f-a6c0-80491e8ddc61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:00:23.029956', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '616cf902-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.240114033, 'message_signature': '39d429243809483a40f63dc7f4615478b0c325d25c051df9b509c0b16e7fa775'}]}, 'timestamp': '2025-12-06 10:00:23.034007', '_unique_id': '492e6bf351034f3cbdba7b3552e91891'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.035 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.036 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.036 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19096ab1-4e3a-44fb-b0be-57844d031afc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:00:23.036739', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '616d7f44-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.240114033, 'message_signature': '3a9fa8332b68895d698ebaa6a369265279e842baee45168f04edf2fc101f7d92'}]}, 'timestamp': '2025-12-06 10:00:23.037383', '_unique_id': '112899f704064302bbcc293e5dc7d6af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.038 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.040 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd27be889-3ce2-40e0-aa9a-cc70ba24419c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:00:23.040126', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '616dff6e-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.240114033, 'message_signature': '99d312fdec80dfe46f7fece59b6875b6f5c55335b22b05e977912a789ffc849d'}]}, 'timestamp': '2025-12-06 10:00:23.040740', '_unique_id': 'ed572e357cdd4a949d12c62328683b42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.041 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 10:00:23.043 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.043 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.043 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e21d2836-7eba-42bf-a645-1a9e564ade6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:00:23.043227', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '616e77e6-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.198341825, 'message_signature': 'c2fc58268f0bcb6980ae8bf73047ff9f36bf40bf139bd73575be966c72c5baaa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:00:23.043227', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '616e8c18-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.198341825, 'message_signature': '7c04ef6b24088cb6248a785a7bfe0930fd51bab589dec1173dc30133fe13412a'}]}, 'timestamp': '2025-12-06 10:00:23.044219', '_unique_id': '59618cc1436d4913a31dcc245a35333c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:00:23.046 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.060 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 12380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22227934-366a-44b4-b778-69a4a02841df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12380000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:00:23.046732', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '61712838-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.270790397, 'message_signature': 
'ae91f4725d61d21af689d08f685fcacbc65e66de232516c49fbf5f6244fe53c9'}]}, 'timestamp': '2025-12-06 10:00:23.061346', '_unique_id': '70902157d6344ec1963125ee583967c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR 
oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.063 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.063 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd20d0de7-6052-4354-b95a-d0a43c15b924', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:00:23.063789', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '61719c8c-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.240114033, 'message_signature': 'f74e4a77f8d584f76d128107f1bf36216fcdf686295e7249d560abb7b54749c0'}]}, 'timestamp': '2025-12-06 10:00:23.064327', '_unique_id': '54d4c796be2c4439b5848d3971971a3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR 
oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.065 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.066 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.066 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.067 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '68c56005-8af6-4f02-a0e8-9e887a155045', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:00:23.066882', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '617213ce-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.198341825, 'message_signature': '926e328b7fde29cdae70c079e20a4d1499816a6d5cd13b5cc29f51dd116459dc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:00:23.066882', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '617224c2-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.198341825, 'message_signature': '8118b8b062cb8d922b9fe398e60606cfb9c61c2609ad51acf1cae7645fd7fb27'}]}, 'timestamp': '2025-12-06 10:00:23.067778', '_unique_id': 'bc057b4f097345cd8af53d69aed2b598'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.068 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.070 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.070 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.070 12 DEBUG ceilometer.compute.pollsters [-] 
a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efb6c545-699a-436d-b5f5-06583fc65459', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:00:23.070282', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61729830-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.198341825, 'message_signature': '93cd3fd0aa28e0e6ee3be8393639a6f0d60aceddd122490f517cd199be5d4a12'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': 
'5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:00:23.070282', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6172aaaa-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.198341825, 'message_signature': '331e97b5724229110cd825763314a6a3d110b3592d76c83001a364a16b425dc5'}]}, 'timestamp': '2025-12-06 10:00:23.071226', '_unique_id': 'fc33fcee373d43d7a00a0a7a2641ec47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.072 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.073 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.073 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.074 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c2f4f83-f620-48a7-9777-052943c05667', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:00:23.073564', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '617318a0-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.198341825, 'message_signature': 'bae8fa9c31bcfe77538f5a53a28878a93284a9451247aef9428c448957e9133b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:00:23.073564', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61732b4c-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.198341825, 'message_signature': '8d3cd3cdf541fc6ccbf93f554084a8f448a8d90d595aed28d9dcb332aedfeb6d'}]}, 'timestamp': '2025-12-06 10:00:23.074505', '_unique_id': '32fd3d1560fa4eafb977c1a48bd6f316'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.075 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.076 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.076 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f749861f-6894-4075-9521-89640d4a1d50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:00:23.076929', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '61739c62-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.270790397, 'message_signature': 'e6099f03fe9acf29b8a24fe482455f91a55b15ef3da389896d4090c091c79e79'}]}, 'timestamp': '2025-12-06 10:00:23.077416', '_unique_id': 'd27ea58367f543c3b0d506e88ba8c859'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.078 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.079 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.079 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e2b6e81-e39a-4533-841e-dadf753dfe18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:00:23.079646', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '61740774-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.240114033, 'message_signature': '5f0b6bec7690c1d6dc7c1bf1358d36555021f4fc4999ab5829cac5ab63bf2bb6'}]}, 'timestamp': '2025-12-06 10:00:23.080198', '_unique_id': 'b11a06f6bb8e4285a02df0ba5ec19c1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.082 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6cf8713c-9598-455d-aa86-85c3a48201b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:00:23.082518', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '6174768c-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.240114033, 'message_signature': 'eb146e144504bc2f98d48cdcde11c3e7b1d70b08a16eebb2c3980ace41010283'}]}, 'timestamp': '2025-12-06 10:00:23.083049', '_unique_id': '889d0c4043b64ed3ad8f78d809834270'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.084 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:00:23.085 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.099 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0812c3d2-b4c3-4f1a-a140-dccbf58df41c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:00:23.085307', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6177132e-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.295425749, 'message_signature': '14457d7bc7ba34320de5a8e84d475891135d61311998e1177e787a323bd44be0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:00:23.085307', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6177295e-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.295425749, 'message_signature': 'dff4e89e905e6adf9e0fa0cb85556900481abdc9509f9d950355e3c648516468'}]}, 'timestamp': '2025-12-06 10:00:23.100763', '_unique_id': 'fe0b0809d2a94467af5b563f16ca52f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.102 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.103 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.104 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '694d6949-3504-4cdd-bcd9-173c86196289', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:00:23.103544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6177b202-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.295425749, 'message_signature': '268c8c5f2ccd14db7b437098b26725690aed8781a826e2688d8a7dafa80b8bbb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:00:23.103544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6177cc1a-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.295425749, 'message_signature': '8323cfbf5a9df2aa26f4193c6d441c0b4d10817e39b74f0c177a6d7aedd58a6b'}]}, 'timestamp': '2025-12-06 10:00:23.105027', '_unique_id': '547186c36ce54c279eba4bbf75ea95a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.106 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.108 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ddf51e7a-1d8c-4da5-84fc-05c4f222be68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:00:23.108495', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '617874a8-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.240114033, 'message_signature': '9352ab837151d43f99264fab34259d0d58277088a0434dbeb24ec25401cac049'}]}, 'timestamp': '2025-12-06 10:00:23.109313', '_unique_id': '020947bf1c0845498a4508b389b3eb7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.110 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.112 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.113 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '341eb670-c0f0-447a-91dc-a297405f31c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:00:23.112648', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61791688-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.295425749, 'message_signature': 'e0e65703529a90285a6c0c4b9cec0a58569064daa6d0e27cc75e46d6db2a73ac'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:00:23.112648', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6179303c-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.295425749, 'message_signature': 'd91316d64d51f1187e9a0effc24813350d117cb76d4ed9b01b49985e498494c6'}]}, 'timestamp': '2025-12-06 10:00:23.114103', '_unique_id': '925e75865cb941f9ba2758942fd2df27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.115 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '05a7a397-3903-41b8-8e19-5ffc9970049f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:00:23.117526', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '6179d366-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.240114033, 'message_signature': 'a05f94f9414316d7f880b0d6993eb12b9087ddd08bf4d4e19cb09ca21f435219'}]}, 'timestamp': '2025-12-06 10:00:23.118268', '_unique_id': 'a3019fe6cf2642bfa26c69a311dc6137'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:00:23.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.122 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.123 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '324fae94-6955-4a5f-85dc-d03d6ffd09c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:00:23.122349', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '617a8e14-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.198341825, 'message_signature': '48ea32f47cd3e9721cea7f92b50ce2cd527dc79236be8af54caf9e66c9045dab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:00:23.122349', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '617aa7aa-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.198341825, 'message_signature': 'c46a3f51386e4b3a0a7f08f57257173ac783aa51c373d8a5f0274bd886a14e05'}]}, 'timestamp': '2025-12-06 10:00:23.123661', '_unique_id': 'e104fe00165e45ce90cee2f0e8d6b337'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.126 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.126 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'dfee8303-f6d4-4d4c-8e1a-1b46da3c3856', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:00:23.126887', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '617b4048-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.240114033, 'message_signature': 'a6e788dc01e48999d5012ac2039b2c208d00d24f7e402c3932c4fed9e6cff138'}]}, 'timestamp': '2025-12-06 10:00:23.127604', '_unique_id': '0de39ec3e09b4bbdabce5342bd896e39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:00:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.128 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:00:23.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.130 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '968e18da-29ae-43e7-8efe-691a81441923', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:00:23.130781', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': 
None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '617bd8c8-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11610.240114033, 'message_signature': '2eb144265ccd8951c2460834c4632bbfe9a4ea14f4762afc6e8224dcf73049fb'}]}, 'timestamp': '2025-12-06 10:00:23.131507', '_unique_id': '16469614a10a46b5a4b1147dffdbb980'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:00:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:00:23.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:00:23 localhost podman[197801]: time="2025-12-06T10:00:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:00:23 localhost podman[197801]: @ - - [06/Dec/2025:10:00:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:00:23 localhost podman[197801]: @ - - [06/Dec/2025:10:00:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15468 "" "Go-http-client/1.1" Dec 6 05:00:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31354 DF PROTO=TCP SPT=49870 DPT=9102 SEQ=888968206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC93EC90000000001030307) Dec 6 05:00:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31355 DF PROTO=TCP SPT=49870 DPT=9102 SEQ=888968206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1DC942C70000000001030307) Dec 6 05:00:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:00:25 localhost systemd[1]: tmp-crun.NkS3yJ.mount: Deactivated successfully. Dec 6 05:00:25 localhost podman[240525]: 2025-12-06 10:00:25.549161387 +0000 UTC m=+0.082545988 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) 
Dec 6 05:00:25 localhost podman[240525]: 2025-12-06 10:00:25.592190613 +0000 UTC m=+0.125575254 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 05:00:25 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:00:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46944 DF PROTO=TCP SPT=52846 DPT=9102 SEQ=677795422 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC945870000000001030307) Dec 6 05:00:26 localhost nova_compute[237281]: 2025-12-06 10:00:26.038 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31356 DF PROTO=TCP SPT=49870 DPT=9102 SEQ=888968206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC94AC80000000001030307) Dec 6 05:00:27 localhost nova_compute[237281]: 2025-12-06 10:00:27.199 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61901 DF PROTO=TCP SPT=59774 DPT=9102 SEQ=3130240063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC94F870000000001030307) Dec 6 05:00:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:00:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:00:28 localhost podman[240550]: 2025-12-06 10:00:28.537068563 +0000 UTC m=+0.069698530 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:00:28 localhost podman[240550]: 2025-12-06 10:00:28.543347982 +0000 UTC m=+0.075977939 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 
'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:00:28 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:00:28 localhost podman[240551]: 2025-12-06 10:00:28.61299237 +0000 UTC m=+0.140728101 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:00:28 localhost podman[240551]: 2025-12-06 10:00:28.621685812 +0000 UTC m=+0.149421533 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 05:00:28 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:00:28 localhost nova_compute[237281]: 2025-12-06 10:00:28.663 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:00:28 localhost nova_compute[237281]: 2025-12-06 10:00:28.663 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:00:31 localhost nova_compute[237281]: 2025-12-06 10:00:31.041 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31357 DF PROTO=TCP SPT=49870 DPT=9102 SEQ=888968206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC95A870000000001030307) Dec 6 05:00:31 localhost nova_compute[237281]: 2025-12-06 10:00:31.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:00:31 localhost nova_compute[237281]: 2025-12-06 10:00:31.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:00:31 localhost nova_compute[237281]: 2025-12-06 10:00:31.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:00:32 localhost nova_compute[237281]: 2025-12-06 10:00:32.202 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:32 localhost nova_compute[237281]: 2025-12-06 10:00:32.975 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock 
"refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:00:32 localhost nova_compute[237281]: 2025-12-06 10:00:32.976 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:00:32 localhost nova_compute[237281]: 2025-12-06 10:00:32.976 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:00:32 localhost nova_compute[237281]: 2025-12-06 10:00:32.977 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:00:35 localhost nova_compute[237281]: 2025-12-06 10:00:35.006 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": 
{"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:00:35 localhost nova_compute[237281]: 2025-12-06 10:00:35.023 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:00:35 localhost nova_compute[237281]: 2025-12-06 10:00:35.024 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:00:35 localhost nova_compute[237281]: 2025-12-06 10:00:35.025 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:00:35 localhost nova_compute[237281]: 2025-12-06 10:00:35.025 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:00:35 localhost nova_compute[237281]: 2025-12-06 
10:00:35.025 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:00:35 localhost nova_compute[237281]: 2025-12-06 10:00:35.026 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:00:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:00:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:00:35 localhost podman[240591]: 2025-12-06 10:00:35.574066516 +0000 UTC m=+0.098946192 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Dec 6 05:00:35 localhost podman[240591]: 2025-12-06 10:00:35.579205694 +0000 UTC m=+0.104085420 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Dec 6 05:00:35 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:00:35 localhost podman[240592]: 2025-12-06 10:00:35.542257725 +0000 UTC m=+0.068196993 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 05:00:35 localhost podman[240592]: 2025-12-06 10:00:35.621552349 +0000 UTC m=+0.147491597 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 6 05:00:35 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:00:35 localhost nova_compute[237281]: 2025-12-06 10:00:35.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:00:35 localhost nova_compute[237281]: 2025-12-06 10:00:35.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:00:36 localhost nova_compute[237281]: 2025-12-06 10:00:36.076 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:37 localhost nova_compute[237281]: 2025-12-06 10:00:37.238 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:38 localhost nova_compute[237281]: 2025-12-06 10:00:38.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:00:38 localhost nova_compute[237281]: 2025-12-06 10:00:38.914 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:00:38 localhost nova_compute[237281]: 2025-12-06 10:00:38.914 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:00:38 localhost nova_compute[237281]: 2025-12-06 10:00:38.915 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:00:38 localhost nova_compute[237281]: 2025-12-06 10:00:38.915 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:00:38 localhost nova_compute[237281]: 2025-12-06 10:00:38.986 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.060 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.061 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.115 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.117 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.160 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.162 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.229 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.407 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.409 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12681MB free_disk=387.31069564819336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.410 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.411 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.506 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.507 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.508 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:00:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31358 DF PROTO=TCP SPT=49870 DPT=9102 SEQ=888968206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC97B870000000001030307) Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.556 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.572 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 
'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.575 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:00:39 localhost nova_compute[237281]: 2025-12-06 10:00:39.575 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:00:41 localhost nova_compute[237281]: 2025-12-06 10:00:41.100 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:42 localhost nova_compute[237281]: 2025-12-06 10:00:42.240 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:00:43 localhost podman[240642]: 2025-12-06 10:00:43.526383765 +0000 UTC m=+0.061432335 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, vcs-type=git, architecture=x86_64, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41) Dec 6 05:00:43 localhost podman[240642]: 2025-12-06 10:00:43.563001633 +0000 UTC m=+0.098050203 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.component=ubi9-minimal-container, version=9.6, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git) Dec 6 05:00:43 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:00:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:00:45 localhost podman[240663]: 2025-12-06 10:00:45.546435445 +0000 UTC m=+0.074369733 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:00:45 localhost podman[240663]: 2025-12-06 10:00:45.556191017 +0000 UTC m=+0.084125345 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:00:45 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:00:46 localhost nova_compute[237281]: 2025-12-06 10:00:46.144 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:46 localhost openstack_network_exporter[199751]: ERROR 10:00:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:00:46 localhost openstack_network_exporter[199751]: ERROR 10:00:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:00:46 localhost openstack_network_exporter[199751]: ERROR 10:00:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:00:46 localhost openstack_network_exporter[199751]: ERROR 10:00:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:00:46 localhost openstack_network_exporter[199751]: Dec 6 05:00:46 localhost openstack_network_exporter[199751]: ERROR 10:00:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:00:46 localhost openstack_network_exporter[199751]: Dec 6 05:00:47 localhost nova_compute[237281]: 2025-12-06 10:00:47.266 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:48 localhost sshd[240687]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:00:51 localhost nova_compute[237281]: 2025-12-06 10:00:51.192 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:52 localhost nova_compute[237281]: 2025-12-06 10:00:52.297 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:53 localhost podman[197801]: time="2025-12-06T10:00:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:00:53 localhost podman[197801]: @ - - [06/Dec/2025:10:00:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:00:53 localhost podman[197801]: @ - - [06/Dec/2025:10:00:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15463 "" "Go-http-client/1.1" Dec 6 05:00:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57367 DF PROTO=TCP SPT=59152 DPT=9102 SEQ=3395088089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC9B3F90000000001030307) Dec 6 05:00:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57368 DF PROTO=TCP SPT=59152 DPT=9102 SEQ=3395088089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC9B8080000000001030307) Dec 6 05:00:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31359 DF PROTO=TCP SPT=49870 DPT=9102 SEQ=888968206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC9BB870000000001030307) Dec 6 05:00:56 localhost nova_compute[237281]: 2025-12-06 10:00:56.239 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:00:56 localhost podman[240689]: 2025-12-06 10:00:56.537114668 +0000 UTC m=+0.073176848 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 6 05:00:56 localhost podman[240689]: 2025-12-06 
10:00:56.599703778 +0000 UTC m=+0.135766008 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Dec 6 05:00:56 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:00:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57369 DF PROTO=TCP SPT=59152 DPT=9102 SEQ=3395088089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC9C0070000000001030307) Dec 6 05:00:57 localhost nova_compute[237281]: 2025-12-06 10:00:57.300 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:00:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46945 DF PROTO=TCP SPT=52846 DPT=9102 SEQ=677795422 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC9C3870000000001030307) Dec 6 05:00:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:00:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:00:59 localhost podman[240714]: 2025-12-06 10:00:59.552590655 +0000 UTC m=+0.082295718 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true) Dec 6 05:00:59 localhost podman[240714]: 2025-12-06 10:00:59.588169963 +0000 UTC m=+0.117874976 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:00:59 localhost podman[240713]: 2025-12-06 10:00:59.60626627 +0000 UTC m=+0.138175711 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:00:59 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:00:59 localhost podman[240713]: 2025-12-06 10:00:59.619249001 +0000 UTC m=+0.151158422 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:00:59 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:01:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57370 DF PROTO=TCP SPT=59152 DPT=9102 SEQ=3395088089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC9CFC70000000001030307) Dec 6 05:01:01 localhost nova_compute[237281]: 2025-12-06 10:01:01.286 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:02 localhost nova_compute[237281]: 2025-12-06 10:01:02.303 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:06 localhost nova_compute[237281]: 2025-12-06 10:01:06.291 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:01:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:01:06 localhost podman[240766]: 2025-12-06 10:01:06.544057053 +0000 UTC m=+0.075698714 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, 
org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:01:06 localhost podman[240767]: 2025-12-06 10:01:06.601802633 +0000 UTC m=+0.130844084 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, managed_by=edpm_ansible) Dec 6 05:01:06 localhost podman[240767]: 2025-12-06 10:01:06.613110072 +0000 UTC m=+0.142151523 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 6 05:01:06 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:01:06 localhost podman[240766]: 2025-12-06 10:01:06.628743024 +0000 UTC m=+0.160384745 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Dec 6 05:01:06 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:01:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:01:06.681 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:01:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:01:06.681 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:01:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:01:06.682 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:01:07 localhost nova_compute[237281]: 2025-12-06 10:01:07.307 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57371 DF PROTO=TCP SPT=59152 DPT=9102 SEQ=3395088089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DC9EF870000000001030307) Dec 6 05:01:11 localhost nova_compute[237281]: 2025-12-06 10:01:11.304 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:12 localhost nova_compute[237281]: 2025-12-06 10:01:12.309 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:01:14 localhost podman[240804]: 2025-12-06 10:01:14.546683704 +0000 UTC m=+0.075561771 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6) Dec 6 05:01:14 localhost podman[240804]: 2025-12-06 10:01:14.564193054 +0000 UTC m=+0.093071151 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git) Dec 6 05:01:14 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:01:16 localhost openstack_network_exporter[199751]: ERROR 10:01:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:01:16 localhost openstack_network_exporter[199751]: ERROR 10:01:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:01:16 localhost openstack_network_exporter[199751]: ERROR 10:01:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:01:16 localhost openstack_network_exporter[199751]: ERROR 10:01:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:01:16 localhost openstack_network_exporter[199751]: Dec 6 05:01:16 localhost openstack_network_exporter[199751]: ERROR 10:01:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:01:16 localhost openstack_network_exporter[199751]: Dec 6 05:01:16 localhost nova_compute[237281]: 2025-12-06 10:01:16.305 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:01:16 localhost podman[240824]: 2025-12-06 10:01:16.552662328 +0000 UTC m=+0.086191507 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:01:16 localhost podman[240824]: 2025-12-06 10:01:16.563633866 +0000 UTC m=+0.097163085 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:01:16 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:01:17 localhost nova_compute[237281]: 2025-12-06 10:01:17.340 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:21 localhost nova_compute[237281]: 2025-12-06 10:01:21.346 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:22 localhost nova_compute[237281]: 2025-12-06 10:01:22.343 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:23 localhost podman[197801]: time="2025-12-06T10:01:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:01:23 localhost podman[197801]: @ - - [06/Dec/2025:10:01:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:01:23 localhost podman[197801]: @ - - [06/Dec/2025:10:01:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15464 "" "Go-http-client/1.1" Dec 6 05:01:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20598 DF PROTO=TCP SPT=41154 DPT=9102 SEQ=2068201522 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCA292A0000000001030307) Dec 6 05:01:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20599 DF PROTO=TCP SPT=41154 DPT=9102 SEQ=2068201522 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCA2D470000000001030307) Dec 6 05:01:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57372 DF PROTO=TCP SPT=59152 DPT=9102 SEQ=3395088089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCA2F880000000001030307) Dec 6 05:01:26 localhost nova_compute[237281]: 2025-12-06 10:01:26.385 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20600 DF PROTO=TCP SPT=41154 DPT=9102 SEQ=2068201522 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCA35470000000001030307) Dec 6 05:01:27 localhost nova_compute[237281]: 2025-12-06 10:01:27.346 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:01:27 localhost systemd[1]: tmp-crun.q1LK1u.mount: Deactivated successfully. 
Dec 6 05:01:27 localhost podman[240846]: 2025-12-06 10:01:27.547669857 +0000 UTC m=+0.077328345 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:01:27 localhost podman[240846]: 2025-12-06 10:01:27.635955889 +0000 UTC m=+0.165614367 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:01:27 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:01:27 localhost nova_compute[237281]: 2025-12-06 10:01:27.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31360 DF PROTO=TCP SPT=49870 DPT=9102 SEQ=888968206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCA39870000000001030307) Dec 6 05:01:28 localhost nova_compute[237281]: 2025-12-06 10:01:28.901 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:28 localhost nova_compute[237281]: 2025-12-06 10:01:28.901 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:01:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:01:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:01:30 localhost systemd[1]: tmp-crun.AmE37w.mount: Deactivated successfully. 
Dec 6 05:01:30 localhost podman[240872]: 2025-12-06 10:01:30.526668 +0000 UTC m=+0.062698705 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 05:01:30 localhost podman[240871]: 2025-12-06 10:01:30.537443842 +0000 UTC m=+0.071318500 container health_status 
4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:01:30 localhost podman[240871]: 2025-12-06 10:01:30.540743364 +0000 UTC m=+0.074618022 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 
'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:01:30 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:01:30 localhost podman[240872]: 2025-12-06 10:01:30.590970923 +0000 UTC m=+0.127001618 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd) Dec 6 05:01:30 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:01:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20601 DF PROTO=TCP SPT=41154 DPT=9102 SEQ=2068201522 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCA45070000000001030307) Dec 6 05:01:31 localhost nova_compute[237281]: 2025-12-06 10:01:31.412 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:32 localhost nova_compute[237281]: 2025-12-06 10:01:32.349 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:32 localhost nova_compute[237281]: 2025-12-06 10:01:32.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:32 localhost nova_compute[237281]: 2025-12-06 10:01:32.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:32 localhost nova_compute[237281]: 2025-12-06 10:01:32.911 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:32 localhost nova_compute[237281]: 2025-12-06 10:01:32.911 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:33 localhost nova_compute[237281]: 2025-12-06 10:01:33.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:33 localhost nova_compute[237281]: 2025-12-06 10:01:33.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:01:33 localhost nova_compute[237281]: 2025-12-06 10:01:33.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:01:34 localhost nova_compute[237281]: 2025-12-06 10:01:34.115 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:01:34 localhost 
nova_compute[237281]: 2025-12-06 10:01:34.115 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:01:34 localhost nova_compute[237281]: 2025-12-06 10:01:34.116 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:01:34 localhost nova_compute[237281]: 2025-12-06 10:01:34.116 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:01:35 localhost nova_compute[237281]: 2025-12-06 10:01:35.318 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:01:35 localhost nova_compute[237281]: 2025-12-06 10:01:35.334 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:01:35 localhost nova_compute[237281]: 2025-12-06 10:01:35.334 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:01:35 localhost nova_compute[237281]: 2025-12-06 10:01:35.335 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:35 localhost nova_compute[237281]: 2025-12-06 10:01:35.335 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:35 localhost nova_compute[237281]: 2025-12-06 10:01:35.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task 
ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:35 localhost nova_compute[237281]: 2025-12-06 10:01:35.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 6 05:01:35 localhost nova_compute[237281]: 2025-12-06 10:01:35.905 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 6 05:01:36 localhost nova_compute[237281]: 2025-12-06 10:01:36.447 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:37 localhost nova_compute[237281]: 2025-12-06 10:01:37.350 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:01:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:01:37 localhost podman[240912]: 2025-12-06 10:01:37.548963387 +0000 UTC m=+0.082503684 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team) Dec 6 05:01:37 localhost podman[240912]: 2025-12-06 10:01:37.564320501 +0000 UTC m=+0.097860778 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:01:37 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:01:37 localhost podman[240911]: 2025-12-06 10:01:37.654998027 +0000 UTC m=+0.190187185 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:01:37 localhost podman[240911]: 2025-12-06 10:01:37.689397078 +0000 UTC m=+0.224586186 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:01:37 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:01:37 localhost nova_compute[237281]: 2025-12-06 10:01:37.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:37 localhost nova_compute[237281]: 2025-12-06 10:01:37.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:37 localhost nova_compute[237281]: 2025-12-06 10:01:37.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 6 05:01:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20602 DF PROTO=TCP 
SPT=41154 DPT=9102 SEQ=2068201522 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCA65880000000001030307) Dec 6 05:01:39 localhost nova_compute[237281]: 2025-12-06 10:01:39.942 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:01:39 localhost nova_compute[237281]: 2025-12-06 10:01:39.959 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:01:39 localhost nova_compute[237281]: 2025-12-06 10:01:39.960 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:01:39 localhost nova_compute[237281]: 2025-12-06 10:01:39.961 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:01:39 localhost nova_compute[237281]: 2025-12-06 10:01:39.961 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 
2025-12-06 10:01:40.041 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.103 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.104 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.142 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.143 237285 DEBUG 
oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.179 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.180 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.217 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.426 237285 WARNING nova.virt.libvirt.driver 
[None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.428 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12687MB free_disk=387.3106918334961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.429 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.429 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.600 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.601 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.601 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.685 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing inventories for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.782 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating ProviderTree inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 6 05:01:40 
localhost nova_compute[237281]: 2025-12-06 10:01:40.783 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.800 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing aggregate associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.825 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing trait associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, traits: 
COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_ABM,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.879 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.896 237285 DEBUG nova.scheduler.client.report [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.898 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:01:40 localhost nova_compute[237281]: 2025-12-06 10:01:40.899 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.470s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:01:41 localhost nova_compute[237281]: 2025-12-06 10:01:41.451 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:42 localhost nova_compute[237281]: 2025-12-06 10:01:42.379 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:01:45 localhost systemd[1]: tmp-crun.XbAvQP.mount: Deactivated successfully. 
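The inventory payloads logged above by nova.scheduler.client.report feed the Placement service, which derives schedulable capacity per resource class as (total - reserved) * allocation_ratio. A minimal sketch of that arithmetic (not Nova code; the dict values are copied from the log entry above):

```python
# Inventory record as logged by _refresh_and_get_inventory above,
# trimmed to the fields that matter for capacity math.
inventory = {
    'VCPU': {'total': 8, 'reserved': 0, 'allocation_ratio': 16.0},
    'MEMORY_MB': {'total': 15738, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 399, 'reserved': 1, 'allocation_ratio': 1.0},
}

def capacity(inv: dict) -> dict:
    """Schedulable capacity per resource class:
    (total - reserved) * allocation_ratio."""
    return {
        rc: (v['total'] - v['reserved']) * v['allocation_ratio']
        for rc, v in inv.items()
    }

print(capacity(inventory))
# e.g. VCPU: (8 - 0) * 16.0 = 128.0 schedulable vCPUs against
# the 1 vCPU shown as allocated in the "Final resource view" line.
```

This is why the host reports only 8 physical vCPUs but can accept far more instance vCPUs: the 16.0 CPU allocation ratio in the logged inventory oversubscribes them.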
Dec 6 05:01:45 localhost podman[240960]: 2025-12-06 10:01:45.567056089 +0000 UTC m=+0.094427583 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 6 05:01:45 localhost podman[240960]: 2025-12-06 10:01:45.581311068 +0000 UTC m=+0.108682572 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat 
Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, container_name=openstack_network_exporter, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc.) Dec 6 05:01:45 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:01:46 localhost openstack_network_exporter[199751]: ERROR 10:01:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:01:46 localhost openstack_network_exporter[199751]: ERROR 10:01:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:01:46 localhost openstack_network_exporter[199751]: ERROR 10:01:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:01:46 localhost openstack_network_exporter[199751]: ERROR 10:01:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:01:46 localhost openstack_network_exporter[199751]: Dec 6 05:01:46 localhost openstack_network_exporter[199751]: ERROR 10:01:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:01:46 localhost openstack_network_exporter[199751]: Dec 6 05:01:46 localhost nova_compute[237281]: 2025-12-06 10:01:46.473 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:47 localhost nova_compute[237281]: 2025-12-06 10:01:47.382 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:01:47 localhost podman[240980]: 2025-12-06 10:01:47.538729884 +0000 UTC m=+0.075615342 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 05:01:47 localhost podman[240980]: 2025-12-06 10:01:47.54735345 +0000 UTC m=+0.084238978 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:01:47 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:01:51 localhost nova_compute[237281]: 2025-12-06 10:01:51.476 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:52 localhost nova_compute[237281]: 2025-12-06 10:01:52.383 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:53 localhost podman[197801]: time="2025-12-06T10:01:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:01:53 localhost podman[197801]: @ - - [06/Dec/2025:10:01:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:01:53 localhost podman[197801]: @ - - [06/Dec/2025:10:01:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15458 "" "Go-http-client/1.1" Dec 6 05:01:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10915 DF PROTO=TCP SPT=52886 DPT=9102 SEQ=2671048599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCA9E5A0000000001030307) Dec 6 05:01:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10916 DF PROTO=TCP SPT=52886 DPT=9102 SEQ=2671048599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCAA2470000000001030307) Dec 6 05:01:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20603 DF PROTO=TCP SPT=41154 DPT=9102 SEQ=2068201522 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCAA5870000000001030307) Dec 6 
05:01:56 localhost nova_compute[237281]: 2025-12-06 10:01:56.518 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10917 DF PROTO=TCP SPT=52886 DPT=9102 SEQ=2671048599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCAAA470000000001030307) Dec 6 05:01:57 localhost nova_compute[237281]: 2025-12-06 10:01:57.386 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:01:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57373 DF PROTO=TCP SPT=59152 DPT=9102 SEQ=3395088089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCAAD870000000001030307) Dec 6 05:01:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
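The kernel `DROPPING:` entries above are netfilter LOG-rule output: a `DROPPING:` prefix followed by `KEY=VALUE` pairs and bare flags (`DF`, `SYN`). A minimal sketch of pulling the fields out of such a line by splitting on whitespace (the sample line is abridged from the entries above; this is an illustrative parser, not part of any tool in the log):

```python
# Abridged copy of a kernel netfilter LOG line from the entries above.
line = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da "
        "MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 "
        "DST=192.168.122.106 LEN=60 TTL=62 ID=10915 DF PROTO=TCP "
        "SPT=52886 DPT=9102 SYN")

def parse_netfilter_log(entry: str) -> dict:
    """Collect KEY=VALUE tokens into a dict; bare flag tokens
    such as DF or SYN become True."""
    fields = {}
    for tok in entry.split()[1:]:          # skip the "DROPPING:" prefix
        key, sep, val = tok.partition('=')
        fields[key] = val if sep else True
    return fields

f = parse_netfilter_log(line)
print(f['SRC'], f['DST'], f['DPT'], f.get('SYN'))
# 192.168.122.10 192.168.122.106 9102 True
```

Parsed this way, the repeated entries read as 192.168.122.10 retrying TCP SYNs to port 9102 on 192.168.122.106 (same SEQ, incrementing IP ID) and being dropped on br-ex.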
Dec 6 05:01:58 localhost podman[241004]: 2025-12-06 10:01:58.548818494 +0000 UTC m=+0.081133333 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:01:58 localhost podman[241004]: 2025-12-06 10:01:58.613277701 +0000 UTC m=+0.145592510 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:01:58 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:02:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10918 DF PROTO=TCP SPT=52886 DPT=9102 SEQ=2671048599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCABA080000000001030307) Dec 6 05:02:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:02:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:02:01 localhost nova_compute[237281]: 2025-12-06 10:02:01.522 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:01 localhost podman[241030]: 2025-12-06 10:02:01.546663907 +0000 UTC m=+0.078323686 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:02:01 localhost podman[241031]: 2025-12-06 10:02:01.605075568 +0000 UTC m=+0.134009803 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS) Dec 6 05:02:01 localhost podman[241031]: 2025-12-06 10:02:01.614397015 +0000 UTC m=+0.143331270 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:02:01 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:02:01 localhost podman[241030]: 2025-12-06 10:02:01.633174634 +0000 UTC m=+0.164834423 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:02:01 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:02:02 localhost nova_compute[237281]: 2025-12-06 10:02:02.389 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:06 localhost nova_compute[237281]: 2025-12-06 10:02:06.564 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:02:06.682 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:02:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:02:06.682 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:02:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:02:06.683 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:02:07 localhost nova_compute[237281]: 2025-12-06 10:02:07.078 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:02:07 localhost nova_compute[237281]: 2025-12-06 10:02:07.390 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:07 localhost nova_compute[237281]: 
2025-12-06 10:02:07.612 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Triggering sync for uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Dec 6 05:02:07 localhost nova_compute[237281]: 2025-12-06 10:02:07.613 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:02:07 localhost nova_compute[237281]: 2025-12-06 10:02:07.614 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:02:07 localhost nova_compute[237281]: 2025-12-06 10:02:07.694 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:02:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:02:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:02:08 localhost podman[241072]: 2025-12-06 10:02:08.552019414 +0000 UTC m=+0.080652447 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:02:08 localhost podman[241073]: 2025-12-06 10:02:08.60768696 +0000 UTC m=+0.131850676 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true) Dec 6 05:02:08 localhost podman[241073]: 2025-12-06 10:02:08.620507956 +0000 UTC m=+0.144671642 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3) Dec 6 05:02:08 localhost podman[241072]: 2025-12-06 10:02:08.633790905 +0000 UTC m=+0.162423938 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Dec 6 05:02:08 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:02:08 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:02:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10919 DF PROTO=TCP SPT=52886 DPT=9102 SEQ=2671048599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCAD9870000000001030307) Dec 6 05:02:11 localhost nova_compute[237281]: 2025-12-06 10:02:11.594 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:12 localhost nova_compute[237281]: 2025-12-06 10:02:12.392 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:16 localhost openstack_network_exporter[199751]: ERROR 10:02:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:02:16 localhost openstack_network_exporter[199751]: ERROR 10:02:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:02:16 localhost openstack_network_exporter[199751]: ERROR 10:02:16 appctl.go:144: Failed to get PID for ovn-northd: no 
control socket files found for ovn-northd Dec 6 05:02:16 localhost openstack_network_exporter[199751]: ERROR 10:02:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:02:16 localhost openstack_network_exporter[199751]: Dec 6 05:02:16 localhost openstack_network_exporter[199751]: ERROR 10:02:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:02:16 localhost openstack_network_exporter[199751]: Dec 6 05:02:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:02:16 localhost systemd[1]: tmp-crun.ER0X8j.mount: Deactivated successfully. Dec 6 05:02:16 localhost podman[241108]: 2025-12-06 10:02:16.568484433 +0000 UTC m=+0.091566205 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Dec 6 05:02:16 localhost podman[241108]: 2025-12-06 10:02:16.581167403 +0000 UTC m=+0.104249175 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64) Dec 6 05:02:16 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:02:16 localhost nova_compute[237281]: 2025-12-06 10:02:16.625 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:17 localhost nova_compute[237281]: 2025-12-06 10:02:17.395 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:02:18 localhost podman[241127]: 2025-12-06 10:02:18.549913961 +0000 UTC m=+0.085998902 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:02:18 localhost podman[241127]: 2025-12-06 10:02:18.561528259 +0000 UTC m=+0.097613240 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:02:18 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:02:21 localhost nova_compute[237281]: 2025-12-06 10:02:21.656 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:22 localhost nova_compute[237281]: 2025-12-06 10:02:22.399 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:22.987 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:02:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:22.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.025 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.026 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e81989a-6a33-443f-9f6f-326c0619a0ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:02:22.988169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8f264d8-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.198261304, 'message_signature': '709d61a9f9ed0b77f4b1c93d978dccea09d312aeb96619ff80e8a612dfb5f610'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': 
'47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:02:22.988169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8f2739c-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.198261304, 'message_signature': '743159c4a595157266c08039a95ec90a844f8e8a0c3665ccb87abfb73012fe06'}]}, 'timestamp': '2025-12-06 10:02:23.026879', '_unique_id': '59a5f8af6a554aaa9d3ec4c719ed235e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 
05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.027 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.031 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '79eb710e-5bc7-4369-ac23-5bc98aa1b4c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:02:23.028869', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'a8f3523a-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.238983529, 'message_signature': '7035b86f1e255c8315703f3304c4b9bddc7367228c0b2ec4bf4243551e138d70'}]}, 'timestamp': '2025-12-06 10:02:23.032611', '_unique_id': '3d45bbf635a143638c96552dbe852be8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.033 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:02:23.034 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.035 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.035 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b76aebe-c7fe-430f-b0e3-f6ad3204c312', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:02:23.035011', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8f3c062-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.198261304, 'message_signature': 'f83d7269aa643366a08d397ff72c2552b8cbfc54f597249c47c94470ffa0b26a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:02:23.035011', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8f3cc6a-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.198261304, 'message_signature': 'd93dd54b5c1e04fcfb448fe76d6e37e33834a9cc01ab9649408fd210101a528a'}]}, 'timestamp': '2025-12-06 10:02:23.035648', '_unique_id': 'ee602d71b68a4cd69b70ac04a0ab3c87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:02:23.036 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.036 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.037 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.037 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '859e3e2f-0d22-487e-a41b-6d5accb024cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:02:23.037602', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'a8f4253e-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.238983529, 'message_signature': '80075b8bac118e8b7c755f2535fc165a774dbe62e019b9579de4eeda26bd5692'}]}, 'timestamp': '2025-12-06 10:02:23.037978', '_unique_id': 'c8b95a3c8a0b4a5787991cb4b097192c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.038 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:02:23.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.039 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5bb494f-03ac-44cd-9edf-0609d880111e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:02:23.039759', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': 
None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'a8f47a70-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.238983529, 'message_signature': '25581b7da756e260190855a2cfd394c4091453e89d859a355dcb992badf722f5'}]}, 'timestamp': '2025-12-06 10:02:23.040165', '_unique_id': 'b9d5239fcbfe4ce7b7b0163e9628bbbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.040 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.041 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.056 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.056 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fcea6503-142d-4b12-bb4e-6465c63ca1d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:02:23.041698', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8f707a4-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.251805485, 'message_signature': '1fd8838828c39bfd15c36e8790c29047a2edd88b1b956f03f792dcc1b4471e9e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:02:23.041698', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8f71c08-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.251805485, 'message_signature': '54786ede6733b412ce76f3a3fe1e7d50dd860ad9e3a43a756facd364197c382d'}]}, 'timestamp': '2025-12-06 10:02:23.057421', '_unique_id': 'eceaa0ac74d642949906f46319edb35b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.058 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.059 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.060 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5790b6f9-4a55-4e15-bf6b-fa36f202d26d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:02:23.060092', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'a8f79638-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.238983529, 'message_signature': '715a1961f5636cf63e8c045423817cad771065b7c2709f897a0ca70c08b6672b'}]}, 'timestamp': '2025-12-06 10:02:23.060568', '_unique_id': 'fed2db59411945438cc8892b1d9ced86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.061 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:02:23.062 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.080 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9951e8aa-a26f-467b-ba39-ba3c98c71118', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:02:23.062743', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a8fac484-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.290634182, 'message_signature': 
'fde2c7b9de39787a8cbb5f17ef4de104681a8f8435efa010669f3a13b8b4e7b5'}]}, 'timestamp': '2025-12-06 10:02:23.081495', '_unique_id': '7d66d2c3d87c4ea79dd2070c418b44ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR 
oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.082 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.084 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1451e6f2-2419-4d46-98a4-da8576a977cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:02:23.084369', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'a8fb4b98-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.238983529, 'message_signature': '56ef201a3e309bbd97f0a50163588e3e6b07a0fe07d502229b17500aefff27b2'}]}, 'timestamp': '2025-12-06 10:02:23.084910', '_unique_id': 'c9eb8179ae0f4223b43ea35201b79b47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging return 
fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.085 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.087 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.087 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.087 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2346bd08-47a8-4161-b3d1-0ed04f5317ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:02:23.087187', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8fbb8c6-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.198261304, 'message_signature': '3ac28cefc81c3568a29d0447283dacc1de2bc3911da3628fbba04fa531052da9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:02:23.087187', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8fbcab4-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.198261304, 'message_signature': '3b13f75d4da05423d96f840d32ec286596339ff2e8ea9989d1778d89bf14e42a'}]}, 'timestamp': '2025-12-06 10:02:23.088104', '_unique_id': 'df5d91d3aebc4ad888c74f5823c82a34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.089 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.090 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.090 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.090 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging [-] Could 
not send notification to notifications. Payload={'message_id': '097080a3-3098-4c96-aa73-c8e83cbbe202', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:02:23.090281', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8fc30a8-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.198261304, 'message_signature': 'd7cec09a558024a7552a0b05572ef56fe0a96a7980ede589b3458a0be35c44bc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:02:23.090281', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8fc43d6-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.198261304, 'message_signature': 'e1e1aeec3dd8bb0802cbb0f196035b8f751f27135d81ee2b31cb7bfff036e96d'}]}, 'timestamp': '2025-12-06 10:02:23.091230', '_unique_id': 'b70de81d5dec42d787b9701ea428fd93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:02:23.092 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.092 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.093 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd893eca8-dfdc-4d02-b4c3-60aa4eb0b356', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:02:23.093508', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'a8fcaee8-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.238983529, 'message_signature': '22a5ebfaed40ae838dfc78131229ef8a42e46c58368b1901027b0dd2e0e6cd53'}]}, 'timestamp': '2025-12-06 10:02:23.093999', '_unique_id': 'ae8f1932e3c54585ab0ad36be62f6414'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.094 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 10:02:23.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.096 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.096 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e97b0af5-7533-4fcd-89a0-896f72b68baf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:02:23.096105', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8fd1432-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.198261304, 'message_signature': '1a6afd9fa41ceba0f24fc0ccc07cda77284af0d8e9824d624913b7768f643dac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:02:23.096105', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8fd2422-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.198261304, 'message_signature': '06aba39cda8f5fa4b29aa5a105a35b788750c8a37616f5a7219ecf00dd485e50'}]}, 'timestamp': '2025-12-06 10:02:23.096964', '_unique_id': 'd4b537b44f1b4804a4919cbc41cca885'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.097 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 10:02:23.099 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.099 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc134462-3421-45db-9968-9a0359dba170', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:02:23.099147', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'a8fd8b24-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.238983529, 'message_signature': 'fa9975056028c6f03790b8ccecde207abf9915c06aa49363ab5e1d2f2d208f3a'}]}, 'timestamp': '2025-12-06 10:02:23.099600', '_unique_id': '057cbbc1201e46c99225e6b9093d7908'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.100 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.101 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.101 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.102 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2e32ffcf-469e-49be-a879-6e470c74f0ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:02:23.101671', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8fdeee8-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.251805485, 'message_signature': '7c7736451fc38191236a0c7fe5107b929adcd393808d7119939982815da35f2f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:02:23.101671', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8fdfeec-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.251805485, 'message_signature': '083964565b0f4e58f32f18a4c0f128cfb5d2fbfac32e10be9a6e197fb50dc5d1'}]}, 'timestamp': '2025-12-06 10:02:23.102538', '_unique_id': '094d59ca92224915ba7bdade701fd16f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.103 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.104 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.105 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '24f2a598-8685-46e8-893b-4bcbd45302e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:02:23.104675', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8fe63f0-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.198261304, 'message_signature': '57cd06f3ffec94c830e54f86e0b934461db406584bdc7866c71ee14d0c4d2127'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:02:23.104675', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8fe73e0-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.198261304, 'message_signature': 'ab692315fb3f845ccc5239fe37c391ec60b16eb43209f2b77a08ebe3339973cb'}]}, 'timestamp': '2025-12-06 10:02:23.105530', '_unique_id': 'c587bd897dc948cfb11380d12a2c0e64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:02:23.106 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.106 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.107 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0db3823-c22a-4756-be42-bd945907fb6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:02:23.107683', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'a8fed7b8-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.238983529, 'message_signature': 'b35b057b244f55b3f9451fdc1e494c9a6a31ec2ea3615e74429d3ad4c6b4639e'}]}, 'timestamp': '2025-12-06 10:02:23.108041', '_unique_id': '35f1d99b1ab64e15a8f2c7314797530f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.108 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a69bbf4-ba15-4eaf-897a-65b9d247911c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:02:23.109534', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'a8ff1dcc-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.238983529, 'message_signature': '611b42ddcbc2117d0792f1675bbd0eedd1fb9d290feeefc2a7a09df3629368aa'}]}, 'timestamp': '2025-12-06 10:02:23.109832', '_unique_id': '9edb2a168aef4d7a9b5d549bd8f2745b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.110 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.111 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.111 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 12980000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2db70281-ed16-4dde-b6df-dd55b39d0dc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12980000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:02:23.111428', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a8ff67e6-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.290634182, 'message_signature': 'b930196f0bcd03200f4a3a68c302d8f7cabbfba2eeeef417c4148848e841e7e9'}]}, 'timestamp': '2025-12-06 10:02:23.111722', '_unique_id': '6d3c500a47cc41809f22a9b624dbae97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06
10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify 
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.113 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a98bc6f9-238c-4f97-822a-c935476d1b9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:02:23.113199', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'a8ffad00-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.238983529, 'message_signature': '11f887a38139b886246d818fcc91bc57c1994c4ca44733c88c2b102f0ce5558e'}]}, 'timestamp': '2025-12-06 10:02:23.113498', '_unique_id': 'ad3903dbd2294bc8b9ea30a0acd1693f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging 
self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging return self._send(target, 
ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, 
purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.114 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.115 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'eb8b0de3-8b44-477f-b873-ad071c8b1f02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:02:23.114901', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8ffef86-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.251805485, 'message_signature': 'd29e17a209c05293d8d6b93e959dff36623a1cce51116dc7688297f9c4031337'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:02:23.114901', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 
'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8fffa26-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11730.251805485, 'message_signature': '3dca10b8f6b18aef16e1355ccfbdcc08fc4a68723a3777e0c93f863c07a81f57'}]}, 'timestamp': '2025-12-06 10:02:23.115453', '_unique_id': '69021cec733a4958ae6f4fae47d0ed1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:02:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:02:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:02:23.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:02:23 localhost podman[197801]: time="2025-12-06T10:02:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:02:23 localhost podman[197801]: @ - - [06/Dec/2025:10:02:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:02:23 localhost podman[197801]: @ - - [06/Dec/2025:10:02:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15459 "" "Go-http-client/1.1" Dec 6 05:02:23 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21491 DF PROTO=TCP SPT=38448 DPT=9102 SEQ=2666116684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCB13890000000001030307) Dec 6 05:02:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21492 DF PROTO=TCP SPT=38448 DPT=9102 SEQ=2666116684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCB17880000000001030307) Dec 6 05:02:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10920 DF PROTO=TCP SPT=52886 DPT=9102 SEQ=2671048599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCB19870000000001030307) Dec 6 05:02:26 localhost nova_compute[237281]: 2025-12-06 10:02:26.699 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21493 DF PROTO=TCP SPT=38448 DPT=9102 SEQ=2666116684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCB1F880000000001030307) Dec 6 05:02:27 localhost nova_compute[237281]: 2025-12-06 10:02:27.401 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20604 DF PROTO=TCP SPT=41154 DPT=9102 SEQ=2068201522 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1DCB23870000000001030307) Dec 6 05:02:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:02:29 localhost podman[241150]: 2025-12-06 10:02:29.544137454 +0000 UTC m=+0.080150862 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Dec 6 05:02:29 localhost podman[241150]: 2025-12-06 10:02:29.580523776 +0000 UTC 
m=+0.116537194 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3) Dec 6 05:02:29 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:02:30 localhost nova_compute[237281]: 2025-12-06 10:02:30.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:02:30 localhost nova_compute[237281]: 2025-12-06 10:02:30.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:02:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21494 DF PROTO=TCP SPT=38448 DPT=9102 SEQ=2666116684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCB2F480000000001030307) Dec 6 05:02:31 localhost nova_compute[237281]: 2025-12-06 10:02:31.744 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:32 localhost nova_compute[237281]: 2025-12-06 10:02:32.404 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:02:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:02:32 localhost podman[241175]: 2025-12-06 10:02:32.566240086 +0000 UTC m=+0.091010127 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:02:32 localhost podman[241175]: 2025-12-06 10:02:32.571734216 +0000 UTC m=+0.096504257 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:02:32 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:02:32 localhost systemd[1]: tmp-crun.txLQlB.mount: Deactivated successfully. Dec 6 05:02:32 localhost podman[241176]: 2025-12-06 10:02:32.629570448 +0000 UTC m=+0.150411257 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 05:02:32 localhost podman[241176]: 2025-12-06 10:02:32.64322784 +0000 UTC m=+0.164068689 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:02:32 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:02:32 localhost nova_compute[237281]: 2025-12-06 10:02:32.882 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:02:32 localhost nova_compute[237281]: 2025-12-06 10:02:32.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:02:33 localhost nova_compute[237281]: 2025-12-06 10:02:33.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:02:34 localhost nova_compute[237281]: 2025-12-06 10:02:34.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:02:34 localhost nova_compute[237281]: 2025-12-06 10:02:34.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:02:34 localhost nova_compute[237281]: 2025-12-06 10:02:34.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:02:35 localhost nova_compute[237281]: 2025-12-06 10:02:35.200 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:02:35 localhost nova_compute[237281]: 2025-12-06 10:02:35.200 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:02:35 localhost nova_compute[237281]: 2025-12-06 10:02:35.201 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:02:35 localhost nova_compute[237281]: 2025-12-06 10:02:35.201 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m 
Dec 6 05:02:36 localhost nova_compute[237281]: 2025-12-06 10:02:36.791 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:37 localhost nova_compute[237281]: 2025-12-06 10:02:37.407 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21495 DF PROTO=TCP SPT=38448 DPT=9102 SEQ=2666116684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCB4F870000000001030307) Dec 6 05:02:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:02:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:02:39 localhost podman[241219]: 2025-12-06 10:02:39.568056773 +0000 UTC m=+0.088150088 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:02:39 localhost podman[241219]: 2025-12-06 10:02:39.598821322 +0000 UTC m=+0.118914617 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:02:39 localhost podman[241220]: 2025-12-06 10:02:39.612116062 +0000 UTC m=+0.129282467 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:02:39 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:02:39 localhost podman[241220]: 2025-12-06 10:02:39.621044907 +0000 UTC m=+0.138211332 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 6 05:02:39 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:02:39 localhost nova_compute[237281]: 2025-12-06 10:02:39.746 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:02:40 localhost nova_compute[237281]: 2025-12-06 10:02:40.359 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:02:40 localhost nova_compute[237281]: 2025-12-06 10:02:40.360 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:02:40 localhost nova_compute[237281]: 2025-12-06 10:02:40.361 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:02:40 localhost nova_compute[237281]: 2025-12-06 10:02:40.361 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:02:40 localhost nova_compute[237281]: 2025-12-06 10:02:40.362 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 
05:02:40 localhost nova_compute[237281]: 2025-12-06 10:02:40.884 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:02:40 localhost nova_compute[237281]: 2025-12-06 10:02:40.911 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:02:40 localhost nova_compute[237281]: 2025-12-06 10:02:40.911 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:02:40 localhost nova_compute[237281]: 2025-12-06 10:02:40.912 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:02:40 localhost nova_compute[237281]: 2025-12-06 10:02:40.912 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:02:40 localhost nova_compute[237281]: 2025-12-06 10:02:40.988 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] 
Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.063 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.064 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.120 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.121 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.196 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.197 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.272 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.505 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.507 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12680MB free_disk=387.3106918334961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.508 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.508 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.601 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.601 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.602 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.670 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.684 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:02:41 localhost 
nova_compute[237281]: 2025-12-06 10:02:41.687 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.687 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:02:41 localhost nova_compute[237281]: 2025-12-06 10:02:41.838 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:42 localhost nova_compute[237281]: 2025-12-06 10:02:42.409 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:44 localhost sshd[241265]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:02:46 localhost openstack_network_exporter[199751]: ERROR 10:02:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:02:46 localhost openstack_network_exporter[199751]: ERROR 10:02:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:02:46 localhost openstack_network_exporter[199751]: ERROR 10:02:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:02:46 localhost openstack_network_exporter[199751]: ERROR 10:02:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:02:46 localhost openstack_network_exporter[199751]: Dec 6 05:02:46 localhost 
openstack_network_exporter[199751]: ERROR 10:02:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:02:46 localhost openstack_network_exporter[199751]: Dec 6 05:02:46 localhost nova_compute[237281]: 2025-12-06 10:02:46.884 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:47 localhost nova_compute[237281]: 2025-12-06 10:02:47.411 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:02:47 localhost systemd[1]: tmp-crun.B2dlDT.mount: Deactivated successfully. Dec 6 05:02:47 localhost podman[241267]: 2025-12-06 10:02:47.547804569 +0000 UTC m=+0.084690793 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible) Dec 6 05:02:47 localhost podman[241267]: 2025-12-06 10:02:47.58613479 +0000 UTC m=+0.123021014 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41) Dec 6 05:02:47 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:02:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:02:49 localhost systemd[1]: tmp-crun.kFVdEw.mount: Deactivated successfully. 
Dec 6 05:02:49 localhost podman[241287]: 2025-12-06 10:02:49.599716629 +0000 UTC m=+0.130084561 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:02:49 localhost podman[241287]: 2025-12-06 10:02:49.609139709 +0000 UTC m=+0.139507591 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:02:49 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:02:51 localhost nova_compute[237281]: 2025-12-06 10:02:51.919 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:52 localhost nova_compute[237281]: 2025-12-06 10:02:52.413 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:53 localhost podman[197801]: time="2025-12-06T10:02:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:02:53 localhost podman[197801]: @ - - [06/Dec/2025:10:02:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:02:53 localhost podman[197801]: @ - - [06/Dec/2025:10:02:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15458 "" "Go-http-client/1.1" Dec 6 05:02:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3412 DF PROTO=TCP SPT=50086 DPT=9102 SEQ=1307104248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCB88B90000000001030307) Dec 6 05:02:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3413 DF PROTO=TCP SPT=50086 DPT=9102 SEQ=1307104248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCB8CC80000000001030307) Dec 6 05:02:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21496 DF PROTO=TCP SPT=38448 DPT=9102 SEQ=2666116684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCB8F880000000001030307) Dec 6 
05:02:56 localhost nova_compute[237281]: 2025-12-06 10:02:56.953 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3414 DF PROTO=TCP SPT=50086 DPT=9102 SEQ=1307104248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCB94C70000000001030307) Dec 6 05:02:57 localhost nova_compute[237281]: 2025-12-06 10:02:57.414 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:02:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10921 DF PROTO=TCP SPT=52886 DPT=9102 SEQ=2671048599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCB97870000000001030307) Dec 6 05:03:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:03:00 localhost podman[241310]: 2025-12-06 10:03:00.551471973 +0000 UTC m=+0.088197600 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible) Dec 6 05:03:00 localhost podman[241310]: 2025-12-06 10:03:00.582251212 +0000 UTC m=+0.118976879 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, container_name=ovn_controller, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:03:00 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:03:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3415 DF PROTO=TCP SPT=50086 DPT=9102 SEQ=1307104248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCBA4870000000001030307) Dec 6 05:03:02 localhost nova_compute[237281]: 2025-12-06 10:03:02.004 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:02 localhost nova_compute[237281]: 2025-12-06 10:03:02.416 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:03:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:03:03 localhost podman[241335]: 2025-12-06 10:03:03.241532518 +0000 UTC m=+0.084118434 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:03:03 localhost podman[241335]: 2025-12-06 10:03:03.247281336 +0000 UTC m=+0.089867322 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:03:03 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:03:03 localhost systemd[1]: tmp-crun.mnFbi5.mount: Deactivated successfully. Dec 6 05:03:03 localhost podman[241336]: 2025-12-06 10:03:03.300813656 +0000 UTC m=+0.139818422 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:03:03 localhost podman[241336]: 2025-12-06 10:03:03.314311162 +0000 UTC m=+0.153315948 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:03:03 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:03:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:03:06.682 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:03:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:03:06.683 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:03:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:03:06.684 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:03:07 localhost nova_compute[237281]: 2025-12-06 10:03:07.034 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 
6 05:03:07 localhost nova_compute[237281]: 2025-12-06 10:03:07.418 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3416 DF PROTO=TCP SPT=50086 DPT=9102 SEQ=1307104248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCBC5880000000001030307) Dec 6 05:03:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:03:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:03:10 localhost podman[241373]: 2025-12-06 10:03:10.547835061 +0000 UTC m=+0.083541527 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 6 05:03:10 localhost podman[241373]: 2025-12-06 10:03:10.556418365 +0000 UTC m=+0.092124841 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Dec 6 05:03:10 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:03:10 localhost podman[241374]: 2025-12-06 10:03:10.599932216 +0000 UTC m=+0.131250047 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 6 05:03:10 localhost podman[241374]: 2025-12-06 10:03:10.613314079 +0000 UTC m=+0.144631990 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:03:10 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:03:12 localhost nova_compute[237281]: 2025-12-06 10:03:12.084 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:12 localhost nova_compute[237281]: 2025-12-06 10:03:12.421 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:16 localhost openstack_network_exporter[199751]: ERROR 10:03:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:03:16 localhost openstack_network_exporter[199751]: ERROR 10:03:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:03:16 localhost openstack_network_exporter[199751]: ERROR 10:03:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:03:16 localhost openstack_network_exporter[199751]: ERROR 10:03:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:03:16 localhost openstack_network_exporter[199751]: Dec 6 05:03:16 localhost openstack_network_exporter[199751]: ERROR 10:03:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:03:16 localhost openstack_network_exporter[199751]: Dec 6 05:03:17 localhost nova_compute[237281]: 2025-12-06 10:03:17.087 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:17 localhost nova_compute[237281]: 2025-12-06 10:03:17.423 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:03:18 localhost podman[241410]: 2025-12-06 10:03:18.530858385 +0000 UTC m=+0.067498742 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter) Dec 6 05:03:18 localhost podman[241410]: 2025-12-06 10:03:18.541458912 +0000 UTC m=+0.078099289 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, 
distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9) Dec 6 05:03:18 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:03:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:03:20 localhost podman[241431]: 2025-12-06 10:03:20.550506821 +0000 UTC m=+0.083005469 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:03:20 localhost podman[241431]: 2025-12-06 10:03:20.586284624 +0000 UTC m=+0.118783232 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, 
container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 05:03:20 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:03:22 localhost nova_compute[237281]: 2025-12-06 10:03:22.090 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:22 localhost nova_compute[237281]: 2025-12-06 10:03:22.426 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:23 localhost podman[197801]: time="2025-12-06T10:03:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:03:23 localhost podman[197801]: @ - - [06/Dec/2025:10:03:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:03:23 localhost podman[197801]: @ - - [06/Dec/2025:10:03:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15462 "" "Go-http-client/1.1" Dec 6 05:03:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4649 DF PROTO=TCP SPT=60440 DPT=9102 SEQ=3939123020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCBFDEA0000000001030307) Dec 6 05:03:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4650 DF PROTO=TCP SPT=60440 DPT=9102 SEQ=3939123020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCC02070000000001030307) Dec 6 05:03:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3417 DF PROTO=TCP SPT=50086 DPT=9102 SEQ=1307104248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCC05870000000001030307) Dec 6 05:03:27 localhost nova_compute[237281]: 2025-12-06 10:03:27.094 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4651 DF PROTO=TCP SPT=60440 DPT=9102 SEQ=3939123020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCC0A070000000001030307) Dec 6 05:03:27 localhost nova_compute[237281]: 2025-12-06 10:03:27.428 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21497 DF PROTO=TCP SPT=38448 DPT=9102 SEQ=2666116684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCC0D880000000001030307) Dec 6 05:03:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4652 DF PROTO=TCP SPT=60440 DPT=9102 SEQ=3939123020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCC19C70000000001030307) Dec 6 05:03:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:03:31 localhost podman[241454]: 2025-12-06 10:03:31.546238948 +0000 UTC m=+0.079810071 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true) Dec 6 05:03:31 localhost podman[241454]: 2025-12-06 10:03:31.582173946 +0000 UTC m=+0.115745059 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller) Dec 6 05:03:31 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:03:32 localhost nova_compute[237281]: 2025-12-06 10:03:32.096 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:32 localhost nova_compute[237281]: 2025-12-06 10:03:32.430 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:03:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:03:33 localhost podman[241479]: 2025-12-06 10:03:33.559296231 +0000 UTC m=+0.090585384 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, 
container_name=podman_exporter) Dec 6 05:03:33 localhost podman[241480]: 2025-12-06 10:03:33.602163302 +0000 UTC m=+0.129626667 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 05:03:33 localhost podman[241479]: 2025-12-06 10:03:33.622615323 +0000 UTC m=+0.153904516 container exec_died 
4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:03:33 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:03:33 localhost podman[241480]: 2025-12-06 10:03:33.642380003 +0000 UTC m=+0.169843438 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible) Dec 6 05:03:33 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:03:33 localhost nova_compute[237281]: 2025-12-06 10:03:33.685 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:03:33 localhost nova_compute[237281]: 2025-12-06 10:03:33.685 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:03:33 localhost nova_compute[237281]: 2025-12-06 10:03:33.686 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:03:33 localhost nova_compute[237281]: 2025-12-06 10:03:33.686 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:03:33 localhost nova_compute[237281]: 2025-12-06 10:03:33.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:03:34 localhost nova_compute[237281]: 2025-12-06 10:03:34.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:03:34 localhost nova_compute[237281]: 2025-12-06 10:03:34.926 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:03:34 localhost nova_compute[237281]: 2025-12-06 10:03:34.927 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:03:34 localhost nova_compute[237281]: 2025-12-06 10:03:34.927 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:03:35 localhost nova_compute[237281]: 2025-12-06 10:03:35.138 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 
6 05:03:35 localhost nova_compute[237281]: 2025-12-06 10:03:35.138 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:03:35 localhost nova_compute[237281]: 2025-12-06 10:03:35.138 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:03:35 localhost nova_compute[237281]: 2025-12-06 10:03:35.139 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:03:37 localhost nova_compute[237281]: 2025-12-06 10:03:37.046 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:03:37 localhost nova_compute[237281]: 2025-12-06 10:03:37.061 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:03:37 localhost nova_compute[237281]: 2025-12-06 10:03:37.062 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:03:37 localhost nova_compute[237281]: 2025-12-06 10:03:37.063 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:03:37 localhost nova_compute[237281]: 2025-12-06 10:03:37.063 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:03:37 localhost nova_compute[237281]: 2025-12-06 10:03:37.099 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Dec 6 05:03:37 localhost nova_compute[237281]: 2025-12-06 10:03:37.509 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4653 DF PROTO=TCP SPT=60440 DPT=9102 SEQ=3939123020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCC39880000000001030307) Dec 6 05:03:39 localhost nova_compute[237281]: 2025-12-06 10:03:39.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:03:40 localhost nova_compute[237281]: 2025-12-06 10:03:40.884 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:03:40 localhost nova_compute[237281]: 2025-12-06 10:03:40.906 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:03:40 localhost nova_compute[237281]: 2025-12-06 10:03:40.907 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:03:40 localhost nova_compute[237281]: 2025-12-06 
10:03:40.907 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:03:40 localhost nova_compute[237281]: 2025-12-06 10:03:40.907 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:03:40 localhost nova_compute[237281]: 2025-12-06 10:03:40.980 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.053 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.054 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.106 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.107 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.154 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.155 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.197 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.412 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.414 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12668MB free_disk=387.3106918334961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", 
"address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.415 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.415 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:03:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:03:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.499 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.500 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.500 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.544 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:03:41 localhost podman[241534]: 2025-12-06 
10:03:41.564926743 +0000 UTC m=+0.092507183 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125) Dec 6 
05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.562 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.565 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:03:41 localhost nova_compute[237281]: 2025-12-06 10:03:41.566 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:03:41 localhost podman[241534]: 2025-12-06 10:03:41.580463731 +0000 UTC m=+0.108044131 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 
'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:03:41 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:03:41 localhost podman[241533]: 2025-12-06 10:03:41.666934377 +0000 UTC m=+0.194285700 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Dec 6 05:03:41 localhost podman[241533]: 2025-12-06 10:03:41.673085167 +0000 UTC m=+0.200436510 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Dec 6 05:03:41 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:03:42 localhost nova_compute[237281]: 2025-12-06 10:03:42.101 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:42 localhost nova_compute[237281]: 2025-12-06 10:03:42.512 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:46 localhost openstack_network_exporter[199751]: ERROR 10:03:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:03:46 localhost openstack_network_exporter[199751]: ERROR 10:03:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:03:46 localhost openstack_network_exporter[199751]: ERROR 10:03:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:03:46 localhost openstack_network_exporter[199751]: ERROR 10:03:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:03:46 localhost openstack_network_exporter[199751]: Dec 6 05:03:46 localhost openstack_network_exporter[199751]: ERROR 10:03:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:03:46 localhost 
openstack_network_exporter[199751]: Dec 6 05:03:47 localhost nova_compute[237281]: 2025-12-06 10:03:47.103 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:47 localhost nova_compute[237281]: 2025-12-06 10:03:47.534 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:03:49 localhost podman[241573]: 2025-12-06 10:03:49.548487745 +0000 UTC m=+0.084973892 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 6 05:03:49 localhost podman[241573]: 2025-12-06 10:03:49.55968562 +0000 UTC m=+0.096171797 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': 
True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7) Dec 6 05:03:49 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:03:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:03:51 localhost systemd[1]: tmp-crun.GZTXqt.mount: Deactivated successfully. Dec 6 05:03:51 localhost podman[241593]: 2025-12-06 10:03:51.544993727 +0000 UTC m=+0.079031377 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:03:51 localhost podman[241593]: 2025-12-06 10:03:51.576015994 +0000 UTC m=+0.110053674 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', 
'/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:03:51 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:03:52 localhost nova_compute[237281]: 2025-12-06 10:03:52.107 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:52 localhost nova_compute[237281]: 2025-12-06 10:03:52.592 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:53 localhost podman[197801]: time="2025-12-06T10:03:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:03:53 localhost podman[197801]: @ - - [06/Dec/2025:10:03:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:03:53 localhost podman[197801]: @ - - [06/Dec/2025:10:03:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15467 "" "Go-http-client/1.1" Dec 6 05:03:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33402 DF PROTO=TCP SPT=54688 DPT=9102 SEQ=147392630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCC731B0000000001030307) Dec 6 05:03:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33403 DF PROTO=TCP SPT=54688 DPT=9102 SEQ=147392630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCC77070000000001030307) Dec 6 05:03:55 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4654 DF PROTO=TCP SPT=60440 DPT=9102 SEQ=3939123020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCC79880000000001030307) Dec 6 05:03:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33404 DF PROTO=TCP SPT=54688 DPT=9102 SEQ=147392630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCC7F070000000001030307) Dec 6 05:03:57 localhost nova_compute[237281]: 2025-12-06 10:03:57.112 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:57 localhost nova_compute[237281]: 2025-12-06 10:03:57.591 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:03:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3418 DF PROTO=TCP SPT=50086 DPT=9102 SEQ=1307104248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCC83870000000001030307) Dec 6 05:04:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33405 DF PROTO=TCP SPT=54688 DPT=9102 SEQ=147392630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCC8EC70000000001030307) Dec 6 05:04:02 localhost nova_compute[237281]: 2025-12-06 10:04:02.114 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:04:02 localhost podman[241615]: 2025-12-06 10:04:02.557196146 +0000 UTC m=+0.087710925 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:04:02 localhost nova_compute[237281]: 2025-12-06 10:04:02.626 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:02 localhost podman[241615]: 2025-12-06 10:04:02.644624241 +0000 UTC m=+0.175139080 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 05:04:02 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:04:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:04:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:04:04 localhost podman[241641]: 2025-12-06 10:04:04.556312379 +0000 UTC m=+0.086480457 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:04:04 localhost podman[241641]: 2025-12-06 10:04:04.56249815 +0000 UTC m=+0.092666228 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:04:04 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:04:04 localhost podman[241642]: 2025-12-06 10:04:04.654446894 +0000 UTC m=+0.180985331 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd) Dec 6 05:04:04 localhost podman[241642]: 2025-12-06 10:04:04.690194727 +0000 UTC m=+0.216733134 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2) Dec 6 05:04:04 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:04:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:04:06.684 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:04:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:04:06.685 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:04:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:04:06.686 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:04:07 localhost nova_compute[237281]: 2025-12-06 10:04:07.118 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:07 localhost nova_compute[237281]: 2025-12-06 10:04:07.663 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33406 DF PROTO=TCP SPT=54688 DPT=9102 SEQ=147392630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCCAF880000000001030307) Dec 6 05:04:12 localhost nova_compute[237281]: 2025-12-06 10:04:12.121 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:12 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:04:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:04:12 localhost podman[241682]: 2025-12-06 10:04:12.557396122 +0000 UTC m=+0.087000753 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:04:12 localhost podman[241682]: 2025-12-06 10:04:12.568293668 +0000 UTC m=+0.097898299 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 6 05:04:12 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:04:12 localhost systemd[1]: tmp-crun.X1nGf4.mount: Deactivated successfully. Dec 6 05:04:12 localhost podman[241683]: 2025-12-06 10:04:12.622837579 +0000 UTC m=+0.147859259 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:04:12 localhost podman[241683]: 2025-12-06 10:04:12.634946563 +0000 UTC m=+0.159968213 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 
'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:04:12 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:04:12 localhost nova_compute[237281]: 2025-12-06 10:04:12.698 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:16 localhost openstack_network_exporter[199751]: ERROR 10:04:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:04:16 localhost openstack_network_exporter[199751]: ERROR 10:04:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:04:16 localhost openstack_network_exporter[199751]: ERROR 10:04:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:04:16 localhost openstack_network_exporter[199751]: ERROR 10:04:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:04:16 localhost openstack_network_exporter[199751]: Dec 6 05:04:16 localhost openstack_network_exporter[199751]: ERROR 10:04:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:04:16 localhost openstack_network_exporter[199751]: Dec 6 05:04:17 localhost nova_compute[237281]: 2025-12-06 10:04:17.124 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:17 localhost nova_compute[237281]: 2025-12-06 10:04:17.744 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:04:20 localhost podman[241718]: 2025-12-06 10:04:20.544544355 +0000 UTC m=+0.081398290 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, distribution-scope=public, managed_by=edpm_ansible) Dec 6 05:04:20 localhost podman[241718]: 2025-12-06 10:04:20.554067538 +0000 UTC m=+0.090921493 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, version=9.6, io.openshift.expose-services=, distribution-scope=public, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Dec 6 05:04:20 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:04:22 localhost nova_compute[237281]: 2025-12-06 10:04:22.128 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:04:22 localhost systemd[1]: tmp-crun.g1EdRq.mount: Deactivated successfully. Dec 6 05:04:22 localhost podman[241739]: 2025-12-06 10:04:22.553768779 +0000 UTC m=+0.087495859 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:04:22 localhost podman[241739]: 2025-12-06 10:04:22.560256969 +0000 UTC m=+0.093984049 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:04:22 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:04:22 localhost nova_compute[237281]: 2025-12-06 10:04:22.783 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:22.987 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:04:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:22.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.022 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.023 12 DEBUG ceilometer.compute.pollsters [-] 
a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56ac9da2-78c4-40b7-8d3f-d69927437be3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:04:22.988824', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f07882e2-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.198931564, 'message_signature': 'f39b4f1431073b33c3b70a7658cbc2001c21e46d6ce2fd4a4aec261d9f12f123'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 
'5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:04:22.988824', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f078982c-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.198931564, 'message_signature': '35de0ee16e302e8b63a0772680d476f3f6b40861cb72ff1f7f5ca78a143223ef'}]}, 'timestamp': '2025-12-06 10:04:23.024227', '_unique_id': 'f5e9f2c19b844840b6110df18f557330'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 
12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging 
self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.025 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.026 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.027 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.027 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9513daa-3ce4-4a0a-a8a9-d1b5cceef8bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:04:23.027002', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f07914fa-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.198931564, 'message_signature': '9894c5ee2b71713727962a9a5e936dabd713081510d7151aaf755ed20f6c69dc'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:04:23.027002', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f079251c-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.198931564, 'message_signature': 'e86e74b212de343782b60166bb60c917b2de9ac4853bf7634d2ebacd58d326d8'}]}, 'timestamp': '2025-12-06 10:04:23.027871', '_unique_id': '47722021444c437d905cce52795629f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 
423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.028 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.029 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.033 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45c2e791-551a-4aa5-88d5-1fe7c3c55e7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:04:23.030083', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'f07a2142-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.240206695, 'message_signature': '7cd446bb738262e647e798dde8b0d9760be1266fa4bbead5ed5bfbfcb9e14fe7'}]}, 'timestamp': '2025-12-06 10:04:23.034343', '_unique_id': 'b53b5a86d2bc4c6da60993366cba620e'}: kombu.exceptions.OperationalError: 
[Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR 
oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause 
of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.035 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.036 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.036 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ca8017a-1e89-492b-b2c3-6657e6f061cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:04:23.036609', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'f07a8c0e-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.240206695, 'message_signature': 'df6bf972e6029ca9181e8a40ff5e79740d140f14be1eca7ff6c8629d259f44b3'}]}, 'timestamp': '2025-12-06 10:04:23.037081', '_unique_id': '162b91d2c3954a2ba5c31420ef8dd32c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.037 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.038 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.039 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '44a3d523-be4a-4ed2-aa61-6c7a59b44aa5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:04:23.039094', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'f07aecc6-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.240206695, 'message_signature': 'eef27dbd43107116b96e02a62cfdb74f810eec4df78447043f59cc115af5cac5'}]}, 'timestamp': '2025-12-06 10:04:23.039519', '_unique_id': '5384e8596b00496c8d8ac240cdc3e484'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.040 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:04:23.041 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.041 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.041 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '328b997c-72db-4640-9078-b32ad6deb679', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:04:23.041398', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f07b45f4-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.198931564, 'message_signature': '57c3f3f27d506f6d7bb7b7f44a3df182d795567e1014bbb43a47b8f609d860dc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:04:23.041398', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f07b55c6-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.198931564, 'message_signature': '458d0ebe814e5e1d54445dda0a65c3459f2f139882c4146f94c343ac3851d54c'}]}, 'timestamp': '2025-12-06 10:04:23.042194', '_unique_id': '27fc9c304998450093e9fd194bad8992'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:04:23.043 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.043 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.044 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.044 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.044 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fa8793d2-f7c6-44f1-bf9c-ac9feb342159', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:04:23.044288', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f07bb778-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.198931564, 'message_signature': 'd4a1ce3442e69a1ca605a527ad6d742907ed6c45d562c13de0d9d281ac016c63'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:04:23.044288', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f07bc5f6-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.198931564, 'message_signature': 'e083934f20a33a2839814fbe1bbc911cf05db61fd3be5cc404bc628f432db9be'}]}, 'timestamp': '2025-12-06 10:04:23.045001', '_unique_id': 'e5310925cc9848b682f87a7c3dc0709a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.046 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.061 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '37d038b8-eaf9-44b7-8e92-f67feb4aa21a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:04:23.046703', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'f07e57b2-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.271347065, 'message_signature': 'a9544332c5b71eb68ee92876267a372f0a68cd8bb9e28037f8a8d63e9a760996'}]}, 'timestamp': '2025-12-06 10:04:23.061891', '_unique_id': '8be1c276fd644207afb534792ea56f49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors 
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging 
conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.063 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify 
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.063 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.063 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.063 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.063 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9e797507-2763-4d21-a6ed-3bbc2d29929a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:04:23.063572', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f07ea53c-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.198931564, 'message_signature': '78934f266ea5af7c4551bfbad36b64976591aa583d6d4e9bb056ea28ceaa8b63'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:04:23.063572', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f07eadde-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.198931564, 'message_signature': 'd9ebf52c073507efcd0262a700ed222499acac38d5a8c9935faa6a3e97dbb38d'}]}, 'timestamp': '2025-12-06 10:04:23.064032', '_unique_id': '719c27d9fe534526a8ae215bd7efc8a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.064 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.065 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.078 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.079 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
notification to notifications. Payload={'message_id': '22ce8daf-4e04-4a4e-a2d1-720ac6aced0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:04:23.065101', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f08101ba-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.275189524, 'message_signature': 'e8fe30a45ceab997244a42e4cf80a601151d2846c82ecf857d83ab0f9e2f529a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:04:23.065101', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0810ffc-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.275189524, 'message_signature': '5d5ac192624a753a752266c8bbad23ae57d5343aae6a1ae012f8801b852cfc7b'}]}, 'timestamp': '2025-12-06 10:04:23.079677', '_unique_id': '84a957ad2e334772a883c43cb15d2a78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.080 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.081 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.081 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '19b56e25-a89a-4588-b3c5-0b84985378d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:04:23.081824', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'f0817172-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.240206695, 'message_signature': '4ee09f0c779eb914211f819ccb79666f0d896b53aa766ad5b66d5e0937a9df2a'}]}, 'timestamp': '2025-12-06 10:04:23.082223', '_unique_id': 'b4677b60db35405e9dac7248413c3669'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.082 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.083 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.083 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 13600000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72ca8177-fe14-477c-a554-c613c3dcd492', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13600000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:04:23.083717', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'f081ba7e-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.271347065, 'message_signature': '499df460843446c6b0116002ac7a32c0c3946d62bffaf4d1446fe85a0e8aaadb'}]}, 'timestamp': '2025-12-06 10:04:23.084064', '_unique_id': '2d2604e959b2463488b4673307c17e85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.084 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.085 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.085 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90d0da62-1295-4e92-9a8a-75074be906ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:04:23.085556', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'f08201be-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.240206695, 'message_signature': '33d8072a54cf1628d3ffaa2c18f69d33cfe8d559e536e43612530fd2fcb33dfa'}]}, 'timestamp': '2025-12-06 10:04:23.085888', '_unique_id': '50c099a1391946ec87821df9ecffa0da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.086 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.087 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.087 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0408bf44-cecf-4c2f-bd13-895b231f8ea3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:04:23.087321', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'f082466a-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.240206695, 'message_signature': 'baa363ebde041de511d446d3ab08fc09aaab14fa51923da4ad92a2342b22ed98'}]}, 'timestamp': '2025-12-06 10:04:23.087628', '_unique_id': '4cdb02fea0e046e3ba548e42f9aae114'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.088 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 10:04:23.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.089 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '079513f1-4e5c-405d-9fdc-cc620093df9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:04:23.089125', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': 
{'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'f0828d1e-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.240206695, 'message_signature': '0c7868ab7ca686c9a928603f9d304d90e03302e13375151e646e80d446b803a0'}]}, 'timestamp': '2025-12-06 10:04:23.089437', '_unique_id': 'bc8753ee06ae401cabe1e17a242db319'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:04:23.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.090 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3e83fe73-5714-49b9-a501-a590a6bc4892', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:04:23.090826', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'f082d044-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.240206695, 'message_signature': '147b4d2c27ebeea080b079a1b3b3c393a27dfc9efda729653948ec038f0e4f4d'}]}, 'timestamp': '2025-12-06 10:04:23.091172', '_unique_id': 'd92d3a7579984fc483a13bd9c3028c5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:04:23.092 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.093 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba38420d-a72e-40c3-8617-915689ec47cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:04:23.093021', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'f083262a-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.240206695, 'message_signature': '9d00b933953ad613bf6710a482dc88ae76c87cd483bb6f2300dd95617fac34ed'}]}, 'timestamp': '2025-12-06 10:04:23.093400', '_unique_id': '25a610ab7df34b9d8f4651bc825ddeb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.094 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.095 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3eba6494-5453-4bed-b861-13ea91a2c4b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:04:23.095331', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'f08381d8-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.240206695, 'message_signature': 'a80ed1d7d28e7734eccaecbb9d09374925a166c7bd896f3872431aee920e4e15'}]}, 'timestamp': '2025-12-06 10:04:23.095774', '_unique_id': '811f2625259f43ae9c6695ba4d0e86d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:04:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.096 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:04:23.098 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.098 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.098 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0ba8a05-4f1a-4454-bccb-f47622602689', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:04:23.098152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f083ef06-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.275189524, 'message_signature': '1db6af329969a8d23a8ad3d5092e30eb23f5fdba4ebe3cc4c838d72484b3e385'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:04:23.098152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f083fa82-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.275189524, 'message_signature': '0367f3bea57b33d7d2d7a0b48c5651ad6fddca425d880e26fb722dea11549cd9'}]}, 'timestamp': '2025-12-06 10:04:23.098770', '_unique_id': 'a2a9a2d9c25e4af4afafeb52aeb8c94c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging Dec 6 05:04:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.099 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82b62c28-22d4-4171-88ed-6ebfcc941490', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:04:23.100461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f08447e4-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.198931564, 'message_signature': '4c5f4bde2d13f9b0bc271628ab71d92176ef27ea96a6111e24e4d15ec7cb0010'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:04:23.100461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0845360-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.198931564, 'message_signature': '3c3e8b31a2b4dd48f4021466760c7940da8aaf2374366ecc919ad87a6733d64d'}]}, 'timestamp': '2025-12-06 10:04:23.101045', '_unique_id': '40ae2e4a54de4befaf2459398533fe93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.101 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.102 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.102 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a21287ca-ddec-4a2d-9d8f-007d0b32c91e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:04:23.102518', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f08498a2-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.275189524, 'message_signature': '82ef880ee9cce7124719ae251a468ee59bc1833c1fc155c680f6b8a93d50c84c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:04:23.102518', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f084a4a0-d28a-11f0-8fed-fa163edf398d', 'monotonic_time': 11850.275189524, 'message_signature': '27a21c935ef0f9526de44cb70d1464076192191754d889c7c1b70d3d175853b5'}]}, 'timestamp': '2025-12-06 10:04:23.103126', '_unique_id': 'a7fe2c645d3b4cba91d023bed0fccc1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:04:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:04:23.103 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:04:23 localhost podman[197801]: time="2025-12-06T10:04:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 6 05:04:23 localhost podman[197801]: @ - - [06/Dec/2025:10:04:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1"
Dec 6 05:04:23 localhost podman[197801]: @ - - [06/Dec/2025:10:04:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15462 "" "Go-http-client/1.1"
Dec 6 05:04:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29349 DF PROTO=TCP SPT=43734 DPT=9102 SEQ=561670560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCCE84A0000000001030307)
Dec 6 05:04:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29350 DF PROTO=TCP SPT=43734 DPT=9102 SEQ=561670560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCCEC470000000001030307)
Dec 6 05:04:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33407 DF PROTO=TCP SPT=54688 DPT=9102 SEQ=147392630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCCEF880000000001030307)
Dec 6 05:04:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29351 DF PROTO=TCP SPT=43734 DPT=9102 SEQ=561670560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCCF4470000000001030307)
Dec 6 05:04:27 localhost nova_compute[237281]: 2025-12-06 10:04:27.130 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:04:27 localhost nova_compute[237281]: 2025-12-06 10:04:27.787 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:04:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4655 DF PROTO=TCP SPT=60440 DPT=9102 SEQ=3939123020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCCF7870000000001030307)
Dec 6 05:04:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29352 DF PROTO=TCP SPT=43734 DPT=9102 SEQ=561670560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCD04070000000001030307)
Dec 6 05:04:32 localhost nova_compute[237281]: 2025-12-06 10:04:32.133 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:04:32 localhost nova_compute[237281]: 2025-12-06 10:04:32.828 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:04:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.
Dec 6 05:04:33 localhost podman[241761]: 2025-12-06 10:04:33.252506301 +0000 UTC m=+0.084365311 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec 6 05:04:33 localhost podman[241761]: 2025-12-06 10:04:33.289545874 +0000 UTC m=+0.121404874 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']},
org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true) Dec 6 05:04:33 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:04:33 localhost nova_compute[237281]: 2025-12-06 10:04:33.568 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:04:34 localhost nova_compute[237281]: 2025-12-06 10:04:34.880 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:04:34 localhost nova_compute[237281]: 2025-12-06 10:04:34.884 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:04:34 localhost nova_compute[237281]: 2025-12-06 10:04:34.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:04:34 localhost nova_compute[237281]: 2025-12-06 10:04:34.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:04:35 localhost nova_compute[237281]: 2025-12-06 10:04:35.197 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - 
-] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:04:35 localhost nova_compute[237281]: 2025-12-06 10:04:35.198 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:04:35 localhost nova_compute[237281]: 2025-12-06 10:04:35.199 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:04:35 localhost nova_compute[237281]: 2025-12-06 10:04:35.199 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:04:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:04:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:04:35 localhost podman[241787]: 2025-12-06 10:04:35.608569519 +0000 UTC m=+0.079482181 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:04:35 localhost podman[241787]: 2025-12-06 10:04:35.620162358 +0000 UTC m=+0.091075090 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 
'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:04:35 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:04:35 localhost podman[241788]: 2025-12-06 10:04:35.670541911 +0000 UTC m=+0.140377770 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3) Dec 6 05:04:35 localhost podman[241788]: 2025-12-06 10:04:35.685145421 +0000 UTC m=+0.154981240 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible) Dec 6 05:04:35 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:04:36 localhost nova_compute[237281]: 2025-12-06 10:04:36.464 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:04:36 localhost nova_compute[237281]: 2025-12-06 10:04:36.482 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:04:36 localhost nova_compute[237281]: 2025-12-06 10:04:36.483 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:04:36 localhost nova_compute[237281]: 2025-12-06 10:04:36.484 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:04:36 localhost nova_compute[237281]: 2025-12-06 10:04:36.485 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:04:36 localhost nova_compute[237281]: 2025-12-06 10:04:36.485 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:04:36 localhost nova_compute[237281]: 2025-12-06 10:04:36.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:04:37 localhost nova_compute[237281]: 2025-12-06 10:04:37.136 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:37 localhost nova_compute[237281]: 2025-12-06 10:04:37.830 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:37 localhost nova_compute[237281]: 2025-12-06 10:04:37.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:04:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29353 DF PROTO=TCP SPT=43734 DPT=9102 SEQ=561670560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCD23870000000001030307) Dec 6 05:04:40 localhost nova_compute[237281]: 2025-12-06 10:04:40.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:04:41 localhost nova_compute[237281]: 2025-12-06 10:04:41.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] 
Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.038 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.039 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.039 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.040 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.103 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.139 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.177 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.178 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.255 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.256 237285 DEBUG oslo_concurrency.processutils [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.328 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.329 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.371 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.559 237285 WARNING nova.virt.libvirt.driver [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.560 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12684MB free_disk=387.3106918334961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.561 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.561 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.784 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.785 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.786 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.833 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.859 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:04:42 localhost 
nova_compute[237281]: 2025-12-06 10:04:42.861 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.861 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:04:42 localhost nova_compute[237281]: 2025-12-06 10:04:42.871 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:04:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:04:43 localhost podman[241840]: 2025-12-06 10:04:43.55444826 +0000 UTC m=+0.081613167 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 6 05:04:43 localhost podman[241840]: 2025-12-06 10:04:43.561536518 +0000 UTC m=+0.088701465 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Dec 6 05:04:43 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:04:43 localhost podman[241841]: 2025-12-06 10:04:43.615083749 +0000 UTC m=+0.138790529 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 6 05:04:43 localhost podman[241841]: 2025-12-06 10:04:43.656322511 +0000 UTC m=+0.180029271 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible) Dec 6 05:04:43 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:04:46 localhost openstack_network_exporter[199751]: ERROR 10:04:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:04:46 localhost openstack_network_exporter[199751]: ERROR 10:04:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:04:46 localhost openstack_network_exporter[199751]: ERROR 10:04:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:04:46 localhost openstack_network_exporter[199751]: Dec 6 05:04:46 localhost openstack_network_exporter[199751]: ERROR 10:04:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:04:46 localhost openstack_network_exporter[199751]: ERROR 10:04:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:04:46 localhost openstack_network_exporter[199751]: Dec 6 05:04:46 localhost sshd[241872]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:04:47 localhost nova_compute[237281]: 2025-12-06 10:04:47.142 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:47 localhost nova_compute[237281]: 2025-12-06 10:04:47.875 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:04:51 localhost podman[241874]: 2025-12-06 10:04:51.558503003 +0000 UTC m=+0.085344371 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:04:51 localhost podman[241874]: 2025-12-06 10:04:51.572063718 +0000 UTC m=+0.098905116 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.) Dec 6 05:04:51 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:04:52 localhost nova_compute[237281]: 2025-12-06 10:04:52.145 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:52 localhost nova_compute[237281]: 2025-12-06 10:04:52.906 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:53 localhost podman[197801]: time="2025-12-06T10:04:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:04:53 localhost podman[197801]: @ - - [06/Dec/2025:10:04:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:04:53 localhost podman[197801]: @ - - [06/Dec/2025:10:04:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15462 "" "Go-http-client/1.1" Dec 6 05:04:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:04:53 localhost podman[241895]: 2025-12-06 10:04:53.553163725 +0000 UTC m=+0.081703079 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 05:04:53 localhost podman[241895]: 2025-12-06 10:04:53.563306616 +0000 UTC m=+0.091845970 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:04:53 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:04:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38154 DF PROTO=TCP SPT=43516 DPT=9102 SEQ=2953144537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCD5D7A0000000001030307) Dec 6 05:04:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38155 DF PROTO=TCP SPT=43516 DPT=9102 SEQ=2953144537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCD61880000000001030307) Dec 6 05:04:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29354 DF PROTO=TCP SPT=43734 DPT=9102 SEQ=561670560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCD63870000000001030307) Dec 6 05:04:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38156 DF PROTO=TCP SPT=43516 DPT=9102 SEQ=2953144537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCD69880000000001030307) Dec 6 05:04:57 localhost nova_compute[237281]: 2025-12-06 10:04:57.147 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:57 localhost nova_compute[237281]: 2025-12-06 10:04:57.909 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:04:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33408 DF PROTO=TCP SPT=54688 DPT=9102 
SEQ=147392630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCD6D880000000001030307) Dec 6 05:05:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38157 DF PROTO=TCP SPT=43516 DPT=9102 SEQ=2953144537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCD79470000000001030307) Dec 6 05:05:02 localhost nova_compute[237281]: 2025-12-06 10:05:02.150 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:02 localhost nova_compute[237281]: 2025-12-06 10:05:02.911 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:05:03 localhost podman[241916]: 2025-12-06 10:05:03.531417573 +0000 UTC m=+0.071095575 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2) Dec 6 05:05:03 localhost podman[241916]: 2025-12-06 10:05:03.573584912 +0000 UTC m=+0.113262864 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:05:03 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:05:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:05:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:05:06 localhost systemd[1]: tmp-crun.y2zzau.mount: Deactivated successfully. 
Dec 6 05:05:06 localhost podman[241941]: 2025-12-06 10:05:06.574150599 +0000 UTC m=+0.101670930 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:05:06 localhost podman[241941]: 2025-12-06 10:05:06.605787146 +0000 UTC m=+0.133307437 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:05:06 localhost systemd[1]: tmp-crun.qsGE2D.mount: Deactivated successfully. Dec 6 05:05:06 localhost podman[241942]: 2025-12-06 10:05:06.617752513 +0000 UTC m=+0.141360295 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd) Dec 6 05:05:06 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:05:06 localhost podman[241942]: 2025-12-06 10:05:06.63010722 +0000 UTC m=+0.153715002 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:05:06 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:05:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:05:06.686 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:05:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:05:06.686 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:05:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:05:06.687 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:05:07 localhost nova_compute[237281]: 2025-12-06 10:05:07.153 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:07 localhost nova_compute[237281]: 2025-12-06 
10:05:07.912 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38158 DF PROTO=TCP SPT=43516 DPT=9102 SEQ=2953144537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCD99870000000001030307) Dec 6 05:05:12 localhost nova_compute[237281]: 2025-12-06 10:05:12.210 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:12 localhost nova_compute[237281]: 2025-12-06 10:05:12.914 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:05:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:05:14 localhost podman[241983]: 2025-12-06 10:05:14.553557253 +0000 UTC m=+0.085494925 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 05:05:14 localhost podman[241983]: 2025-12-06 10:05:14.587427219 +0000 UTC m=+0.119364961 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:05:14 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:05:14 localhost podman[241984]: 2025-12-06 10:05:14.662044461 +0000 UTC m=+0.188044812 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:05:14 localhost podman[241984]: 2025-12-06 10:05:14.671162619 +0000 UTC m=+0.197162911 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:05:14 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:05:16 localhost openstack_network_exporter[199751]: ERROR 10:05:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:05:16 localhost openstack_network_exporter[199751]: ERROR 10:05:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:05:16 localhost openstack_network_exporter[199751]: ERROR 10:05:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:05:16 localhost openstack_network_exporter[199751]: ERROR 10:05:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:05:16 localhost openstack_network_exporter[199751]: Dec 6 05:05:16 localhost openstack_network_exporter[199751]: ERROR 10:05:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:05:16 localhost openstack_network_exporter[199751]: Dec 6 05:05:17 localhost nova_compute[237281]: 2025-12-06 10:05:17.252 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:17 
localhost nova_compute[237281]: 2025-12-06 10:05:17.916 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:22 localhost nova_compute[237281]: 2025-12-06 10:05:22.287 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:05:22 localhost podman[242021]: 2025-12-06 10:05:22.545477842 +0000 UTC m=+0.079191622 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Dec 6 05:05:22 localhost podman[242021]: 2025-12-06 10:05:22.557362146 +0000 UTC m=+0.091075926 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 05:05:22 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:05:22 localhost nova_compute[237281]: 2025-12-06 10:05:22.918 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:23 localhost podman[197801]: time="2025-12-06T10:05:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:05:23 localhost podman[197801]: @ - - [06/Dec/2025:10:05:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:05:23 localhost podman[197801]: @ - - [06/Dec/2025:10:05:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15460 "" "Go-http-client/1.1" Dec 6 05:05:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46761 DF PROTO=TCP SPT=50812 DPT=9102 SEQ=4223708546 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCDD2AA0000000001030307) Dec 6 05:05:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:05:24 localhost podman[242042]: 2025-12-06 10:05:24.54880343 +0000 UTC m=+0.081547524 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:05:24 
localhost podman[242042]: 2025-12-06 10:05:24.560353984 +0000 UTC m=+0.093098098 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:05:24 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:05:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46762 DF PROTO=TCP SPT=50812 DPT=9102 SEQ=4223708546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCDD6C80000000001030307) Dec 6 05:05:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38159 DF PROTO=TCP SPT=43516 DPT=9102 SEQ=2953144537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCDD9870000000001030307) Dec 6 05:05:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46763 DF PROTO=TCP SPT=50812 DPT=9102 SEQ=4223708546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCDDEC70000000001030307) Dec 6 05:05:27 localhost nova_compute[237281]: 2025-12-06 10:05:27.321 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29355 DF PROTO=TCP SPT=43734 DPT=9102 SEQ=561670560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCDE1870000000001030307) Dec 6 05:05:27 localhost nova_compute[237281]: 2025-12-06 10:05:27.923 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46764 DF PROTO=TCP SPT=50812 DPT=9102 
SEQ=4223708546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCDEE870000000001030307) Dec 6 05:05:32 localhost nova_compute[237281]: 2025-12-06 10:05:32.358 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:32 localhost nova_compute[237281]: 2025-12-06 10:05:32.924 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:05:34 localhost podman[242064]: 2025-12-06 10:05:34.534224166 +0000 UTC m=+0.071025563 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:05:34 localhost podman[242064]: 2025-12-06 10:05:34.594789988 +0000 UTC m=+0.131591375 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, 
org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3) Dec 6 05:05:34 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:05:34 localhost nova_compute[237281]: 2025-12-06 10:05:34.862 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:05:34 localhost nova_compute[237281]: 2025-12-06 10:05:34.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:05:34 localhost nova_compute[237281]: 2025-12-06 10:05:34.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:05:34 localhost nova_compute[237281]: 2025-12-06 10:05:34.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:05:35 localhost nova_compute[237281]: 2025-12-06 10:05:35.234 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:05:35 localhost nova_compute[237281]: 2025-12-06 10:05:35.234 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - 
- - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:05:35 localhost nova_compute[237281]: 2025-12-06 10:05:35.234 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:05:35 localhost nova_compute[237281]: 2025-12-06 10:05:35.235 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:05:36 localhost nova_compute[237281]: 2025-12-06 10:05:36.541 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": 
"227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:05:36 localhost nova_compute[237281]: 2025-12-06 10:05:36.557 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:05:36 localhost nova_compute[237281]: 2025-12-06 10:05:36.558 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:05:36 localhost nova_compute[237281]: 2025-12-06 10:05:36.559 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:05:36 localhost nova_compute[237281]: 2025-12-06 10:05:36.559 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:05:36 localhost nova_compute[237281]: 2025-12-06 10:05:36.559 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:05:37 localhost nova_compute[237281]: 2025-12-06 10:05:37.390 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:05:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:05:37 localhost podman[242090]: 2025-12-06 10:05:37.546616085 +0000 UTC m=+0.075285354 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 05:05:37 localhost podman[242090]: 2025-12-06 10:05:37.555610219 +0000 UTC m=+0.084279558 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 05:05:37 localhost nova_compute[237281]: 2025-12-06 10:05:37.555 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:05:37 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:05:37 localhost podman[242089]: 2025-12-06 10:05:37.601463902 +0000 UTC m=+0.134091742 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', 
'/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:05:37 localhost podman[242089]: 2025-12-06 10:05:37.611231431 +0000 UTC m=+0.143859311 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:05:37 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:05:37 localhost nova_compute[237281]: 2025-12-06 10:05:37.880 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:05:37 localhost nova_compute[237281]: 2025-12-06 10:05:37.910 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:05:37 localhost nova_compute[237281]: 2025-12-06 10:05:37.926 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46765 DF PROTO=TCP SPT=50812 DPT=9102 SEQ=4223708546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCE0F880000000001030307) Dec 6 05:05:39 localhost nova_compute[237281]: 2025-12-06 10:05:39.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:05:40 localhost nova_compute[237281]: 2025-12-06 10:05:40.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:05:41 localhost nova_compute[237281]: 2025-12-06 10:05:41.885 237285 DEBUG oslo_service.periodic_task [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:05:41 localhost nova_compute[237281]: 2025-12-06 10:05:41.909 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:05:41 localhost nova_compute[237281]: 2025-12-06 10:05:41.910 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:05:41 localhost nova_compute[237281]: 2025-12-06 10:05:41.911 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:05:41 localhost nova_compute[237281]: 2025-12-06 10:05:41.911 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:05:41 localhost nova_compute[237281]: 2025-12-06 10:05:41.975 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C 
LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.037 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.038 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.089 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.090 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.131 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.133 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.206 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.420 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.467 237285 WARNING nova.virt.libvirt.driver [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.469 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12668MB free_disk=387.3091049194336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.470 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.471 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.570 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.570 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.571 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.626 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.645 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:05:42 localhost 
nova_compute[237281]: 2025-12-06 10:05:42.647 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.647 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:05:42 localhost nova_compute[237281]: 2025-12-06 10:05:42.929 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:05:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:05:45 localhost podman[242142]: 2025-12-06 10:05:45.560619605 +0000 UTC m=+0.091091627 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team) Dec 6 05:05:45 localhost podman[242142]: 2025-12-06 10:05:45.574308554 +0000 UTC m=+0.104780616 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true) Dec 6 05:05:45 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:05:45 localhost systemd[1]: tmp-crun.pmLiNC.mount: Deactivated successfully. Dec 6 05:05:45 localhost podman[242141]: 2025-12-06 10:05:45.653676961 +0000 UTC m=+0.187176046 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:05:45 localhost podman[242141]: 2025-12-06 10:05:45.690907289 +0000 UTC m=+0.224406384 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:05:45 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:05:46 localhost openstack_network_exporter[199751]: ERROR 10:05:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:05:46 localhost openstack_network_exporter[199751]: ERROR 10:05:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:05:46 localhost openstack_network_exporter[199751]: ERROR 10:05:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:05:46 localhost openstack_network_exporter[199751]: ERROR 10:05:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:05:46 localhost openstack_network_exporter[199751]: Dec 6 05:05:46 localhost openstack_network_exporter[199751]: ERROR 10:05:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:05:46 localhost openstack_network_exporter[199751]: Dec 6 05:05:47 localhost nova_compute[237281]: 2025-12-06 10:05:47.456 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:47 localhost nova_compute[237281]: 2025-12-06 10:05:47.931 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:52 localhost nova_compute[237281]: 2025-12-06 10:05:52.493 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:52 localhost nova_compute[237281]: 2025-12-06 10:05:52.933 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:53 localhost podman[197801]: time="2025-12-06T10:05:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:05:53 localhost podman[197801]: @ - - [06/Dec/2025:10:05:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:05:53 localhost podman[197801]: @ - - [06/Dec/2025:10:05:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15460 "" "Go-http-client/1.1" Dec 6 05:05:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:05:53 localhost systemd[1]: tmp-crun.nLgsMq.mount: Deactivated successfully. 
Dec 6 05:05:53 localhost podman[242177]: 2025-12-06 10:05:53.562776956 +0000 UTC m=+0.097396419 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter) Dec 6 05:05:53 localhost podman[242177]: 2025-12-06 10:05:53.578302541 +0000 UTC m=+0.112922004 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, version=9.6) Dec 6 05:05:53 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:05:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46854 DF PROTO=TCP SPT=37396 DPT=9102 SEQ=3100792079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCE47DA0000000001030307) Dec 6 05:05:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46855 DF PROTO=TCP SPT=37396 DPT=9102 SEQ=3100792079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCE4BC80000000001030307) Dec 6 05:05:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:05:55 localhost podman[242197]: 2025-12-06 10:05:55.546925217 +0000 UTC m=+0.081170633 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', 
'--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:05:55 localhost podman[242197]: 2025-12-06 10:05:55.55752311 +0000 UTC m=+0.091768536 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:05:55 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:05:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46766 DF PROTO=TCP SPT=50812 DPT=9102 SEQ=4223708546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCE4F870000000001030307) Dec 6 05:05:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46856 DF PROTO=TCP SPT=37396 DPT=9102 SEQ=3100792079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCE53C70000000001030307) Dec 6 05:05:57 localhost nova_compute[237281]: 2025-12-06 10:05:57.526 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:57 localhost nova_compute[237281]: 2025-12-06 10:05:57.938 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:05:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38160 DF PROTO=TCP SPT=43516 DPT=9102 SEQ=2953144537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCE57870000000001030307) Dec 6 05:05:58 localhost sshd[242220]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:05:59 localhost 
sshd[242222]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:00 localhost sshd[242224]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46857 DF PROTO=TCP SPT=37396 DPT=9102 SEQ=3100792079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCE63880000000001030307) Dec 6 05:06:01 localhost sshd[242226]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:02 localhost nova_compute[237281]: 2025-12-06 10:06:02.559 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:02 localhost sshd[242228]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:02 localhost nova_compute[237281]: 2025-12-06 10:06:02.937 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:03 localhost sshd[242230]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:06:05 localhost podman[242232]: 2025-12-06 10:06:05.454258923 +0000 UTC m=+0.077109418 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 05:06:05 localhost podman[242232]: 2025-12-06 10:06:05.489742139 +0000 UTC m=+0.112592594 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:06:05 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:06:05 localhost sshd[242256]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:06 localhost sshd[242258]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:06:06.687 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:06:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:06:06.688 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:06:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:06:06.689 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:06:07 localhost nova_compute[237281]: 2025-12-06 10:06:07.591 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:07 localhost sshd[242260]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:07 localhost nova_compute[237281]: 2025-12-06 10:06:07.939 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:06:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:06:08 localhost podman[242262]: 2025-12-06 10:06:08.326048272 +0000 UTC m=+0.083123542 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:06:08 localhost podman[242262]: 2025-12-06 10:06:08.336216643 +0000 UTC m=+0.093291933 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:06:08 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:06:08 localhost podman[242263]: 2025-12-06 10:06:08.384402587 +0000 UTC m=+0.136056111 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd) Dec 6 05:06:08 localhost podman[242263]: 2025-12-06 10:06:08.39235871 +0000 UTC m=+0.144012234 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd) Dec 6 05:06:08 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:06:08 localhost sshd[242304]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46858 DF PROTO=TCP SPT=37396 DPT=9102 SEQ=3100792079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCE83870000000001030307) Dec 6 05:06:09 localhost sshd[242306]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:10 localhost sshd[242308]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:12 localhost sshd[242310]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:12 localhost nova_compute[237281]: 2025-12-06 10:06:12.631 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:12 localhost nova_compute[237281]: 2025-12-06 10:06:12.940 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:13 localhost sshd[242312]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:14 localhost sshd[242314]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 05:06:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:06:15 localhost podman[242316]: 2025-12-06 10:06:15.937820995 +0000 UTC m=+0.085130535 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:06:15 localhost podman[242316]: 2025-12-06 10:06:15.977258542 +0000 UTC m=+0.124568032 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Dec 6 05:06:15 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:06:15 localhost sshd[242347]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:16 localhost systemd[1]: tmp-crun.ijpxXe.mount: Deactivated successfully. Dec 6 05:06:16 localhost podman[242317]: 2025-12-06 10:06:16.013114777 +0000 UTC m=+0.158368964 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:06:16 localhost podman[242317]: 2025-12-06 10:06:16.027295041 +0000 UTC m=+0.172549238 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:06:16 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:06:16 localhost openstack_network_exporter[199751]: ERROR 10:06:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 6 05:06:16 localhost openstack_network_exporter[199751]: ERROR 10:06:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 6 05:06:16 localhost openstack_network_exporter[199751]: ERROR 10:06:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 6 05:06:16 localhost openstack_network_exporter[199751]: ERROR 10:06:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 6 05:06:16 localhost openstack_network_exporter[199751]:
Dec 6 05:06:16 localhost openstack_network_exporter[199751]: ERROR 10:06:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 6 05:06:16 localhost openstack_network_exporter[199751]:
Dec 6 05:06:16 localhost sshd[242355]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 05:06:17 localhost nova_compute[237281]: 2025-12-06 10:06:17.675 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:06:17 localhost nova_compute[237281]: 2025-12-06 10:06:17.945 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:06:18 localhost sshd[242357]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 05:06:19 localhost sshd[242359]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 05:06:20 localhost sshd[242361]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 05:06:21 localhost sshd[242363]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 05:06:22 localhost nova_compute[237281]: 2025-12-06 10:06:22.712 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:06:22 localhost nova_compute[237281]: 2025-12-06 10:06:22.950 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:22.988 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:06:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:22.989 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.006 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.006 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
notification to notifications. Payload={'message_id': 'fdb7e484-739d-4bdc-9448-be7c16336278', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:06:22.989659', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '37fc7cd6-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.199751334, 'message_signature': '2b8a4c60580d0d83b12e99f0ec820614e2d240cf5ec55d51af48c95e11176abe'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:06:22.989659', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '37fc95c2-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.199751334, 'message_signature': '39aec7f34b1b83ef66e387da381761bf97d104e6cedac39dce8cf92775f79570'}]}, 'timestamp': '2025-12-06 10:06:23.007535', '_unique_id': 'e2985cb80e2349c9a4b5b8bb7b316f02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.009 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.010 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.054 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.055 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging [-] Could 
not send notification to notifications. Payload={'message_id': '9b994c56-013b-4cef-a24a-d2f42502ece5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:06:23.010516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3803d72e-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.220655493, 'message_signature': 'b9aa1ea12efd78953933dfe99058ead851fbf52e72e4f93a1ffc40e125fcbce7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:06:23.010516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3803ef5c-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.220655493, 'message_signature': '13200077b08986a236d3789ea0ca5f32e2da5403b0c5cec2551b049c8c65184b'}]}, 'timestamp': '2025-12-06 10:06:23.055703', '_unique_id': 'a4310cc44bcf47a093f7f6eab4ffcc18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:06:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:06:23.057 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.057 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.058 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.058 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.059 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08619d5d-7110-437f-b005-97d548e6e674', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:06:23.058584', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '38047328-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.220655493, 'message_signature': 'c33410bb37ce975a48240b903de4aac159db9c8a5f8fbbd8e51cdc2c91c4ee06'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:06:23.058584', 'resource_metadata': {'display_name': 'test', 
'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '380486a6-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.220655493, 'message_signature': 'a3eb0141fe131c4b9b34c144fb2b98f711b453df6fc766fa07e6d7aca414c945'}]}, 'timestamp': '2025-12-06 10:06:23.059562', '_unique_id': '469d6c2e7d03487f9d7982a92e7baa5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 
05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.060 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.061 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.082 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ff23c854-93d9-4583-9b14-728e1806c99c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:06:23.061874', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '380812da-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.292288824, 'message_signature': '8661fe5602b70b40a382282e03cc19c91ab06c87dbcc86dcfebf1b9cb6198656'}]}, 'timestamp': '2025-12-06 10:06:23.082802', '_unique_id': '2f8da265aa48485fb933a10fb02c3693'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors 
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging 
conn.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:06:23.088 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '668ca5ce-f8d5-4ff5-aae3-f16a9b47123e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:06:23.085124', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '3808f93e-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.295233584, 'message_signature': 
'e90a075ec44bc9c381a6c170e50e4232744d0dae962579281a480d528b459784'}]}, 'timestamp': '2025-12-06 10:06:23.088713', '_unique_id': '19078b7f665a4df3bed0c9d9f5ab22e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.089 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.090 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.091 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca12e949-193f-4a70-afcf-f0a867be6999', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:06:23.091217', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '38096fd6-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.295233584, 'message_signature': '3905a0b1332adf607fa75b7ba7b2c386c113396da0ea4b29b5504e1037b1262b'}]}, 'timestamp': '2025-12-06 10:06:23.091742', '_unique_id': 'b45cbb9adbed49cba5f67cab9457d250'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.092 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.094 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '474718dc-3a8d-4912-80d8-e55aad009108', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:06:23.093966', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '3809d958-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.295233584, 'message_signature': '17b6b71455c289dd9f150832068a47fa5aa3b07a308661bf79fa426b68a306b0'}]}, 'timestamp': '2025-12-06 10:06:23.094465', '_unique_id': '9a5f50b906d4465a8d829dda1dfcbd21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.095 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.096 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.096 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.097 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8618f33d-0017-4a50-9667-cc7deb1b5cee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:06:23.096731', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '380a4654-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.220655493, 'message_signature': 'f5da90062d5bfc6e4b5b7e68a5acf76ea616c456101dbc80512b71ef0670dc94'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:06:23.096731', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '380a577a-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.220655493, 'message_signature': '31761c99f85ef007219000140aa2aeb56b7017f7ba602981ce9e26f3acfde7b9'}]}, 'timestamp': '2025-12-06 10:06:23.097642', '_unique_id': 'c7f80e4069154a4d859f54216aed44b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': 'c111adb7-1867-451a-92e0-d06ead2e9abc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:06:23.099971', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '380ac3a4-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.199751334, 'message_signature': '91cebd2b8a971cc7fb5e42e9e90335afb15c00e23591fd0e780a16f223b549f2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:06:23.099971', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '380ad3da-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.199751334, 'message_signature': 'c3803f4edcee3549fda4a2736bdc30bc1158398c0bf281fe69f5b65aba403513'}]}, 'timestamp': '2025-12-06 10:06:23.100822', '_unique_id': 'e41b3c9f9ffa4025b0a76f7df40a5b78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.101 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.103 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ddba95e8-698b-4984-9290-5e51105965a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:06:23.103027', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '380b3c9e-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.295233584, 'message_signature': 'f284a759417718d015adfdfa9c2cf25374616df99fb6c3a0bf2d2f7b18ed0d4d'}]}, 'timestamp': '2025-12-06 10:06:23.103534', '_unique_id': '2a6126bd90fe451abf4e7a09132071d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:06:23.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.105 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6d1ad32-20be-46db-8dcf-5160d29b6e55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:06:23.105741', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': 
None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '380ba68e-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.295233584, 'message_signature': 'aa77aaddf426d0c35c3cca3e7032ab0ed03c7785ee1a5b68c1d9d779f7f7e07c'}]}, 'timestamp': '2025-12-06 10:06:23.106250', '_unique_id': '08f19f09773a4fc5a5653ee9bd243698'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.108 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f1fcfe47-0a84-471b-a449-3a42327df34a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:06:23.108600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '380c14f2-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.220655493, 'message_signature': '1f89a6d1695c5ca69f29b67823a9563a243a53c7b8812ddb3caac71ff883537a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:06:23.108600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '380c26ea-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.220655493, 'message_signature': '8b81547d2ceae136665e762bbec9b93e342eb1d88c4ddec4fb6fab249648a961'}]}, 'timestamp': '2025-12-06 10:06:23.109504', '_unique_id': '12530ba76cf94a86b39923879a80429a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.110 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.111 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.112 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '339ef919-9da6-44b9-94a9-0d23e1a6e8bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:06:23.112032', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '380c9b52-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.295233584, 'message_signature': '657422b24c5e0a5f72434c28d0fc72e2304a1a4b3a63b39545e240311eb40be4'}]}, 'timestamp': '2025-12-06 10:06:23.112564', '_unique_id': '3128f64a98374aea939301d20e9ba025'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.113 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.114 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4b4eb63-d682-4218-a80d-a50f6a7b91e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:06:23.114893', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '380d0bc8-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.295233584, 'message_signature': 'd42b709bdbb949122e68f9c8aa7c47e9e645f88c47ae4855efc307c8119127aa'}]}, 'timestamp': '2025-12-06 10:06:23.115430', '_unique_id': 'ebffe1398b70473181694153a0becdca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.116 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3a44ac2-d19f-4dd9-a67b-9794e57a442c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:06:23.117745', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '380d7b8a-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.295233584, 'message_signature': '0a0b5728abafa167e0ce8d2642911afdeb567adfa85dfdb28b89140623f26ea3'}]}, 'timestamp': '2025-12-06 10:06:23.118256', '_unique_id': '2eeabc77ee2641d5a53c981285419db8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:06:23.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.120 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2ff59b7-9e38-4d59-8199-3f1ec1055ced', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:06:23.120462', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '380de430-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.295233584, 'message_signature': 'fba8c8e2f984e30e3351bee5fd7e16f8bf6346ee947a7eb0e35e0e15febff54f'}]}, 'timestamp': '2025-12-06 10:06:23.120966', '_unique_id': '5a87bcc9040e460ea746057dc61ac091'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.123 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.123 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5fb2f0ec-d86c-4a17-b370-af1e967deeea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:06:23.123134', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '380e4ce0-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.199751334, 'message_signature': '69fe564a785619e5f052802bb9afddd037cce48a174462ff4211b95abf61fe06'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:06:23.123134', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '380e5d7a-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.199751334, 'message_signature': '55db37f3aaf236ad9f027f0f435cb1ec9d2d1a74a82776111b4711a4bd272444'}]}, 'timestamp': '2025-12-06 10:06:23.124041', '_unique_id': 'cecb84d9f9b44acaa6809d30f506d3b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.125 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.126 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.126 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 14210000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16be6f61-605e-43ce-9894-9dbee82a4ef2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14210000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:06:23.126273', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '380ec6f2-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.292288824, 'message_signature': '1d2bbaec36acfea4c8f509e16def4cf19bd37a43a0f24ea26376bb25967d2a4a'}]}, 'timestamp': '2025-12-06 10:06:23.126723', '_unique_id': '4e00b893f5184da8a99780d593532d0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.127 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.128 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d8e449d-10bf-4010-91b4-8a53e5f3a478', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:06:23.128727', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '380f2674-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.295233584, 'message_signature': 'f5cd87e2b680c0ba3fb7adb67c73f4dfb936bded96c54467f3d489bf4279fa06'}]}, 'timestamp': '2025-12-06 10:06:23.129096', '_unique_id': '6de594b3f152429da491531594a2ae50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.129 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.130 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.130 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.130 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '340263e8-b8cc-492b-951b-27530570ee2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:06:23.130419', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '380f658a-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.220655493, 'message_signature': '8ddd20dd49ab81a52dc233742663a4b0f79fad7bee2db1937c0e17ce9b409e98'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:06:23.130419', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '380f6fda-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.220655493, 'message_signature': '15ebcd3fdc44cf2649480b39fc1a9b966f964a4306671ddd6dfcb1e60c25cae6'}]}, 'timestamp': '2025-12-06 10:06:23.130976', '_unique_id': '90fc2511951d4030aff115ad2d120c92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging
return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.132 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.132 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.132 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '5cb5dc84-e4ce-47b1-bff7-0dcccaebb3b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:06:23.132439', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '380fb47c-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.220655493, 'message_signature': 'c9c98d1dcf0f9ed293d00683f71ad708ec61c71af2a16b2cba61234075b045c6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:06:23.132439', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '380fbe72-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 11970.220655493, 'message_signature': '7eee00d227ba22b9255793072858e9de3266a2dca226dab2652d76eb9dbfde08'}]}, 'timestamp': '2025-12-06 10:06:23.132985', '_unique_id': '7bb1671f2b2b4c188a050fb57356d1c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:06:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.133 12 ERROR oslo_messaging.notify.messaging Dec 6 05:06:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:06:23.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:06:23 localhost podman[197801]: time="2025-12-06T10:06:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:06:23 localhost podman[197801]: @ - - [06/Dec/2025:10:06:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:06:23 localhost podman[197801]: @ - - [06/Dec/2025:10:06:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15459 "" "Go-http-client/1.1" Dec 6 05:06:23 localhost sshd[242365]: main: sshd: ssh-rsa algorithm 
is disabled Dec 6 05:06:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45778 DF PROTO=TCP SPT=46282 DPT=9102 SEQ=478220876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCEBD0A0000000001030307) Dec 6 05:06:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:06:24 localhost podman[242367]: 2025-12-06 10:06:24.258254151 +0000 UTC m=+0.082721231 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Dec 6 05:06:24 localhost podman[242367]: 2025-12-06 10:06:24.295301184 +0000 UTC m=+0.119768224 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7) Dec 6 05:06:24 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:06:24 localhost sshd[242388]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45779 DF PROTO=TCP SPT=46282 DPT=9102 SEQ=478220876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCEC1080000000001030307) Dec 6 05:06:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:06:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46859 DF PROTO=TCP SPT=37396 DPT=9102 SEQ=3100792079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCEC3870000000001030307) Dec 6 05:06:25 localhost podman[242390]: 2025-12-06 10:06:25.681046253 +0000 UTC m=+0.079094449 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', 
'--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:06:25 localhost podman[242390]: 2025-12-06 10:06:25.690181104 +0000 UTC m=+0.088229330 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:06:25 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:06:25 localhost sshd[242414]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45780 DF PROTO=TCP SPT=46282 DPT=9102 SEQ=478220876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCEC9070000000001030307) Dec 6 05:06:27 localhost sshd[242416]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:27 localhost nova_compute[237281]: 2025-12-06 10:06:27.748 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:27 localhost nova_compute[237281]: 2025-12-06 10:06:27.951 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46767 DF PROTO=TCP SPT=50812 DPT=9102 SEQ=4223708546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCECD870000000001030307) Dec 6 05:06:28 localhost sshd[242418]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:29 localhost sshd[242420]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:30 localhost 
sshd[242422]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45781 DF PROTO=TCP SPT=46282 DPT=9102 SEQ=478220876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCED8C80000000001030307) Dec 6 05:06:31 localhost sshd[242424]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:32 localhost nova_compute[237281]: 2025-12-06 10:06:31.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:06:32 localhost nova_compute[237281]: 2025-12-06 10:06:32.776 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:32 localhost nova_compute[237281]: 2025-12-06 10:06:32.955 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:33 localhost sshd[242426]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:33 localhost nova_compute[237281]: 2025-12-06 10:06:33.903 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:06:34 localhost sshd[242428]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:34 localhost nova_compute[237281]: 2025-12-06 10:06:34.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:06:34 localhost nova_compute[237281]: 2025-12-06 10:06:34.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:06:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:06:35 localhost podman[242430]: 2025-12-06 10:06:35.624806065 +0000 UTC m=+0.078073358 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 05:06:35 localhost podman[242430]: 2025-12-06 10:06:35.663290893 +0000 UTC m=+0.116558196 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
config_id=ovn_controller) Dec 6 05:06:35 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:06:35 localhost nova_compute[237281]: 2025-12-06 10:06:35.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:06:35 localhost nova_compute[237281]: 2025-12-06 10:06:35.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:06:35 localhost nova_compute[237281]: 2025-12-06 10:06:35.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:06:35 localhost sshd[242456]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:36 localhost nova_compute[237281]: 2025-12-06 10:06:36.415 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:06:36 localhost nova_compute[237281]: 2025-12-06 10:06:36.415 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:06:36 localhost nova_compute[237281]: 2025-12-06 10:06:36.415 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: 
a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:06:36 localhost nova_compute[237281]: 2025-12-06 10:06:36.416 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:06:36 localhost sshd[242458]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:37 localhost nova_compute[237281]: 2025-12-06 10:06:37.819 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:37 localhost nova_compute[237281]: 2025-12-06 10:06:37.959 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:06:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:06:38 localhost podman[242460]: 2025-12-06 10:06:38.548835242 +0000 UTC m=+0.076744929 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:06:38 localhost podman[242460]: 2025-12-06 10:06:38.559311403 +0000 UTC m=+0.087221080 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:06:38 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:06:38 localhost podman[242461]: 2025-12-06 10:06:38.613919862 +0000 UTC m=+0.136732993 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:06:38 localhost podman[242461]: 2025-12-06 10:06:38.624732953 +0000 UTC m=+0.147546084 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:06:38 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:06:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45782 DF PROTO=TCP SPT=46282 DPT=9102 SEQ=478220876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCEF9880000000001030307) Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.599 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.627 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.628 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.628 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.628 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.629 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running 
periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.907 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.908 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.908 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.909 237285 
DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:06:41 localhost nova_compute[237281]: 2025-12-06 10:06:41.990 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.064 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.066 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.123 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.124 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.175 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.177 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.253 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.418 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.419 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12670MB free_disk=387.30888748168945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.419 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.419 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.556 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.556 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.556 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.619 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing inventories for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.696 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating ProviderTree inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 6 05:06:42 
localhost nova_compute[237281]: 2025-12-06 10:06:42.697 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.717 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing aggregate associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.755 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing trait associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, traits: 
COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_ABM,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.818 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.841 237285 DEBUG nova.scheduler.client.report [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.844 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.845 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.860 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:42 localhost nova_compute[237281]: 2025-12-06 10:06:42.961 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:43 localhost nova_compute[237281]: 2025-12-06 10:06:43.845 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:06:44 localhost nova_compute[237281]: 2025-12-06 10:06:44.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:06:44 localhost nova_compute[237281]: 2025-12-06 10:06:44.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 6 05:06:44 localhost nova_compute[237281]: 2025-12-06 10:06:44.915 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 6 05:06:46 localhost openstack_network_exporter[199751]: ERROR 10:06:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:06:46 localhost openstack_network_exporter[199751]: ERROR 10:06:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:06:46 localhost openstack_network_exporter[199751]: ERROR 10:06:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:06:46 localhost openstack_network_exporter[199751]: ERROR 10:06:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:06:46 localhost openstack_network_exporter[199751]: Dec 6 05:06:46 localhost openstack_network_exporter[199751]: ERROR 10:06:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:06:46 localhost openstack_network_exporter[199751]: Dec 6 05:06:46 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:06:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:06:46 localhost podman[242514]: 2025-12-06 10:06:46.570669234 +0000 UTC m=+0.093794809 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute) Dec 6 05:06:46 localhost podman[242514]: 2025-12-06 10:06:46.582406852 +0000 UTC m=+0.105532427 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:06:46 localhost podman[242513]: 2025-12-06 10:06:46.621025334 +0000 UTC m=+0.146048068 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:06:46 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:06:46 localhost podman[242513]: 2025-12-06 10:06:46.650562618 +0000 UTC m=+0.175585322 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, tcib_managed=true) Dec 6 05:06:46 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:06:47 localhost nova_compute[237281]: 2025-12-06 10:06:47.862 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:47 localhost nova_compute[237281]: 2025-12-06 10:06:47.963 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:48 localhost nova_compute[237281]: 2025-12-06 10:06:48.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:06:48 localhost nova_compute[237281]: 2025-12-06 10:06:48.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 6 05:06:51 localhost sshd[242549]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:06:52 localhost nova_compute[237281]: 2025-12-06 10:06:52.911 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:52 localhost nova_compute[237281]: 2025-12-06 10:06:52.965 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:53 localhost podman[197801]: time="2025-12-06T10:06:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:06:53 localhost podman[197801]: @ - - [06/Dec/2025:10:06:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:06:53 localhost podman[197801]: @ - 
- [06/Dec/2025:10:06:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15469 "" "Go-http-client/1.1" Dec 6 05:06:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44157 DF PROTO=TCP SPT=45854 DPT=9102 SEQ=331947997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCF32390000000001030307) Dec 6 05:06:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:06:54 localhost podman[242551]: 2025-12-06 10:06:54.553464565 +0000 UTC m=+0.086062944 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.) Dec 6 05:06:54 localhost podman[242551]: 2025-12-06 10:06:54.565297556 +0000 UTC m=+0.097895945 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible) Dec 6 05:06:54 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:06:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44158 DF PROTO=TCP SPT=45854 DPT=9102 SEQ=331947997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCF36470000000001030307) Dec 6 05:06:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45783 DF PROTO=TCP SPT=46282 DPT=9102 SEQ=478220876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCF39880000000001030307) Dec 6 05:06:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:06:56 localhost podman[242570]: 2025-12-06 10:06:56.540760392 +0000 UTC m=+0.078045008 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:06:56 localhost podman[242570]: 2025-12-06 10:06:56.55113611 +0000 UTC m=+0.088420726 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:06:56 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:06:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44159 DF PROTO=TCP SPT=45854 DPT=9102 SEQ=331947997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCF3E470000000001030307) Dec 6 05:06:57 localhost nova_compute[237281]: 2025-12-06 10:06:57.914 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:06:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46860 DF PROTO=TCP SPT=37396 DPT=9102 SEQ=3100792079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCF41870000000001030307) Dec 6 05:06:57 localhost nova_compute[237281]: 2025-12-06 10:06:57.967 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44160 DF PROTO=TCP SPT=45854 DPT=9102 SEQ=331947997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCF4E070000000001030307) Dec 6 05:07:02 localhost nova_compute[237281]: 2025-12-06 10:07:02.959 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:02 localhost nova_compute[237281]: 2025-12-06 10:07:02.970 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:07:06 localhost podman[242595]: 2025-12-06 10:07:06.55174963 +0000 UTC m=+0.080038440 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:07:06 localhost podman[242595]: 2025-12-06 10:07:06.590786273 +0000 UTC m=+0.119075083 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:07:06 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:07:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:07:06.689 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:07:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:07:06.690 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:07:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:07:06.691 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:07:07 localhost nova_compute[237281]: 2025-12-06 10:07:07.961 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:07 localhost nova_compute[237281]: 2025-12-06 10:07:07.973 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44161 DF PROTO=TCP SPT=45854 DPT=9102 SEQ=331947997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCF6D870000000001030307) Dec 6 05:07:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. 
Dec 6 05:07:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:07:09 localhost podman[242620]: 2025-12-06 10:07:09.539938039 +0000 UTC m=+0.067887618 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd) Dec 6 05:07:09 
localhost podman[242620]: 2025-12-06 10:07:09.550903753 +0000 UTC m=+0.078853342 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 6 05:07:09 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:07:09 localhost podman[242619]: 2025-12-06 10:07:09.608881817 +0000 UTC m=+0.138531998 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:07:09 localhost podman[242619]: 2025-12-06 10:07:09.615956844 +0000 UTC m=+0.145607015 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 
'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:07:09 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:07:12 localhost nova_compute[237281]: 2025-12-06 10:07:12.974 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:07:12 localhost nova_compute[237281]: 2025-12-06 10:07:12.976 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:07:12 localhost nova_compute[237281]: 2025-12-06 10:07:12.977 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:07:12 localhost nova_compute[237281]: 2025-12-06 10:07:12.977 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:07:13 localhost nova_compute[237281]: 2025-12-06 10:07:12.999 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:13 localhost nova_compute[237281]: 2025-12-06 10:07:13.000 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:07:16 localhost openstack_network_exporter[199751]: ERROR 10:07:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:07:16 localhost openstack_network_exporter[199751]: ERROR 10:07:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:07:16 localhost openstack_network_exporter[199751]: ERROR 10:07:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:07:16 localhost openstack_network_exporter[199751]: ERROR 10:07:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:07:16 localhost openstack_network_exporter[199751]: Dec 6 05:07:16 localhost openstack_network_exporter[199751]: ERROR 10:07:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:07:16 localhost openstack_network_exporter[199751]: Dec 6 05:07:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:07:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:07:17 localhost podman[242664]: 2025-12-06 10:07:17.551676992 +0000 UTC m=+0.080472841 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team) Dec 6 05:07:17 localhost podman[242664]: 2025-12-06 10:07:17.563081362 +0000 UTC m=+0.091877201 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:07:17 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:07:17 localhost podman[242663]: 2025-12-06 10:07:17.654108756 +0000 UTC m=+0.186411893 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 05:07:17 localhost podman[242663]: 2025-12-06 10:07:17.687987852 +0000 UTC m=+0.220290979 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 05:07:17 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:07:18 localhost nova_compute[237281]: 2025-12-06 10:07:18.002 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:07:18 localhost nova_compute[237281]: 2025-12-06 10:07:18.004 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:07:23 localhost nova_compute[237281]: 2025-12-06 10:07:23.004 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:07:23 localhost nova_compute[237281]: 2025-12-06 10:07:23.005 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:07:23 localhost nova_compute[237281]: 2025-12-06 10:07:23.006 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:07:23 localhost nova_compute[237281]: 2025-12-06 10:07:23.006 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:07:23 localhost nova_compute[237281]: 2025-12-06 10:07:23.035 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:23 localhost nova_compute[237281]: 2025-12-06 10:07:23.036 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:07:23 localhost podman[197801]: time="2025-12-06T10:07:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:07:23 localhost podman[197801]: @ - - [06/Dec/2025:10:07:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:07:23 localhost podman[197801]: @ - - [06/Dec/2025:10:07:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15468 "" "Go-http-client/1.1" Dec 6 05:07:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53191 DF PROTO=TCP SPT=56410 DPT=9102 SEQ=2937584635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCFA7690000000001030307) Dec 6 05:07:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53192 DF PROTO=TCP SPT=56410 DPT=9102 SEQ=2937584635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCFAB870000000001030307) Dec 6 05:07:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:07:25 localhost podman[242699]: 2025-12-06 10:07:25.531046947 +0000 UTC m=+0.066284699 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7) Dec 6 05:07:25 localhost podman[242699]: 2025-12-06 10:07:25.543330722 +0000 UTC m=+0.078568474 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal) Dec 6 05:07:25 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:07:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44162 DF PROTO=TCP SPT=45854 DPT=9102 SEQ=331947997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCFAD870000000001030307) Dec 6 05:07:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53193 DF PROTO=TCP SPT=56410 DPT=9102 SEQ=2937584635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCFB3880000000001030307) Dec 6 05:07:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:07:27 localhost podman[242719]: 2025-12-06 10:07:27.542536074 +0000 UTC m=+0.072764806 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': 
{'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:07:27 localhost podman[242719]: 2025-12-06 10:07:27.549709453 +0000 UTC m=+0.079938135 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:07:27 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:07:28 localhost nova_compute[237281]: 2025-12-06 10:07:28.037 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45784 DF PROTO=TCP SPT=46282 DPT=9102 SEQ=478220876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCFB7870000000001030307) Dec 6 05:07:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53194 DF PROTO=TCP SPT=56410 DPT=9102 SEQ=2937584635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCFC3470000000001030307) Dec 6 05:07:33 localhost nova_compute[237281]: 2025-12-06 10:07:33.040 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:07:36 localhost nova_compute[237281]: 2025-12-06 10:07:36.117 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:07:36 localhost nova_compute[237281]: 2025-12-06 10:07:36.117 237285 DEBUG oslo_service.periodic_task [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:07:36 localhost nova_compute[237281]: 2025-12-06 10:07:36.118 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:07:36 localhost nova_compute[237281]: 2025-12-06 10:07:36.118 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:07:36 localhost nova_compute[237281]: 2025-12-06 10:07:36.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:07:36 localhost nova_compute[237281]: 2025-12-06 10:07:36.888 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:07:36 localhost nova_compute[237281]: 2025-12-06 10:07:36.888 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:07:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:07:37 localhost nova_compute[237281]: 2025-12-06 10:07:37.474 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:07:37 localhost nova_compute[237281]: 2025-12-06 10:07:37.474 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:07:37 localhost nova_compute[237281]: 2025-12-06 10:07:37.475 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:07:37 localhost nova_compute[237281]: 2025-12-06 10:07:37.475 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:07:37 localhost systemd[1]: tmp-crun.Wu6H17.mount: Deactivated successfully. 
Dec 6 05:07:37 localhost podman[242743]: 2025-12-06 10:07:37.566931379 +0000 UTC m=+0.094659896 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:07:37 localhost podman[242743]: 2025-12-06 10:07:37.633381782 +0000 UTC m=+0.161110329 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 05:07:37 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:07:38 localhost nova_compute[237281]: 2025-12-06 10:07:38.043 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:38 localhost nova_compute[237281]: 2025-12-06 10:07:38.668 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:07:38 localhost nova_compute[237281]: 2025-12-06 10:07:38.686 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 
05:07:38 localhost nova_compute[237281]: 2025-12-06 10:07:38.686 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:07:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53195 DF PROTO=TCP SPT=56410 DPT=9102 SEQ=2937584635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DCFE3870000000001030307) Dec 6 05:07:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:07:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:07:40 localhost systemd[1]: tmp-crun.yvmgdX.mount: Deactivated successfully. 
Dec 6 05:07:40 localhost podman[242768]: 2025-12-06 10:07:40.563349348 +0000 UTC m=+0.092713536 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:07:40 localhost podman[242768]: 2025-12-06 10:07:40.569416253 +0000 UTC m=+0.098780471 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:07:40 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:07:40 localhost podman[242769]: 2025-12-06 10:07:40.66282649 +0000 UTC m=+0.188235528 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 6 05:07:40 localhost nova_compute[237281]: 2025-12-06 10:07:40.680 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:07:40 localhost nova_compute[237281]: 2025-12-06 10:07:40.681 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:07:40 localhost podman[242769]: 2025-12-06 10:07:40.702811503 +0000 UTC m=+0.228220521 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:07:40 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:07:40 localhost nova_compute[237281]: 2025-12-06 10:07:40.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:07:40 localhost nova_compute[237281]: 2025-12-06 10:07:40.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:07:42 localhost nova_compute[237281]: 2025-12-06 10:07:42.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:07:43 localhost nova_compute[237281]: 2025-12-06 10:07:43.047 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:07:43 localhost nova_compute[237281]: 2025-12-06 10:07:43.048 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:07:43 localhost nova_compute[237281]: 2025-12-06 10:07:43.048 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:07:43 localhost nova_compute[237281]: 2025-12-06 10:07:43.048 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:07:43 localhost nova_compute[237281]: 2025-12-06 10:07:43.081 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:43 localhost nova_compute[237281]: 2025-12-06 10:07:43.082 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:07:43 localhost nova_compute[237281]: 2025-12-06 10:07:43.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.201 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.201 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.202 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.202 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test 
(node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.267 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.341 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.342 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.395 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.053s 
execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.397 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.474 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.475 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.542 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.066s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.743 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.745 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12664MB free_disk=387.30888748168945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.745 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.746 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.824 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.824 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.825 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.866 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.885 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:07:44 localhost 
nova_compute[237281]: 2025-12-06 10:07:44.887 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:07:44 localhost nova_compute[237281]: 2025-12-06 10:07:44.888 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:07:46 localhost openstack_network_exporter[199751]: ERROR 10:07:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:07:46 localhost openstack_network_exporter[199751]: ERROR 10:07:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:07:46 localhost openstack_network_exporter[199751]: ERROR 10:07:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:07:46 localhost openstack_network_exporter[199751]: ERROR 10:07:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:07:46 localhost openstack_network_exporter[199751]: Dec 6 05:07:46 localhost openstack_network_exporter[199751]: ERROR 10:07:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:07:46 localhost openstack_network_exporter[199751]: Dec 6 05:07:48 localhost nova_compute[237281]: 2025-12-06 10:07:48.084 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:07:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:07:48 localhost systemd[1]: tmp-crun.6kOslt.mount: Deactivated successfully. Dec 6 05:07:48 localhost podman[242821]: 2025-12-06 10:07:48.544031434 +0000 UTC m=+0.077026407 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:07:48 localhost podman[242821]: 2025-12-06 10:07:48.552257486 +0000 UTC m=+0.085252509 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 6 05:07:48 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:07:48 localhost podman[242822]: 2025-12-06 10:07:48.600962965 +0000 UTC m=+0.130368768 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0) Dec 6 05:07:48 localhost podman[242822]: 2025-12-06 10:07:48.613314293 +0000 UTC m=+0.142720126 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:07:48 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:07:53 localhost nova_compute[237281]: 2025-12-06 10:07:53.085 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:53 localhost nova_compute[237281]: 2025-12-06 10:07:53.088 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:53 localhost podman[197801]: time="2025-12-06T10:07:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:07:53 localhost podman[197801]: @ - - [06/Dec/2025:10:07:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:07:53 localhost podman[197801]: @ - - [06/Dec/2025:10:07:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15471 "" "Go-http-client/1.1" Dec 6 05:07:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40326 DF PROTO=TCP SPT=57424 DPT=9102 SEQ=1961414207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD01C990000000001030307) Dec 6 05:07:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40327 DF PROTO=TCP SPT=57424 DPT=9102 SEQ=1961414207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD020870000000001030307) Dec 6 05:07:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53196 DF PROTO=TCP SPT=56410 DPT=9102 SEQ=2937584635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD023880000000001030307) Dec 6 
05:07:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:07:56 localhost podman[242854]: 2025-12-06 10:07:56.561638899 +0000 UTC m=+0.091402457 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Dec 6 05:07:56 localhost podman[242854]: 2025-12-06 10:07:56.598561478 +0000 UTC m=+0.128324976 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public) Dec 6 05:07:56 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:07:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40328 DF PROTO=TCP SPT=57424 DPT=9102 SEQ=1961414207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD028870000000001030307) Dec 6 05:07:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44163 DF PROTO=TCP SPT=45854 DPT=9102 SEQ=331947997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD02B870000000001030307) Dec 6 05:07:58 localhost nova_compute[237281]: 2025-12-06 10:07:58.089 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:07:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:07:58 localhost podman[242875]: 2025-12-06 10:07:58.552877926 +0000 UTC m=+0.086928040 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:07:58 localhost podman[242875]: 2025-12-06 10:07:58.563183821 +0000 UTC m=+0.097233935 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:07:58 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:08:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40329 DF PROTO=TCP SPT=57424 DPT=9102 SEQ=1961414207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD038470000000001030307) Dec 6 05:08:03 localhost nova_compute[237281]: 2025-12-06 10:08:03.091 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:08:06.690 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:08:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:08:06.691 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:08:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:08:06.691 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:08:08 localhost nova_compute[237281]: 2025-12-06 10:08:08.094 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:08:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:08:08 localhost podman[242899]: 2025-12-06 10:08:08.55724717 +0000 UTC m=+0.086848067 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:08:08 localhost podman[242899]: 2025-12-06 10:08:08.59941722 +0000 UTC m=+0.129018137 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:08:08 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:08:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40330 DF PROTO=TCP SPT=57424 DPT=9102 SEQ=1961414207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD059880000000001030307) Dec 6 05:08:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:08:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:08:11 localhost podman[242925]: 2025-12-06 10:08:11.550876065 +0000 UTC m=+0.078184972 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:08:11 localhost podman[242925]: 2025-12-06 
10:08:11.584356549 +0000 UTC m=+0.111665506 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:08:11 localhost systemd[1]: tmp-crun.2fGyfQ.mount: Deactivated successfully. 
Dec 6 05:08:11 localhost podman[242926]: 2025-12-06 10:08:11.604546486 +0000 UTC m=+0.128480449 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 05:08:11 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:08:11 localhost podman[242926]: 2025-12-06 10:08:11.610998555 +0000 UTC m=+0.134932548 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:08:11 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:08:13 localhost nova_compute[237281]: 2025-12-06 10:08:13.096 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:16 localhost openstack_network_exporter[199751]: ERROR 10:08:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:08:16 localhost openstack_network_exporter[199751]: ERROR 10:08:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:08:16 localhost openstack_network_exporter[199751]: ERROR 10:08:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:08:16 localhost openstack_network_exporter[199751]: ERROR 10:08:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:08:16 localhost openstack_network_exporter[199751]: Dec 6 05:08:16 localhost openstack_network_exporter[199751]: ERROR 10:08:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:08:16 localhost openstack_network_exporter[199751]: Dec 6 05:08:18 localhost nova_compute[237281]: 2025-12-06 10:08:18.098 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:08:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:08:19 localhost podman[242966]: 2025-12-06 10:08:19.54134779 +0000 UTC m=+0.073151199 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:08:19 localhost podman[242966]: 2025-12-06 10:08:19.573185353 +0000 UTC m=+0.104988732 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:08:19 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:08:19 localhost podman[242967]: 2025-12-06 10:08:19.646612759 +0000 UTC m=+0.174813237 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:08:19 localhost podman[242967]: 2025-12-06 10:08:19.657526302 +0000 UTC m=+0.185726760 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:08:19 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:08:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:22.988 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:08:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:22.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.006 12 DEBUG ceilometer.compute.pollsters [-] 
a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.006 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14b5f734-7f12-4e06-b57c-54eccec51d39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:08:22.988981', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7f82f88c-d28b-11f0-8fed-fa163edf398d', 
'monotonic_time': 12090.199055608, 'message_signature': '80db10943867e0f6703e32ef23eae01cafc68aae02441453d750da594908f821'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:08:22.988981', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7f8301d8-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.199055608, 'message_signature': '64236817570c78da247a432a93ea92545f54adcd02f52dd670ee942384acac24'}]}, 'timestamp': '2025-12-06 10:08:23.006567', '_unique_id': '194997f7a32e4df9b38bb65e9fee267e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.007 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.008 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.011 12 DEBUG ceilometer.compute.pollsters [-] 
a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ca2b668-9ce3-42c1-bf30-8d9509058be0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:08:23.008266', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7f83c622-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.218339478, 'message_signature': '54a1b47e9de83c671cb727e6f20c501c93a5ef2177381bef4a7dfcada881de8b'}]}, 'timestamp': '2025-12-06 
10:08:23.011597', '_unique_id': '05fb34348e434137b370a47b34b3bf8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 
6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:08:23.012 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.012 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c63ca4b-b382-4c21-929f-f5949991c9c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:08:23.012631', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7f83f732-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.218339478, 'message_signature': '987fcd39e478c4bec7f3747435a261de3e69f8db1b7582c0b0bb2bac594beae3'}]}, 'timestamp': '2025-12-06 10:08:23.012865', '_unique_id': '125b98421ef54655bec001e8d83784ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.013 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '54657da4-cc5e-481d-bfb9-2924fd36d105', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:08:23.013890', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7f8428ba-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.218339478, 'message_signature': '97ca4707ca4e48328a14fcee0087b28bf1c4abbd15db528712f22dfe4ba9bcab'}]}, 'timestamp': '2025-12-06 10:08:23.014120', '_unique_id': 'f69e9240f3ec4fdd8925bb18df707936'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.014 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:08:23.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.048 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.048 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb697c58-7a4a-4b3e-878c-065a6b3b95f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:08:23.015108', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7f896b72-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.225180198, 'message_signature': 'e25b3aedf9a301c97cd056e439348a8c319d128603f48fc59cc5214e2f56bc1f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:08:23.015108', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7f897efa-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.225180198, 'message_signature': 'f030ce331294143892c075a4f88e20c9f40a5ae0a4f8c8af0bb4854e8f9bb4da'}]}, 'timestamp': '2025-12-06 10:08:23.049198', '_unique_id': '237c8549931740a5b32aa205c736ee02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.050 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:08:23.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.051 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.052 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d4421d8-4320-48bf-bc21-60d239fe2660', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:08:23.051673', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7f89f222-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.199055608, 'message_signature': 'a53316c0d77553330ba5abea3f5726e1d83b9b05f789de3af9352777cb08ebfe'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:08:23.051673', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7f8a02a8-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.199055608, 'message_signature': '2f0a377af9e0fe5a61c728787cdf4049b44188a533cbb8371c2d35543f18f30c'}]}, 'timestamp': '2025-12-06 10:08:23.052566', '_unique_id': 'e609651ff58e4f2384c825707468bfbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): 
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.053 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:08:23.054 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.054 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.055 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f2d3aa8-6ffb-4d7f-bc94-bf5f4b5bed12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:08:23.054789', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7f8a6c48-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.225180198, 'message_signature': '5a341fd66afa5850fd9a4d063cf83f1406596aef08603af58efa3d64a704b2a2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:08:23.054789', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7f8a7cce-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.225180198, 'message_signature': '0d485330789592a28c159bcc272d439d64296c5cc16df4df731344bc16393dec'}]}, 'timestamp': '2025-12-06 10:08:23.055688', '_unique_id': 'd09c1fb340264b068a73c9090c11d4e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.056 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.057 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.074 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86f69ce6-b976-4655-931c-fdd2cfe36ace', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:08:23.057918', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '7f8d5e8a-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.28409965, 'message_signature': '9eb0a9c81a038c722162749f0d04049a5892bc204fbe18e07587015c00510f13'}]}, 'timestamp': '2025-12-06 10:08:23.074604', '_unique_id': 'c62607f89a3e47e9b6122587e1240674'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.075 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.076 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.076 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f727e618-a4f2-4282-9498-ec727d6c83dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:08:23.076865', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7f8dca28-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.218339478, 'message_signature': 'd5497fe73fc38002d899f8a05cb88f70e5d81aab20b850ba37eb576b09de5da5'}]}, 'timestamp': '2025-12-06 10:08:23.077364', '_unique_id': 'f0ff65036a7e4d0d9565d0f05d35415b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.078 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.079 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.079 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01cb7429-b89a-4971-b872-65a17638bd9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:08:23.079719', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7f8e3a1c-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.218339478, 'message_signature': 'a66e37e5e9a067525237973f125028d6e72a35bb7e1a09714a41dc2051f55494'}]}, 'timestamp': '2025-12-06 10:08:23.080229', '_unique_id': 'e9d938cae2aa46b89a60f8a53d715873'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.082 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4a5978ee-80ac-44d7-ab48-4f6953a99ef2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:08:23.082395', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7f8ea13c-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.218339478, 'message_signature': 'effe08465e6696ebb7b46ccb1c7fdd05f6b3708e71fd0f3756938b50b1600241'}]}, 'timestamp': '2025-12-06 10:08:23.082896', '_unique_id': 'd0b581858ed24263991bc2315f3538ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:08:23.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.085 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.085 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.085 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3d6a5ec9-23cb-4308-8700-66da2827d242', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:08:23.085185', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7f8f0e24-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.225180198, 'message_signature': '4c74b6faa7f8b0d1d050e34dd0a580f8b14667c0971f7c771d9f483450bcba1b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:08:23.085185', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7f8f208a-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.225180198, 'message_signature': 'c09793ef840f9cac37bd2630a6d076fea004e5a076cc5bb9dbb218de287dec56'}]}, 'timestamp': '2025-12-06 10:08:23.086098', '_unique_id': '6375fd76a6a942c2aaed47a00365b1d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.087 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.088 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.088 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09eedca3-0206-4f94-9ee5-605efe7567ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:08:23.088524', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7f8f929a-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.218339478, 'message_signature': 'b5763190ba21ea0f9fab1c337371ebab8096830bbb0ff059f514ff3fad7935b2'}]}, 'timestamp': '2025-12-06 10:08:23.089078', '_unique_id': 'b2d80dad05a84f86920821bbc692f6d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.090 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.091 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.091 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.091 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c67c82db-d155-44bd-ad32-57482451013e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:08:23.091294', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7f8ffcee-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.225180198, 'message_signature': 'e58e786f420bfe6ca41aa38da62e9d2c8fdde68491e8d30e55c066affca8bfbf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:08:23.091294', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7f900e46-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.225180198, 'message_signature': '93dd4105e2c33cf03ee1037e6408dfad40fb9d0ec874d2ea24f6878d79460611'}]}, 'timestamp': '2025-12-06 10:08:23.092182', '_unique_id': '742f2e1a72794916873f23f309a8e005'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.093 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.094 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.094 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 14810000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd08e634-965b-484c-803b-19e779fe2f9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14810000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:08:23.094358', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '7f90746c-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.28409965, 'message_signature': 'bbd526695431f17bae9dc8f8610bb371fe46bf7b3bb2e57194f0b1094cea4bdc'}]}, 'timestamp': '2025-12-06 10:08:23.094807', '_unique_id': '92cffae0d55c4f01a8d51119d420b3f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.096 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.097 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '548f7cde-0685-400f-8227-2c9d87e22bef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:08:23.096986', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7f90dbfa-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.218339478, 'message_signature': '295e8bfe5d8548b352a66400a4f0fa73e01c693d152614ab608dc6edaaa099c4'}]}, 'timestamp': '2025-12-06 10:08:23.097538', '_unique_id': 'cbd1165e5c284212a3da38e4edae0e82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging return 
fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.099 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost nova_compute[237281]: 2025-12-06 10:08:23.100 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
notification to notifications. Payload={'message_id': '49dcc930-6ac0-41b6-b47e-86941895b438', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:08:23.099785', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7f9149dc-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.225180198, 'message_signature': 'd09fa4009fec36f4554fb4751d4a9fe8c03c812c59ca5a3182dfc8d03df3e8f8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:08:23.099785', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7f915a58-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.225180198, 'message_signature': '3f0f30ee0e85194790cd1d20de8558e15fd9a509efd4e7db8f811325b2e0ce8c'}]}, 'timestamp': '2025-12-06 10:08:23.100682', '_unique_id': '5776ee95d27248a7b73dfd492d6320c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.101 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.102 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0674d174-8c28-45cb-83d5-7256cb0820f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:08:23.102898', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7f91c240-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.218339478, 'message_signature': 'cc625c3156ce38294962f69f8feb879d93e5dbdf9111477755f2ac6a98778055'}]}, 'timestamp': '2025-12-06 10:08:23.103370', '_unique_id': '5f4a779b647f44fe905bb366883f16a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:08:23.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.106 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0ae3cf6-c1bb-4449-a73a-d987b974271b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:08:23.105956', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': 
{'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7f923a7c-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.218339478, 'message_signature': 'a30ae3634c9c647edf925bd84791652181f98ef090fd464c04ade6188989fbcc'}]}, 'timestamp': '2025-12-06 10:08:23.106456', '_unique_id': 'fc3e41149c144b64b5091d0b0009e325'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:08:23.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.108 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '886ffbfc-67d3-4bd6-b6c1-cf8e7ae06cf1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:08:23.108898', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7f92ad0e-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.225180198, 'message_signature': '4e271aaea792dbb1359385ef15bcad82b3f643ab62cfc4b3cb347434ccd30b1d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:08:23.108898', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7f92bf24-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.225180198, 'message_signature': 'dcf65cb946651165f56b717c4229854c62482a240309602512c3554730308e5f'}]}, 'timestamp': '2025-12-06 10:08:23.109823', '_unique_id': '392ed7f456e941de95d3a44d2b433cc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.111 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.111 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '89958550-e7e1-49c0-932d-c1e3a5298323', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:08:23.111441', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7f930c68-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.199055608, 'message_signature': 'f3889155ac0d02dbf44d74b4caa0657315c9a4ee2fe765c5554a9b755fe8d6a4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:08:23.111441', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7f9317f8-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12090.199055608, 'message_signature': '246b0468532d6c596c763c4492cd15b8b321e180382d4ed193239ed84043931e'}]}, 'timestamp': '2025-12-06 10:08:23.112014', '_unique_id': '790b7602d95e4626be88b052700b35bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:08:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:08:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:08:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 05:08:23 localhost podman[197801]: time="2025-12-06T10:08:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:08:23 localhost podman[197801]: @ - - [06/Dec/2025:10:08:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:08:23 localhost podman[197801]: @ - - [06/Dec/2025:10:08:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15460 "" "Go-http-client/1.1" Dec 6 05:08:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28417 DF PROTO=TCP SPT=56126 DPT=9102 SEQ=4173745252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD091C90000000001030307) Dec 6 05:08:25 
localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28418 DF PROTO=TCP SPT=56126 DPT=9102 SEQ=4173745252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD095C70000000001030307) Dec 6 05:08:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40331 DF PROTO=TCP SPT=57424 DPT=9102 SEQ=1961414207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD099870000000001030307) Dec 6 05:08:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28419 DF PROTO=TCP SPT=56126 DPT=9102 SEQ=4173745252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD09DC70000000001030307) Dec 6 05:08:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:08:27 localhost podman[243003]: 2025-12-06 10:08:27.546537773 +0000 UTC m=+0.081091451 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, architecture=x86_64, release=1755695350, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6) Dec 6 05:08:27 localhost podman[243003]: 2025-12-06 10:08:27.563278555 +0000 UTC m=+0.097832273 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter) Dec 6 05:08:27 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:08:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53197 DF PROTO=TCP SPT=56410 DPT=9102 SEQ=2937584635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD0A1870000000001030307) Dec 6 05:08:28 localhost nova_compute[237281]: 2025-12-06 10:08:28.103 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:08:29 localhost systemd[1]: tmp-crun.fcpaM3.mount: Deactivated successfully. 
Dec 6 05:08:29 localhost podman[243024]: 2025-12-06 10:08:29.551195151 +0000 UTC m=+0.084102623 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:08:29 localhost podman[243024]: 2025-12-06 10:08:29.561414743 +0000 UTC m=+0.094322215 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:08:29 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:08:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28420 DF PROTO=TCP SPT=56126 DPT=9102 SEQ=4173745252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD0AD880000000001030307) Dec 6 05:08:33 localhost nova_compute[237281]: 2025-12-06 10:08:33.104 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:08:33 localhost nova_compute[237281]: 2025-12-06 10:08:33.106 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:33 localhost nova_compute[237281]: 2025-12-06 10:08:33.107 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:08:33 localhost nova_compute[237281]: 2025-12-06 10:08:33.107 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:08:33 localhost nova_compute[237281]: 2025-12-06 10:08:33.107 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:08:33 localhost nova_compute[237281]: 2025-12-06 10:08:33.108 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:36 localhost nova_compute[237281]: 2025-12-06 10:08:36.888 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 
05:08:36 localhost nova_compute[237281]: 2025-12-06 10:08:36.889 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:08:37 localhost nova_compute[237281]: 2025-12-06 10:08:37.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:08:37 localhost nova_compute[237281]: 2025-12-06 10:08:37.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:08:37 localhost nova_compute[237281]: 2025-12-06 10:08:37.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:08:38 localhost nova_compute[237281]: 2025-12-06 10:08:38.109 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:38 localhost nova_compute[237281]: 2025-12-06 10:08:38.571 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:08:38 localhost nova_compute[237281]: 2025-12-06 10:08:38.572 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock 
"refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:08:38 localhost nova_compute[237281]: 2025-12-06 10:08:38.572 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:08:38 localhost nova_compute[237281]: 2025-12-06 10:08:38.573 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:08:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28421 DF PROTO=TCP SPT=56126 DPT=9102 SEQ=4173745252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD0CD880000000001030307) Dec 6 05:08:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:08:39 localhost podman[243047]: 2025-12-06 10:08:39.536705651 +0000 UTC m=+0.073955982 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:08:39 localhost podman[243047]: 2025-12-06 10:08:39.596326635 +0000 UTC m=+0.133576966 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:08:39 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:08:40 localhost nova_compute[237281]: 2025-12-06 10:08:40.620 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:08:40 localhost nova_compute[237281]: 2025-12-06 10:08:40.640 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:08:40 localhost nova_compute[237281]: 2025-12-06 10:08:40.641 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] 
Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:08:40 localhost nova_compute[237281]: 2025-12-06 10:08:40.642 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:08:40 localhost nova_compute[237281]: 2025-12-06 10:08:40.642 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:08:40 localhost nova_compute[237281]: 2025-12-06 10:08:40.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:08:40 localhost nova_compute[237281]: 2025-12-06 10:08:40.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:08:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:08:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:08:42 localhost podman[243073]: 2025-12-06 10:08:42.55007833 +0000 UTC m=+0.077573054 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:08:42 localhost podman[243073]: 2025-12-06 10:08:42.559331742 +0000 UTC m=+0.086826476 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:08:42 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:08:42 localhost podman[243074]: 2025-12-06 10:08:42.615209141 +0000 UTC m=+0.137277809 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd) Dec 6 05:08:42 localhost podman[243074]: 2025-12-06 10:08:42.627306311 +0000 UTC m=+0.149374999 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3) Dec 6 05:08:42 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:08:42 localhost nova_compute[237281]: 2025-12-06 10:08:42.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:08:42 localhost nova_compute[237281]: 2025-12-06 10:08:42.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:08:43 localhost nova_compute[237281]: 2025-12-06 10:08:43.112 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:44 localhost nova_compute[237281]: 2025-12-06 10:08:44.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:08:44 localhost nova_compute[237281]: 2025-12-06 10:08:44.910 237285 DEBUG oslo_concurrency.lockutils [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:08:44 localhost nova_compute[237281]: 2025-12-06 10:08:44.911 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:08:44 localhost nova_compute[237281]: 2025-12-06 10:08:44.911 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:08:44 localhost nova_compute[237281]: 2025-12-06 10:08:44.911 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:08:44 localhost nova_compute[237281]: 2025-12-06 10:08:44.973 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.048 237285 DEBUG oslo_concurrency.processutils [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.049 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.125 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.126 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.181 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - 
- - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.183 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.262 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.435 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.436 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12669MB free_disk=387.30888748168945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.436 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.436 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.533 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.534 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.534 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.593 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.615 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:08:45 localhost 
nova_compute[237281]: 2025-12-06 10:08:45.618 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:08:45 localhost nova_compute[237281]: 2025-12-06 10:08:45.618 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:08:46 localhost openstack_network_exporter[199751]: ERROR 10:08:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:08:46 localhost openstack_network_exporter[199751]: ERROR 10:08:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:08:46 localhost openstack_network_exporter[199751]: ERROR 10:08:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:08:46 localhost openstack_network_exporter[199751]: Dec 6 05:08:46 localhost openstack_network_exporter[199751]: ERROR 10:08:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:08:46 localhost openstack_network_exporter[199751]: ERROR 10:08:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:08:46 localhost openstack_network_exporter[199751]: Dec 6 05:08:48 localhost nova_compute[237281]: 2025-12-06 10:08:48.114 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:08:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:08:50 localhost podman[243126]: 2025-12-06 10:08:50.546381332 +0000 UTC m=+0.081137983 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2) Dec 6 05:08:50 localhost podman[243127]: 2025-12-06 10:08:50.61139263 +0000 UTC m=+0.142589681 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125) Dec 6 05:08:50 localhost podman[243126]: 2025-12-06 10:08:50.62574091 +0000 UTC m=+0.160497531 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 05:08:50 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:08:50 localhost podman[243127]: 2025-12-06 10:08:50.675950765 +0000 UTC m=+0.207147846 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Dec 6 05:08:50 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:08:53 localhost sshd[243162]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:08:53 localhost nova_compute[237281]: 2025-12-06 10:08:53.119 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:08:53 localhost nova_compute[237281]: 2025-12-06 10:08:53.121 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:08:53 localhost nova_compute[237281]: 2025-12-06 10:08:53.121 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:08:53 localhost nova_compute[237281]: 2025-12-06 10:08:53.122 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:08:53 localhost nova_compute[237281]: 2025-12-06 10:08:53.144 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:53 localhost nova_compute[237281]: 2025-12-06 10:08:53.145 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:08:53 localhost podman[197801]: time="2025-12-06T10:08:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:08:53 localhost podman[197801]: @ - - [06/Dec/2025:10:08:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:08:53 localhost podman[197801]: @ - - [06/Dec/2025:10:08:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15472 "" "Go-http-client/1.1" Dec 6 05:08:53 localhost 
kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60110 DF PROTO=TCP SPT=46414 DPT=9102 SEQ=2508753307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD106F90000000001030307) Dec 6 05:08:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60111 DF PROTO=TCP SPT=46414 DPT=9102 SEQ=2508753307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD10B070000000001030307) Dec 6 05:08:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28422 DF PROTO=TCP SPT=56126 DPT=9102 SEQ=4173745252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD10D870000000001030307) Dec 6 05:08:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60112 DF PROTO=TCP SPT=46414 DPT=9102 SEQ=2508753307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD113070000000001030307) Dec 6 05:08:58 localhost nova_compute[237281]: 2025-12-06 10:08:58.145 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:58 localhost nova_compute[237281]: 2025-12-06 10:08:58.148 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:08:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40332 DF PROTO=TCP SPT=57424 DPT=9102 SEQ=1961414207 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD117870000000001030307) Dec 6 05:08:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:08:58 localhost podman[243164]: 2025-12-06 10:08:58.547841282 +0000 UTC m=+0.082084162 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, 
release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_id=edpm, build-date=2025-08-20T13:12:41, architecture=x86_64, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9) Dec 6 05:08:58 localhost podman[243164]: 2025-12-06 10:08:58.562221962 +0000 UTC m=+0.096464852 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers) Dec 6 05:08:58 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:09:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:09:00 localhost podman[243185]: 2025-12-06 10:09:00.546577269 +0000 UTC m=+0.081666339 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:09:00 localhost podman[243185]: 2025-12-06 10:09:00.556206123 +0000 UTC m=+0.091295183 
container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:09:00 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:09:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60113 DF PROTO=TCP SPT=46414 DPT=9102 SEQ=2508753307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD122C80000000001030307) Dec 6 05:09:03 localhost nova_compute[237281]: 2025-12-06 10:09:03.149 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:03 localhost nova_compute[237281]: 2025-12-06 10:09:03.151 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:03 localhost nova_compute[237281]: 2025-12-06 10:09:03.151 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:09:03 localhost nova_compute[237281]: 2025-12-06 10:09:03.152 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:03 localhost nova_compute[237281]: 2025-12-06 10:09:03.164 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:03 localhost nova_compute[237281]: 2025-12-06 10:09:03.165 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:09:06.691 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:09:06 
localhost ovn_metadata_agent[137254]: 2025-12-06 10:09:06.692 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:09:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:09:06.693 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:09:08 localhost nova_compute[237281]: 2025-12-06 10:09:08.165 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:08 localhost nova_compute[237281]: 2025-12-06 10:09:08.167 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:08 localhost nova_compute[237281]: 2025-12-06 10:09:08.167 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:09:08 localhost nova_compute[237281]: 2025-12-06 10:09:08.168 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:08 localhost nova_compute[237281]: 2025-12-06 10:09:08.168 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:08 localhost nova_compute[237281]: 2025-12-06 10:09:08.170 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 
05:09:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60114 DF PROTO=TCP SPT=46414 DPT=9102 SEQ=2508753307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD143870000000001030307) Dec 6 05:09:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:09:10 localhost podman[243209]: 2025-12-06 10:09:10.554715338 +0000 UTC m=+0.082571793 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:09:10 localhost podman[243209]: 2025-12-06 10:09:10.628372494 +0000 UTC m=+0.156228929 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:09:10 localhost systemd[1]: 
da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:09:13 localhost nova_compute[237281]: 2025-12-06 10:09:13.217 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:13 localhost nova_compute[237281]: 2025-12-06 10:09:13.219 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:13 localhost nova_compute[237281]: 2025-12-06 10:09:13.219 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5048 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:09:13 localhost nova_compute[237281]: 2025-12-06 10:09:13.219 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:13 localhost nova_compute[237281]: 2025-12-06 10:09:13.221 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:13 localhost nova_compute[237281]: 2025-12-06 10:09:13.226 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:09:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:09:13 localhost podman[243234]: 2025-12-06 10:09:13.526052981 +0000 UTC m=+0.060162393 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd) Dec 6 05:09:13 localhost podman[243234]: 2025-12-06 10:09:13.537447211 +0000 UTC m=+0.071556623 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 05:09:13 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:09:13 localhost podman[243233]: 2025-12-06 10:09:13.576144632 +0000 UTC m=+0.113326888 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:09:13 localhost podman[243233]: 2025-12-06 10:09:13.607477937 +0000 UTC m=+0.144660213 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 
'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:09:13 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:09:16 localhost openstack_network_exporter[199751]: ERROR 10:09:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:09:16 localhost openstack_network_exporter[199751]: ERROR 10:09:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:09:16 localhost openstack_network_exporter[199751]: ERROR 10:09:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:09:16 localhost openstack_network_exporter[199751]: ERROR 10:09:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:09:16 localhost openstack_network_exporter[199751]: Dec 6 05:09:16 localhost openstack_network_exporter[199751]: ERROR 10:09:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:09:16 localhost openstack_network_exporter[199751]: Dec 6 05:09:18 localhost nova_compute[237281]: 2025-12-06 10:09:18.221 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:18 localhost nova_compute[237281]: 
2025-12-06 10:09:18.226 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:09:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:09:21 localhost podman[243275]: 2025-12-06 10:09:21.549283193 +0000 UTC m=+0.080810278 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:09:21 localhost podman[243275]: 2025-12-06 10:09:21.553977388 +0000 UTC m=+0.085504533 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 05:09:21 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:09:21 localhost podman[243276]: 2025-12-06 10:09:21.612600643 +0000 UTC m=+0.138450803 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute) Dec 6 05:09:21 localhost podman[243276]: 2025-12-06 10:09:21.652254323 +0000 UTC m=+0.178104443 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125) Dec 6 05:09:21 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:09:23 localhost nova_compute[237281]: 2025-12-06 10:09:23.227 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:23 localhost nova_compute[237281]: 2025-12-06 10:09:23.229 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:23 localhost nova_compute[237281]: 2025-12-06 10:09:23.229 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:09:23 localhost nova_compute[237281]: 2025-12-06 10:09:23.229 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:23 localhost nova_compute[237281]: 2025-12-06 10:09:23.249 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:23 localhost nova_compute[237281]: 2025-12-06 10:09:23.249 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:23 localhost podman[197801]: time="2025-12-06T10:09:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:09:23 localhost podman[197801]: @ - - [06/Dec/2025:10:09:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:09:23 localhost podman[197801]: @ - - 
[06/Dec/2025:10:09:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15477 "" "Go-http-client/1.1" Dec 6 05:09:23 localhost sshd[243313]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:09:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50216 DF PROTO=TCP SPT=40124 DPT=9102 SEQ=2586959832 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD17C2A0000000001030307) Dec 6 05:09:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50217 DF PROTO=TCP SPT=40124 DPT=9102 SEQ=2586959832 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD180480000000001030307) Dec 6 05:09:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60115 DF PROTO=TCP SPT=46414 DPT=9102 SEQ=2508753307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD183870000000001030307) Dec 6 05:09:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50218 DF PROTO=TCP SPT=40124 DPT=9102 SEQ=2586959832 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD188480000000001030307) Dec 6 05:09:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28423 DF PROTO=TCP SPT=56126 DPT=9102 SEQ=4173745252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD18B880000000001030307) Dec 6 05:09:28 localhost nova_compute[237281]: 2025-12-06 10:09:28.249 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:09:29 localhost podman[243315]: 2025-12-06 10:09:29.535652843 +0000 UTC m=+0.070532722 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64) Dec 6 05:09:29 localhost podman[243315]: 2025-12-06 10:09:29.545706323 +0000 UTC m=+0.080586232 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git) Dec 6 05:09:29 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:09:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50219 DF PROTO=TCP SPT=40124 DPT=9102 SEQ=2586959832 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD198070000000001030307) Dec 6 05:09:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:09:31 localhost podman[243335]: 2025-12-06 10:09:31.540137428 +0000 UTC m=+0.074968068 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': 
{'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:09:31 localhost podman[243335]: 2025-12-06 10:09:31.570552414 +0000 UTC m=+0.105383034 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 05:09:31 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:09:33 localhost nova_compute[237281]: 2025-12-06 10:09:33.251 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:33 localhost nova_compute[237281]: 2025-12-06 10:09:33.253 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:33 localhost nova_compute[237281]: 2025-12-06 10:09:33.253 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:09:33 localhost nova_compute[237281]: 2025-12-06 10:09:33.253 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:33 localhost nova_compute[237281]: 2025-12-06 10:09:33.296 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:33 localhost nova_compute[237281]: 2025-12-06 10:09:33.297 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:37 localhost nova_compute[237281]: 2025-12-06 10:09:37.619 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:09:37 localhost nova_compute[237281]: 2025-12-06 10:09:37.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:09:37 localhost nova_compute[237281]: 2025-12-06 10:09:37.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:09:37 localhost nova_compute[237281]: 2025-12-06 10:09:37.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:09:38 localhost nova_compute[237281]: 2025-12-06 10:09:38.296 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50220 DF PROTO=TCP SPT=40124 DPT=9102 SEQ=2586959832 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD1B7880000000001030307) Dec 6 05:09:39 localhost nova_compute[237281]: 2025-12-06 10:09:39.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:09:39 localhost nova_compute[237281]: 2025-12-06 10:09:39.887 
237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:09:39 localhost nova_compute[237281]: 2025-12-06 10:09:39.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:09:40 localhost nova_compute[237281]: 2025-12-06 10:09:40.585 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:09:40 localhost nova_compute[237281]: 2025-12-06 10:09:40.586 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:09:40 localhost nova_compute[237281]: 2025-12-06 10:09:40.586 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:09:40 localhost nova_compute[237281]: 2025-12-06 10:09:40.586 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:09:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:09:41 localhost podman[243357]: 2025-12-06 10:09:41.531305043 +0000 UTC m=+0.067718065 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:09:41 localhost podman[243357]: 2025-12-06 10:09:41.573427839 +0000 UTC m=+0.109840821 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:09:41 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:09:42 localhost nova_compute[237281]: 2025-12-06 10:09:42.043 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:09:42 localhost nova_compute[237281]: 2025-12-06 10:09:42.059 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:09:42 localhost nova_compute[237281]: 2025-12-06 10:09:42.059 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] 
Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:09:42 localhost nova_compute[237281]: 2025-12-06 10:09:42.060 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:09:42 localhost nova_compute[237281]: 2025-12-06 10:09:42.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:09:42 localhost nova_compute[237281]: 2025-12-06 10:09:42.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:09:42 localhost nova_compute[237281]: 2025-12-06 10:09:42.922 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:09:43 localhost nova_compute[237281]: 2025-12-06 10:09:43.301 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:43 localhost nova_compute[237281]: 2025-12-06 10:09:43.303 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:09:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:09:44 localhost podman[243383]: 2025-12-06 10:09:44.55216077 +0000 UTC m=+0.081187250 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:09:44 localhost podman[243383]: 2025-12-06 10:09:44.565210722 +0000 UTC m=+0.094237152 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd) Dec 6 05:09:44 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: 
Deactivated successfully. Dec 6 05:09:44 localhost podman[243382]: 2025-12-06 10:09:44.653903112 +0000 UTC m=+0.184400297 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:09:44 localhost podman[243382]: 2025-12-06 10:09:44.660100873 +0000 UTC m=+0.190598078 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:09:44 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:09:44 localhost nova_compute[237281]: 2025-12-06 10:09:44.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:09:45 localhost nova_compute[237281]: 2025-12-06 10:09:45.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:09:45 localhost nova_compute[237281]: 2025-12-06 10:09:45.915 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:09:45 localhost nova_compute[237281]: 2025-12-06 10:09:45.915 237285 DEBUG oslo_concurrency.lockutils [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:09:45 localhost nova_compute[237281]: 2025-12-06 10:09:45.916 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:09:45 localhost nova_compute[237281]: 2025-12-06 10:09:45.916 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:09:45 localhost nova_compute[237281]: 2025-12-06 10:09:45.984 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.057 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 
6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.059 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.113 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.114 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.163 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:09:46 localhost 
nova_compute[237281]: 2025-12-06 10:09:46.164 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:09:46 localhost openstack_network_exporter[199751]: ERROR 10:09:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:09:46 localhost openstack_network_exporter[199751]: ERROR 10:09:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:09:46 localhost openstack_network_exporter[199751]: ERROR 10:09:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:09:46 localhost openstack_network_exporter[199751]: ERROR 10:09:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:09:46 localhost openstack_network_exporter[199751]: Dec 6 05:09:46 localhost openstack_network_exporter[199751]: ERROR 10:09:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:09:46 localhost openstack_network_exporter[199751]: Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.210 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.434 
237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.436 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12658MB free_disk=387.30888748168945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", 
"vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.437 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.437 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.533 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.533 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.534 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.571 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.592 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:09:46 localhost 
nova_compute[237281]: 2025-12-06 10:09:46.593 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:09:46 localhost nova_compute[237281]: 2025-12-06 10:09:46.593 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:09:48 localhost nova_compute[237281]: 2025-12-06 10:09:48.304 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:48 localhost nova_compute[237281]: 2025-12-06 10:09:48.306 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:09:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:09:52 localhost podman[243436]: 2025-12-06 10:09:52.563713454 +0000 UTC m=+0.089686271 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125) Dec 6 05:09:52 localhost podman[243436]: 2025-12-06 10:09:52.574238598 +0000 UTC m=+0.100211475 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:09:52 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:09:52 localhost podman[243437]: 2025-12-06 10:09:52.623082481 +0000 UTC m=+0.143072284 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Dec 6 05:09:52 localhost podman[243437]: 2025-12-06 10:09:52.637284699 +0000 UTC m=+0.157274562 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:09:52 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:09:53 localhost podman[197801]: time="2025-12-06T10:09:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:09:53 localhost nova_compute[237281]: 2025-12-06 10:09:53.308 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:53 localhost nova_compute[237281]: 2025-12-06 10:09:53.310 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:53 localhost nova_compute[237281]: 2025-12-06 10:09:53.311 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:09:53 localhost nova_compute[237281]: 2025-12-06 10:09:53.311 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:53 localhost podman[197801]: @ - - [06/Dec/2025:10:09:53 
+0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:09:53 localhost nova_compute[237281]: 2025-12-06 10:09:53.347 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:53 localhost nova_compute[237281]: 2025-12-06 10:09:53.348 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:53 localhost podman[197801]: @ - - [06/Dec/2025:10:09:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15471 "" "Go-http-client/1.1" Dec 6 05:09:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19227 DF PROTO=TCP SPT=36172 DPT=9102 SEQ=1718763582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD1F1590000000001030307) Dec 6 05:09:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19228 DF PROTO=TCP SPT=36172 DPT=9102 SEQ=1718763582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD1F5470000000001030307) Dec 6 05:09:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50221 DF PROTO=TCP SPT=40124 DPT=9102 SEQ=2586959832 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD1F7870000000001030307) Dec 6 05:09:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19229 DF 
PROTO=TCP SPT=36172 DPT=9102 SEQ=1718763582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD1FD480000000001030307) Dec 6 05:09:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60116 DF PROTO=TCP SPT=46414 DPT=9102 SEQ=2508753307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD201880000000001030307) Dec 6 05:09:58 localhost nova_compute[237281]: 2025-12-06 10:09:58.349 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:09:58 localhost nova_compute[237281]: 2025-12-06 10:09:58.350 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:09:58 localhost nova_compute[237281]: 2025-12-06 10:09:58.351 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:09:58 localhost nova_compute[237281]: 2025-12-06 10:09:58.351 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:58 localhost nova_compute[237281]: 2025-12-06 10:09:58.352 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:09:58 localhost nova_compute[237281]: 2025-12-06 10:09:58.355 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:10:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:10:00 localhost podman[243473]: 2025-12-06 10:10:00.543680216 +0000 UTC m=+0.080721386 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64) Dec 6 05:10:00 localhost podman[243473]: 2025-12-06 10:10:00.557800481 +0000 UTC m=+0.094841621 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers) Dec 6 05:10:00 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:10:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19230 DF PROTO=TCP SPT=36172 DPT=9102 SEQ=1718763582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD20D080000000001030307) Dec 6 05:10:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:10:02 localhost podman[243493]: 2025-12-06 10:10:02.545692416 +0000 UTC m=+0.079268542 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:10:02 localhost podman[243493]: 2025-12-06 10:10:02.578545906 +0000 UTC m=+0.112122032 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The 
Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:10:02 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:10:03 localhost nova_compute[237281]: 2025-12-06 10:10:03.357 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:03 localhost nova_compute[237281]: 2025-12-06 10:10:03.359 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:03 localhost nova_compute[237281]: 2025-12-06 10:10:03.359 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:10:03 localhost nova_compute[237281]: 2025-12-06 10:10:03.360 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:10:03 localhost nova_compute[237281]: 2025-12-06 10:10:03.382 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:10:03 localhost nova_compute[237281]: 2025-12-06 10:10:03.382 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:10:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:10:06.693 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:10:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:10:06.694 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:10:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:10:06.695 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:10:08 localhost nova_compute[237281]: 2025-12-06 10:10:08.384 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19231 DF PROTO=TCP SPT=36172 DPT=9102 SEQ=1718763582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD22D870000000001030307) Dec 6 05:10:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:10:12 localhost systemd[1]: tmp-crun.RoKknP.mount: Deactivated successfully. 
Dec 6 05:10:12 localhost podman[243515]: 2025-12-06 10:10:12.553585894 +0000 UTC m=+0.086276427 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125) Dec 6 05:10:12 localhost podman[243515]: 2025-12-06 10:10:12.590475349 +0000 UTC m=+0.123165852 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 6 05:10:12 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:10:13 localhost nova_compute[237281]: 2025-12-06 10:10:13.386 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:13 localhost nova_compute[237281]: 2025-12-06 10:10:13.427 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:13 localhost nova_compute[237281]: 2025-12-06 10:10:13.428 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:10:13 localhost nova_compute[237281]: 2025-12-06 10:10:13.428 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:10:13 localhost nova_compute[237281]: 2025-12-06 10:10:13.429 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:10:13 localhost nova_compute[237281]: 2025-12-06 10:10:13.431 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:10:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:10:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:10:15 localhost podman[243541]: 2025-12-06 10:10:15.550047962 +0000 UTC m=+0.081592734 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:10:15 localhost podman[243541]: 2025-12-06 10:10:15.556096528 +0000 UTC m=+0.087641300 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:10:15 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:10:15 localhost podman[243542]: 2025-12-06 10:10:15.608311444 +0000 UTC m=+0.135670376 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:10:15 localhost podman[243542]: 2025-12-06 10:10:15.6208355 +0000 UTC m=+0.148194432 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125) Dec 6 05:10:15 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:10:16 localhost openstack_network_exporter[199751]: ERROR 10:10:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:10:16 localhost openstack_network_exporter[199751]: ERROR 10:10:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:10:16 localhost openstack_network_exporter[199751]: ERROR 10:10:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:10:16 localhost openstack_network_exporter[199751]: ERROR 10:10:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:10:16 localhost openstack_network_exporter[199751]: Dec 6 05:10:16 localhost openstack_network_exporter[199751]: ERROR 10:10:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:10:16 localhost openstack_network_exporter[199751]: Dec 6 05:10:18 localhost nova_compute[237281]: 2025-12-06 10:10:18.435 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:10:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:22.991 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': 
'7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:10:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:22.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.037 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.038 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '66004bc4-e260-4303-bfa5-faa3660ddd95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:10:22.992735', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c70e6998-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.202875419, 'message_signature': '6e67f8aa87432943bdeef1ef1b6719fd81a9e135483d66720b1ea9aabc48b474'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:10:22.992735', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c70e84fa-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.202875419, 'message_signature': 'a2dacc03e1a8d1586dd1f03d0ad732838db8e2a9017324d1649e68fb7a167875'}]}, 'timestamp': '2025-12-06 10:10:23.039291', '_unique_id': 'eb8f62966448404f889acbf9a8d2a433'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.041 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.042 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.042 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.043 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': 'bd659e99-c44b-4b0c-9f16-51397605bff5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:10:23.042495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c70f1ab4-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.202875419, 'message_signature': 'a22323e00c25e43aa04ea777752dc462e6c116257471f590f41c34a050ab7a46'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:10:23.042495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c70f3530-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.202875419, 'message_signature': 'e7e621b3da45b4b020c1ff0448b222a097046c807f931abcadda145ab2ef956a'}]}, 'timestamp': '2025-12-06 10:10:23.043831', '_unique_id': '16ea66517f124ac8b2d185f4fa775db3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.045 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.047 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.056 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1cb8eb5-d599-459f-bc69-f5a30308b197', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:10:23.047469', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'c7114140-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.257634765, 'message_signature': '628eb670d687531901ce93b7fd992b5637990b1352a71df6f8a6911f38c1c63b'}]}, 'timestamp': '2025-12-06 10:10:23.057290', '_unique_id': '64c521a6480940368d2f22f1489d8351'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.058 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.059 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.059 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55dabe41-ecae-4f8c-bf1f-54e6ef6895fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:10:23.059739', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'c711bf8a-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.257634765, 'message_signature': '5a36ac331ae0a30393c0e189bdff1e4a7d98ff6e7ac38e99a1366b09f84a12e3'}]}, 'timestamp': '2025-12-06 10:10:23.060516', '_unique_id': '06fd59af08f34bb790977d9adc8b4837'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.062 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.063 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.083 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.084 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6bbcbed-5b24-4c32-8fd4-353a875d89b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:10:23.064051', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c7155b54-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.274196145, 'message_signature': '378d3a53d573136c9ce1d7174fc774a64e0863ddb469e96e20621043da77db8b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:10:23.064051', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c71570bc-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.274196145, 'message_signature': '207bb977ccb1f776ed48b1349e4db0fc910ddae6e20bd04247a5b23fd3b6f37a'}]}, 'timestamp': '2025-12-06 10:10:23.084570', '_unique_id': 'fa058e0044b14389a82510afb6485d0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.085 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.087 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '168f3ff6-b749-49a5-b613-fdab4007b8fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:10:23.087100', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'c715e538-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.257634765, 'message_signature': '4cb0a2ab655937344f54a59d71ce7c51ad4dd5565d82518c4ab7956d3c42dd0f'}]}, 'timestamp': '2025-12-06 10:10:23.087575', '_unique_id': 'eada4deff3954f5f98787112125f05b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.088 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:10:23.089 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.090 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.090 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd890498c-e9be-4b5b-8186-714a07824ba6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:10:23.090162', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'c7165c3e-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.257634765, 'message_signature': '045dd9bc709af03fb8190f088bcb34ab4eac357845adc606f39934088b59f173'}]}, 'timestamp': '2025-12-06 10:10:23.090624', '_unique_id': '7568c3b0d4bd4852920f6973b7ae21fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.092 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.113 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 15410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '49b08d8d-0835-42c6-94d6-34e942821989', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15410000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:10:23.092882', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'c719f858-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.323599646, 'message_signature': '323ac02fe55ce6d622b6733454c627ed12c8cc0875f4ea84df98e757130cf3ac'}]}, 'timestamp': '2025-12-06 10:10:23.114353', '_unique_id': 'a403ae3a4be24b3d8b2d664619c5e17d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.115 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.116 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:10:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e10780d-57ed-47ce-96cb-453b5f7c4e27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:10:23.117052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 
'message_id': 'c71a7814-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.274196145, 'message_signature': '176263fbf0eba843b21e9464d6c004bc8072f623f079b0608f5fd38b33280ff6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:10:23.117052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c71a8980-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.274196145, 'message_signature': '1508ea5b8352cb8641e74e95a5f15bdc2cb27cb52170d4069b6c4dc42ee91ec1'}]}, 'timestamp': '2025-12-06 10:10:23.117997', '_unique_id': '62449e55348a4d518d99a0a71e98ad5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.118 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:10:23.120 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf08a889-cedd-42c1-82ee-4ed0685b02fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:10:23.120188', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'c71af1b8-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.257634765, 'message_signature': 
'44df9bdf515307853fcac4cdb80147e83273e5bf7cfccfc86c172859659c1cd7'}]}, 'timestamp': '2025-12-06 10:10:23.120662', '_unique_id': '0c75cca04e4947a09d6c666427336137'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR 
oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.122 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd276a7c8-a13f-47b5-8cd1-89b6c987b37d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:10:23.122812', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'c71b590a-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.257634765, 'message_signature': '89bfa3de0a276c085a6bd7dd7ca0a1d7905ed9239461a0bacb1a673d14ce0eb4'}]}, 'timestamp': '2025-12-06 10:10:23.123306', '_unique_id': 'e1c5c78a65b54461991fbdb72e6437f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging return 
fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.124 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.125 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.125 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f3125492-fe8a-490a-83c8-7bef11a8592c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:10:23.125447', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c71bbe90-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.202875419, 'message_signature': '0fe5108a7a268a5946bba11e03cdd9adc4eb1d534d6a09f145027adeabb01a58'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:10:23.125447', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c71bd038-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.202875419, 'message_signature': '05863c0e57a53ff236d1f7f5303c17f453e80cca83ac16096eec81f741743be8'}]}, 'timestamp': '2025-12-06 10:10:23.126358', '_unique_id': '474ff4f678a244c8a5722915feddfe90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.128 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.128 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.129 12 DEBUG ceilometer.compute.pollsters [-] 
a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a73e496f-f7dc-4342-899c-1ffcc17e5719', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:10:23.128685', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c71c3e1a-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.202875419, 'message_signature': 'be08df9cb8e33a2d200ffa9562328db9a291dfe4891771bb3e1332d839bfa117'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': 
'5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:10:23.128685', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c71c4e64-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.202875419, 'message_signature': 'db93d77ae09f1fed1141e790f6da6f4c2d3e3dfb2b1bd0dbdd05dac208e6dab9'}]}, 'timestamp': '2025-12-06 10:10:23.129556', '_unique_id': '8d9f84006716414489d58d88dc7e3e87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 
12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging 
self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.131 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.132 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd874467b-73b4-44b3-9f6e-425c1b3d7608', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:10:23.131765', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c71cb73c-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.202875419, 'message_signature': 'ee93313a55a9ce8a8418a71655c651cc83eae3816ca9003957dd51f488c06775'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 
'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:10:23.131765', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c71cc772-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.202875419, 'message_signature': 'dfe36da89cd167fb188b64251bbc42525bb15a4388c7058175437fd2febf9392'}]}, 'timestamp': '2025-12-06 10:10:23.132656', '_unique_id': '3bdc48d8b1c24309b8c084e5488f48ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in 
connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR 
oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 
12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.133 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.134 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 
6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2465bfea-b58b-4675-b9eb-c1f5c022f82e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:10:23.134812', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'c71d2da2-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.257634765, 'message_signature': '78c5bdc9ae273db3b45a788efcdae33603b51547537859b035d5708a45029223'}]}, 'timestamp': '2025-12-06 10:10:23.135298', '_unique_id': 'd53669888fa64cf8b15349e2de050499'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 
05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.136 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.137 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.137 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.137 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '827bc9d8-7b05-420d-bd72-6e67c1eb2813', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:10:23.137693', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'c71da1c4-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.257634765, 'message_signature': '8347131b218bd2bf215b3845d055aa9b503f1effa43f495f20355e99c9b287a8'}]}, 'timestamp': '2025-12-06 10:10:23.138371', '_unique_id': '46658467cef0448cb87434add341b13a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.139 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:10:23.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.141 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27b71a52-760a-4a8a-bf05-f9b3b1d245c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:10:23.141144', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': 
None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'c71e264e-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.257634765, 'message_signature': '3ac37961ecd691f31f5714365d79cefe2ec362c19c7f15aa067e3c45aceb26f2'}]}, 'timestamp': '2025-12-06 10:10:23.141768', '_unique_id': 'cd57fda60fdd47ef82f2f12789168dd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.143 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.144 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.145 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '178a1f58-7bfd-4f3b-9999-3d7c6205a727', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:10:23.144479', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c71ea6d2-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.274196145, 'message_signature': '085d78995010217fe3d1e3c9cdf02c13dd12dc28c9f0799cf483f36aeb7f8dc4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:10:23.144479', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c71eba5a-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.274196145, 'message_signature': '57602224f9fe9d3de4d10bf91a5084eaccfe5f5a325e76f4ffcef77076bf84bb'}]}, 'timestamp': '2025-12-06 10:10:23.145506', '_unique_id': 'fffa7c641ff440fea37cf0244420d0e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.146 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.148 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.148 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.148 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38608615-a996-41bc-94e8-e6decc72bfc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:10:23.148434', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c71f3d72-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.202875419, 'message_signature': '164c19706e2307c806e11d4860eb170f05dbe68952978bbb6c43a6705c0799fd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:10:23.148434', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c71f477c-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.202875419, 'message_signature': 'ef36701da139dd857f76ff2649ff2e8cbefdcc62428299be9f3cb168b78585ca'}]}, 'timestamp': '2025-12-06 10:10:23.148989', '_unique_id': '0eeb1ff46d96486897a899549a003d9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.149 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.150 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.150 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '590eaa42-b929-4b22-bdee-1cd75096fd2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:10:23.150391', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'c71f89c6-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.323599646, 'message_signature': '454e0960050f07df9a942d119f7afb79930b3a58a0ff8d585c5fa78c7b2aa662'}]}, 'timestamp': '2025-12-06 10:10:23.150670', '_unique_id': 'eb9402915b3248e1aae21567975091a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.151 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06
10:10:23.152 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc229cd4-b2c6-4444-bab2-530b4694d6f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:10:23.152060', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'c71fcb3e-d28b-11f0-8fed-fa163edf398d', 'monotonic_time': 12210.257634765, 'message_signature': 
'3de643e64a85de5a0edebde07a288cdd4ce7fb46f04a5cf042f5a2d1407740ed'}]}, 'timestamp': '2025-12-06 10:10:23.152376', '_unique_id': 'c8ab04baefe34c24b6c98a154782f1a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR 
oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:10:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:10:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:10:23.152 12 ERROR oslo_messaging.notify.messaging Dec 6 05:10:23 localhost podman[197801]: time="2025-12-06T10:10:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:10:23 localhost podman[197801]: @ - - [06/Dec/2025:10:10:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:10:23 localhost podman[197801]: @ - - [06/Dec/2025:10:10:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15462 "" "Go-http-client/1.1" Dec 6 05:10:23 localhost nova_compute[237281]: 2025-12-06 10:10:23.437 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:23 localhost nova_compute[237281]: 2025-12-06 10:10:23.439 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:23 localhost nova_compute[237281]: 2025-12-06 10:10:23.439 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:10:23 localhost nova_compute[237281]: 2025-12-06 10:10:23.439 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:10:23 localhost nova_compute[237281]: 2025-12-06 10:10:23.478 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 
05:10:23 localhost nova_compute[237281]: 2025-12-06 10:10:23.478 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:10:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:10:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:10:23 localhost podman[243582]: 2025-12-06 10:10:23.599378839 +0000 UTC m=+0.092447816 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:10:23 localhost podman[243582]: 2025-12-06 10:10:23.636181342 +0000 UTC m=+0.129250369 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:10:23 localhost podman[243583]: 2025-12-06 10:10:23.654541937 +0000 UTC m=+0.147443699 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:10:23 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:10:23 localhost podman[243583]: 2025-12-06 10:10:23.665258586 +0000 UTC m=+0.158160368 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_id=edpm) Dec 6 05:10:23 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:10:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37053 DF PROTO=TCP SPT=52104 DPT=9102 SEQ=3555361610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD2668A0000000001030307) Dec 6 05:10:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37054 DF PROTO=TCP SPT=52104 DPT=9102 SEQ=3555361610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD26A870000000001030307) Dec 6 05:10:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19232 DF PROTO=TCP SPT=36172 DPT=9102 SEQ=1718763582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD26D880000000001030307) Dec 6 05:10:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37055 DF PROTO=TCP SPT=52104 DPT=9102 SEQ=3555361610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD272870000000001030307) Dec 6 05:10:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50222 DF PROTO=TCP SPT=40124 DPT=9102 SEQ=2586959832 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD275870000000001030307) Dec 6 05:10:28 localhost nova_compute[237281]: 2025-12-06 10:10:28.479 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:28 localhost nova_compute[237281]: 2025-12-06 10:10:28.480 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:10:28 localhost nova_compute[237281]: 2025-12-06 10:10:28.480 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:10:28 localhost nova_compute[237281]: 2025-12-06 10:10:28.481 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:10:28 localhost nova_compute[237281]: 2025-12-06 10:10:28.482 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:10:28 localhost nova_compute[237281]: 2025-12-06 10:10:28.484 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:10:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37056 DF PROTO=TCP SPT=52104 DPT=9102 SEQ=3555361610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD282480000000001030307) Dec 6 05:10:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:10:31 localhost podman[243617]: 2025-12-06 10:10:31.553021079 +0000 UTC m=+0.084258095 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, name=ubi9-minimal, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Dec 6 05:10:31 localhost podman[243617]: 2025-12-06 10:10:31.590361268 +0000 UTC m=+0.121598334 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, distribution-scope=public) Dec 6 05:10:31 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:10:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:10:33 localhost podman[243637]: 2025-12-06 10:10:33.251572958 +0000 UTC m=+0.088698050 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:10:33 localhost podman[243637]: 2025-12-06 10:10:33.283380578 +0000 UTC m=+0.120505610 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:10:33 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:10:33 localhost nova_compute[237281]: 2025-12-06 10:10:33.485 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:37 localhost nova_compute[237281]: 2025-12-06 10:10:37.596 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:10:37 localhost nova_compute[237281]: 2025-12-06 10:10:37.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:10:37 localhost nova_compute[237281]: 2025-12-06 10:10:37.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:10:38 localhost nova_compute[237281]: 2025-12-06 10:10:38.486 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:10:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37057 DF PROTO=TCP SPT=52104 DPT=9102 SEQ=3555361610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD2A3880000000001030307) Dec 6 05:10:39 localhost nova_compute[237281]: 2025-12-06 10:10:39.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:10:40 localhost nova_compute[237281]: 2025-12-06 10:10:40.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:10:40 localhost nova_compute[237281]: 2025-12-06 10:10:40.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:10:40 localhost nova_compute[237281]: 2025-12-06 10:10:40.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:10:41 localhost nova_compute[237281]: 2025-12-06 10:10:41.553 237285 DEBUG oslo_concurrency.lockutils 
[None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:10:41 localhost nova_compute[237281]: 2025-12-06 10:10:41.554 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:10:41 localhost nova_compute[237281]: 2025-12-06 10:10:41.554 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:10:41 localhost nova_compute[237281]: 2025-12-06 10:10:41.554 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:10:43 localhost nova_compute[237281]: 2025-12-06 10:10:43.131 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:10:43 localhost nova_compute[237281]: 2025-12-06 10:10:43.148 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:10:43 localhost nova_compute[237281]: 2025-12-06 10:10:43.148 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:10:43 localhost nova_compute[237281]: 2025-12-06 10:10:43.149 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:10:43 localhost nova_compute[237281]: 2025-12-06 10:10:43.149 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:10:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:10:43 localhost nova_compute[237281]: 2025-12-06 10:10:43.490 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:43 localhost nova_compute[237281]: 2025-12-06 10:10:43.492 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:43 localhost nova_compute[237281]: 2025-12-06 10:10:43.492 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:10:43 localhost nova_compute[237281]: 2025-12-06 10:10:43.492 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:10:43 localhost nova_compute[237281]: 2025-12-06 10:10:43.522 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:10:43 localhost nova_compute[237281]: 2025-12-06 10:10:43.523 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:10:43 localhost podman[243660]: 2025-12-06 10:10:43.54738572 +0000 UTC m=+0.079445827 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 05:10:43 localhost podman[243660]: 2025-12-06 10:10:43.585322647 +0000 UTC m=+0.117382704 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 05:10:43 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:10:44 localhost nova_compute[237281]: 2025-12-06 10:10:44.144 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:10:45 localhost nova_compute[237281]: 2025-12-06 10:10:45.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:10:46 localhost openstack_network_exporter[199751]: ERROR 10:10:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:10:46 localhost openstack_network_exporter[199751]: ERROR 10:10:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:10:46 localhost openstack_network_exporter[199751]: ERROR 10:10:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:10:46 localhost openstack_network_exporter[199751]: ERROR 10:10:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:10:46 localhost openstack_network_exporter[199751]: Dec 6 05:10:46 localhost openstack_network_exporter[199751]: ERROR 10:10:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:10:46 localhost openstack_network_exporter[199751]: Dec 6 05:10:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:10:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:10:46 localhost podman[243684]: 2025-12-06 10:10:46.55239922 +0000 UTC m=+0.081004675 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Dec 6 05:10:46 localhost podman[243684]: 2025-12-06 10:10:46.594131024 +0000 UTC m=+0.122736499 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd) Dec 6 05:10:46 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:10:46 localhost podman[243683]: 2025-12-06 10:10:46.599090847 +0000 UTC m=+0.133420998 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:10:46 localhost podman[243683]: 2025-12-06 10:10:46.679118711 +0000 UTC m=+0.213448882 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 
'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:10:46 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:10:47 localhost nova_compute[237281]: 2025-12-06 10:10:47.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:10:47 localhost nova_compute[237281]: 2025-12-06 10:10:47.907 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:10:47 localhost nova_compute[237281]: 2025-12-06 10:10:47.908 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:10:47 localhost nova_compute[237281]: 2025-12-06 10:10:47.908 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - 
- -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:10:47 localhost nova_compute[237281]: 2025-12-06 10:10:47.908 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:10:47 localhost nova_compute[237281]: 2025-12-06 10:10:47.977 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.054 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.055 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.110 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.111 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.185 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.186 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.259 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.464 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.467 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12663MB free_disk=387.310848236084GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.468 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.468 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 
10:10:48.529 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.585 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.586 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.587 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.651 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.887 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on 
inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.891 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:10:48 localhost nova_compute[237281]: 2025-12-06 10:10:48.892 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.424s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:10:53 localhost podman[197801]: time="2025-12-06T10:10:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:10:53 localhost podman[197801]: @ - - [06/Dec/2025:10:10:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:10:53 localhost podman[197801]: @ - - [06/Dec/2025:10:10:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15470 "" "Go-http-client/1.1" Dec 6 05:10:53 localhost nova_compute[237281]: 2025-12-06 10:10:53.531 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:53 
localhost nova_compute[237281]: 2025-12-06 10:10:53.533 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:10:53 localhost nova_compute[237281]: 2025-12-06 10:10:53.533 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:10:53 localhost nova_compute[237281]: 2025-12-06 10:10:53.534 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:10:53 localhost nova_compute[237281]: 2025-12-06 10:10:53.565 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:10:53 localhost nova_compute[237281]: 2025-12-06 10:10:53.566 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:10:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12942 DF PROTO=TCP SPT=58870 DPT=9102 SEQ=939677460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD2DBB90000000001030307) Dec 6 05:10:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:10:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:10:54 localhost podman[243737]: 2025-12-06 10:10:54.558791164 +0000 UTC m=+0.091728934 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:10:54 localhost podman[243737]: 2025-12-06 10:10:54.590280564 +0000 UTC m=+0.123218304 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:10:54 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:10:54 localhost podman[243738]: 2025-12-06 10:10:54.610823926 +0000 UTC m=+0.138759992 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Dec 6 05:10:54 localhost podman[243738]: 2025-12-06 10:10:54.625344923 +0000 UTC m=+0.153280999 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=edpm) Dec 6 05:10:54 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:10:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12943 DF PROTO=TCP SPT=58870 DPT=9102 SEQ=939677460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD2DFC70000000001030307) Dec 6 05:10:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37058 DF PROTO=TCP SPT=52104 DPT=9102 SEQ=3555361610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD2E3880000000001030307) Dec 6 05:10:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12944 DF PROTO=TCP SPT=58870 DPT=9102 SEQ=939677460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD2E7C70000000001030307) Dec 6 05:10:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=19233 DF PROTO=TCP SPT=36172 DPT=9102 SEQ=1718763582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD2EB870000000001030307) Dec 6 05:10:58 localhost nova_compute[237281]: 2025-12-06 10:10:58.566 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:10:58 localhost sshd[243774]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:11:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12945 DF PROTO=TCP SPT=58870 DPT=9102 SEQ=939677460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD2F7870000000001030307) Dec 6 05:11:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:11:02 localhost podman[243776]: 2025-12-06 10:11:02.551009192 +0000 UTC m=+0.087721870 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, release=1755695350, version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public) Dec 6 05:11:02 localhost podman[243776]: 2025-12-06 10:11:02.567325265 +0000 UTC m=+0.104037933 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, distribution-scope=public, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9) Dec 6 05:11:02 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:11:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:11:03 localhost podman[243796]: 2025-12-06 10:11:03.560930687 +0000 UTC m=+0.089241398 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:11:03 localhost nova_compute[237281]: 2025-12-06 10:11:03.568 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:03 localhost 
nova_compute[237281]: 2025-12-06 10:11:03.569 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:03 localhost nova_compute[237281]: 2025-12-06 10:11:03.569 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:11:03 localhost nova_compute[237281]: 2025-12-06 10:11:03.569 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:03 localhost podman[243796]: 2025-12-06 10:11:03.571158681 +0000 UTC m=+0.099469352 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck 
node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:11:03 localhost nova_compute[237281]: 2025-12-06 10:11:03.590 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:11:03 localhost nova_compute[237281]: 2025-12-06 10:11:03.590 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:03 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:11:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:11:06.694 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:11:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:11:06.695 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:11:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:11:06.696 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:11:08 localhost 
nova_compute[237281]: 2025-12-06 10:11:08.592 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:08 localhost nova_compute[237281]: 2025-12-06 10:11:08.593 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:08 localhost nova_compute[237281]: 2025-12-06 10:11:08.593 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:11:08 localhost nova_compute[237281]: 2025-12-06 10:11:08.594 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:08 localhost nova_compute[237281]: 2025-12-06 10:11:08.594 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:08 localhost nova_compute[237281]: 2025-12-06 10:11:08.596 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12946 DF PROTO=TCP SPT=58870 DPT=9102 SEQ=939677460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD317870000000001030307) Dec 6 05:11:13 localhost nova_compute[237281]: 2025-12-06 10:11:13.596 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:11:14 localhost podman[243819]: 2025-12-06 10:11:14.551680583 +0000 UTC m=+0.082328305 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 6 05:11:14 localhost podman[243819]: 2025-12-06 10:11:14.615288451 +0000 UTC m=+0.145936143 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 05:11:14 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:11:16 localhost openstack_network_exporter[199751]: ERROR 10:11:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:11:16 localhost openstack_network_exporter[199751]: ERROR 10:11:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:11:16 localhost openstack_network_exporter[199751]: ERROR 10:11:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:11:16 localhost openstack_network_exporter[199751]: ERROR 10:11:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:11:16 localhost openstack_network_exporter[199751]: Dec 6 05:11:16 localhost openstack_network_exporter[199751]: ERROR 10:11:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:11:16 localhost openstack_network_exporter[199751]: Dec 6 05:11:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:11:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:11:17 localhost podman[243844]: 2025-12-06 10:11:17.555789644 +0000 UTC m=+0.078193407 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:11:17 localhost podman[243844]: 2025-12-06 10:11:17.561248592 +0000 UTC m=+0.083652375 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 
'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:11:17 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:11:17 localhost podman[243845]: 2025-12-06 10:11:17.614434029 +0000 UTC m=+0.133883181 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible) Dec 6 05:11:17 localhost podman[243845]: 2025-12-06 10:11:17.656256087 +0000 UTC m=+0.175705279 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:11:17 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:11:18 localhost nova_compute[237281]: 2025-12-06 10:11:18.599 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:18 localhost nova_compute[237281]: 2025-12-06 10:11:18.601 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:18 localhost nova_compute[237281]: 2025-12-06 10:11:18.601 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:11:18 localhost nova_compute[237281]: 2025-12-06 10:11:18.601 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:18 localhost nova_compute[237281]: 2025-12-06 10:11:18.643 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:11:18 localhost nova_compute[237281]: 2025-12-06 10:11:18.643 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:23 localhost 
podman[197801]: time="2025-12-06T10:11:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:11:23 localhost podman[197801]: @ - - [06/Dec/2025:10:11:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:11:23 localhost podman[197801]: @ - - [06/Dec/2025:10:11:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15471 "" "Go-http-client/1.1" Dec 6 05:11:23 localhost nova_compute[237281]: 2025-12-06 10:11:23.644 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:23 localhost nova_compute[237281]: 2025-12-06 10:11:23.646 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:23 localhost nova_compute[237281]: 2025-12-06 10:11:23.647 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:11:23 localhost nova_compute[237281]: 2025-12-06 10:11:23.647 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:23 localhost nova_compute[237281]: 2025-12-06 10:11:23.681 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:11:23 localhost nova_compute[237281]: 2025-12-06 10:11:23.683 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3314 DF PROTO=TCP SPT=41142 DPT=9102 SEQ=3855813475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD350EA0000000001030307) Dec 6 05:11:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3315 DF PROTO=TCP SPT=41142 DPT=9102 SEQ=3855813475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD355080000000001030307) Dec 6 05:11:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:11:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:11:25 localhost podman[243885]: 2025-12-06 10:11:25.552479053 +0000 UTC m=+0.080145158 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:11:25 localhost podman[243885]: 2025-12-06 10:11:25.561304205 +0000 UTC m=+0.088970360 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 05:11:25 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:11:25 localhost podman[243886]: 2025-12-06 10:11:25.612747629 +0000 UTC m=+0.136114441 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team) Dec 6 05:11:25 localhost podman[243886]: 2025-12-06 10:11:25.622762256 +0000 UTC m=+0.146129048 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Dec 6 05:11:25 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:11:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12947 DF PROTO=TCP SPT=58870 DPT=9102 SEQ=939677460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD357870000000001030307) Dec 6 05:11:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3316 DF PROTO=TCP SPT=41142 DPT=9102 SEQ=3855813475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD35D080000000001030307) Dec 6 05:11:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37059 DF PROTO=TCP SPT=52104 DPT=9102 SEQ=3555361610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD361880000000001030307) Dec 6 05:11:28 localhost nova_compute[237281]: 2025-12-06 10:11:28.683 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:28 localhost nova_compute[237281]: 2025-12-06 10:11:28.685 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3317 DF PROTO=TCP SPT=41142 DPT=9102 SEQ=3855813475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1DD36CC70000000001030307) Dec 6 05:11:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:11:33 localhost systemd[1]: tmp-crun.nsDDyY.mount: Deactivated successfully. Dec 6 05:11:33 localhost podman[243923]: 2025-12-06 10:11:33.232477591 +0000 UTC m=+0.067605682 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41) Dec 6 05:11:33 localhost podman[243923]: 2025-12-06 10:11:33.242481839 +0000 UTC m=+0.077609940 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, vcs-type=git, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc.) Dec 6 05:11:33 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:11:33 localhost nova_compute[237281]: 2025-12-06 10:11:33.685 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:33 localhost nova_compute[237281]: 2025-12-06 10:11:33.687 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:11:33 localhost nova_compute[237281]: 2025-12-06 10:11:33.687 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:11:33 localhost nova_compute[237281]: 2025-12-06 10:11:33.687 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:33 localhost nova_compute[237281]: 2025-12-06 10:11:33.688 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:33 localhost nova_compute[237281]: 2025-12-06 10:11:33.690 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:11:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:11:34 localhost podman[243944]: 2025-12-06 10:11:34.545225816 +0000 UTC m=+0.081897452 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:11:34 localhost podman[243944]: 2025-12-06 10:11:34.551180899 +0000 UTC m=+0.087852515 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:11:34 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:11:34 localhost nova_compute[237281]: 2025-12-06 10:11:34.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:37 localhost nova_compute[237281]: 2025-12-06 10:11:37.900 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:38 localhost nova_compute[237281]: 2025-12-06 10:11:38.689 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:11:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3318 DF PROTO=TCP SPT=41142 DPT=9102 SEQ=3855813475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD38D870000000001030307) Dec 6 05:11:39 localhost nova_compute[237281]: 2025-12-06 10:11:39.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:39 localhost nova_compute[237281]: 2025-12-06 10:11:39.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:39 localhost nova_compute[237281]: 2025-12-06 10:11:39.887 237285 DEBUG nova.compute.manager [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:11:40 localhost nova_compute[237281]: 2025-12-06 10:11:40.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:40 localhost nova_compute[237281]: 2025-12-06 10:11:40.888 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:11:40 localhost nova_compute[237281]: 2025-12-06 10:11:40.888 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:11:41 localhost nova_compute[237281]: 2025-12-06 10:11:41.565 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:11:41 localhost nova_compute[237281]: 2025-12-06 10:11:41.566 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:11:41 localhost nova_compute[237281]: 2025-12-06 10:11:41.566 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing 
network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:11:41 localhost nova_compute[237281]: 2025-12-06 10:11:41.567 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:11:43 localhost nova_compute[237281]: 2025-12-06 10:11:43.693 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:43 localhost nova_compute[237281]: 2025-12-06 10:11:43.695 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:43 localhost nova_compute[237281]: 2025-12-06 10:11:43.696 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:11:43 localhost nova_compute[237281]: 2025-12-06 10:11:43.696 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:43 localhost nova_compute[237281]: 2025-12-06 10:11:43.713 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:11:43 localhost nova_compute[237281]: 2025-12-06 10:11:43.713 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:44 localhost nova_compute[237281]: 2025-12-06 10:11:44.022 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] 
Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:11:44 localhost nova_compute[237281]: 2025-12-06 10:11:44.037 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:11:44 localhost nova_compute[237281]: 2025-12-06 10:11:44.037 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:11:44 localhost nova_compute[237281]: 2025-12-06 10:11:44.037 237285 
DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:44 localhost nova_compute[237281]: 2025-12-06 10:11:44.038 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:45 localhost nova_compute[237281]: 2025-12-06 10:11:45.031 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:11:45 localhost podman[243966]: 2025-12-06 10:11:45.540790873 +0000 UTC m=+0.068849289 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller) Dec 6 05:11:45 localhost podman[243966]: 2025-12-06 10:11:45.681171014 +0000 UTC m=+0.209229550 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:11:45 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:11:45 localhost nova_compute[237281]: 2025-12-06 10:11:45.880 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:45 localhost nova_compute[237281]: 2025-12-06 10:11:45.901 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:45 localhost nova_compute[237281]: 2025-12-06 10:11:45.901 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 6 05:11:45 localhost nova_compute[237281]: 2025-12-06 10:11:45.923 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 6 05:11:46 localhost openstack_network_exporter[199751]: ERROR 10:11:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:11:46 localhost openstack_network_exporter[199751]: ERROR 10:11:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:11:46 localhost openstack_network_exporter[199751]: ERROR 10:11:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:11:46 localhost openstack_network_exporter[199751]: ERROR 10:11:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:11:46 localhost openstack_network_exporter[199751]: Dec 6 05:11:46 localhost 
openstack_network_exporter[199751]: ERROR 10:11:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:11:46 localhost openstack_network_exporter[199751]: Dec 6 05:11:47 localhost nova_compute[237281]: 2025-12-06 10:11:47.908 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:47 localhost nova_compute[237281]: 2025-12-06 10:11:47.910 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:47 localhost nova_compute[237281]: 2025-12-06 10:11:47.932 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:11:47 localhost nova_compute[237281]: 2025-12-06 10:11:47.933 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:11:47 localhost nova_compute[237281]: 2025-12-06 10:11:47.934 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:11:47 localhost 
nova_compute[237281]: 2025-12-06 10:11:47.934 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.005 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.080 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.083 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.163 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD 
"/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.165 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.243 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.245 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.307 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:11:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:11:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.546 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.547 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12654MB free_disk=387.30889892578125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.547 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.548 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:11:48 localhost podman[244002]: 2025-12-06 
10:11:48.554201232 +0000 UTC m=+0.079571921 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:11:48 localhost podman[244002]: 2025-12-06 10:11:48.562151187 +0000 UTC m=+0.087521876 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:11:48 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:11:48 localhost podman[244003]: 2025-12-06 10:11:48.610110653 +0000 UTC m=+0.136081510 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:11:48 localhost podman[244003]: 2025-12-06 10:11:48.624141565 +0000 UTC m=+0.150112352 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 05:11:48 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.741 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.741 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.742 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.742 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.742 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.744 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.778 237285 
DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.779 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.779 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:11:48 localhost nova_compute[237281]: 2025-12-06 10:11:48.890 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing inventories for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 6 05:11:49 localhost nova_compute[237281]: 2025-12-06 10:11:49.022 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating ProviderTree inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 6 05:11:49 localhost nova_compute[237281]: 2025-12-06 10:11:49.023 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 05:11:49 localhost nova_compute[237281]: 2025-12-06 10:11:49.055 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing aggregate associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 6 05:11:49 localhost nova_compute[237281]: 2025-12-06 10:11:49.091 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing trait associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, traits: 
COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_ABM,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 6 05:11:49 localhost nova_compute[237281]: 2025-12-06 10:11:49.153 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:11:49 localhost nova_compute[237281]: 2025-12-06 10:11:49.179 237285 DEBUG nova.scheduler.client.report [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:11:49 localhost nova_compute[237281]: 2025-12-06 10:11:49.181 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:11:49 localhost nova_compute[237281]: 2025-12-06 10:11:49.182 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:11:49 localhost nova_compute[237281]: 2025-12-06 10:11:49.183 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:11:49 localhost nova_compute[237281]: 2025-12-06 10:11:49.183 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 6 05:11:53 localhost podman[197801]: time="2025-12-06T10:11:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:11:53 localhost podman[197801]: @ - - [06/Dec/2025:10:11:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:11:53 localhost podman[197801]: @ - - [06/Dec/2025:10:11:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15472 "" "Go-http-client/1.1" Dec 6 05:11:53 localhost nova_compute[237281]: 2025-12-06 10:11:53.744 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:53 localhost nova_compute[237281]: 2025-12-06 10:11:53.746 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:11:53 localhost nova_compute[237281]: 2025-12-06 10:11:53.746 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:11:53 localhost nova_compute[237281]: 2025-12-06 10:11:53.747 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:53 localhost nova_compute[237281]: 2025-12-06 10:11:53.787 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:11:53 localhost nova_compute[237281]: 2025-12-06 10:11:53.788 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:11:53 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34457 DF PROTO=TCP SPT=40126 DPT=9102 SEQ=3508394433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD3C6190000000001030307) Dec 6 05:11:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34458 DF PROTO=TCP SPT=40126 DPT=9102 SEQ=3508394433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD3CA080000000001030307) Dec 6 05:11:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3319 DF PROTO=TCP SPT=41142 DPT=9102 SEQ=3855813475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD3CD880000000001030307) Dec 6 05:11:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:11:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:11:56 localhost systemd[1]: tmp-crun.mbbKeJ.mount: Deactivated successfully. 
Dec 6 05:11:56 localhost podman[244042]: 2025-12-06 10:11:56.555558602 +0000 UTC m=+0.087736762 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 05:11:56 localhost podman[244042]: 2025-12-06 10:11:56.584170862 +0000 UTC m=+0.116348972 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:11:56 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:11:56 localhost podman[244043]: 2025-12-06 10:11:56.601884058 +0000 UTC m=+0.130617432 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:11:56 localhost podman[244043]: 2025-12-06 10:11:56.615208417 +0000 UTC m=+0.143941791 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:11:56 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:11:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34459 DF PROTO=TCP SPT=40126 DPT=9102 SEQ=3508394433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD3D2070000000001030307) Dec 6 05:11:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12948 DF PROTO=TCP SPT=58870 DPT=9102 SEQ=939677460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD3D5880000000001030307) Dec 6 05:11:58 localhost nova_compute[237281]: 2025-12-06 10:11:58.788 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:11:58 localhost nova_compute[237281]: 2025-12-06 10:11:58.791 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34460 DF PROTO=TCP SPT=40126 DPT=9102 SEQ=3508394433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD3E1C80000000001030307) Dec 6 05:12:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:12:03 localhost podman[244077]: 2025-12-06 10:12:03.542664393 +0000 UTC m=+0.080138996 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., release=1755695350) Dec 6 05:12:03 localhost podman[244077]: 2025-12-06 10:12:03.556783158 +0000 UTC m=+0.094257721 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc.) Dec 6 05:12:03 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:12:03 localhost nova_compute[237281]: 2025-12-06 10:12:03.793 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:03 localhost nova_compute[237281]: 2025-12-06 10:12:03.794 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:03 localhost nova_compute[237281]: 2025-12-06 10:12:03.794 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:12:03 localhost nova_compute[237281]: 2025-12-06 10:12:03.794 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:03 localhost nova_compute[237281]: 2025-12-06 10:12:03.795 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:03 localhost nova_compute[237281]: 2025-12-06 10:12:03.797 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:12:05 localhost podman[244098]: 2025-12-06 10:12:05.540557085 +0000 UTC m=+0.077771204 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:12:05 localhost podman[244098]: 2025-12-06 10:12:05.576331577 +0000 UTC m=+0.113545676 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:12:05 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:12:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:12:06.696 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:12:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:12:06.696 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:12:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:12:06.697 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:12:08 localhost nova_compute[237281]: 2025-12-06 10:12:08.799 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34461 DF PROTO=TCP SPT=40126 DPT=9102 SEQ=3508394433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD401870000000001030307) Dec 6 05:12:13 localhost nova_compute[237281]: 2025-12-06 10:12:13.801 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:13 localhost nova_compute[237281]: 2025-12-06 10:12:13.804 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:13 localhost 
nova_compute[237281]: 2025-12-06 10:12:13.804 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:12:13 localhost nova_compute[237281]: 2025-12-06 10:12:13.804 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:13 localhost nova_compute[237281]: 2025-12-06 10:12:13.843 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:13 localhost nova_compute[237281]: 2025-12-06 10:12:13.844 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:16 localhost openstack_network_exporter[199751]: ERROR 10:12:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:12:16 localhost openstack_network_exporter[199751]: ERROR 10:12:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:12:16 localhost openstack_network_exporter[199751]: ERROR 10:12:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:12:16 localhost openstack_network_exporter[199751]: ERROR 10:12:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:12:16 localhost openstack_network_exporter[199751]: Dec 6 05:12:16 localhost openstack_network_exporter[199751]: ERROR 10:12:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:12:16 localhost openstack_network_exporter[199751]: Dec 6 05:12:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:12:16 localhost podman[244121]: 2025-12-06 10:12:16.543614283 +0000 UTC m=+0.075884797 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true) Dec 6 05:12:16 localhost podman[244121]: 2025-12-06 10:12:16.58218596 +0000 UTC m=+0.114456484 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:12:16 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:12:18 localhost nova_compute[237281]: 2025-12-06 10:12:18.413 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:12:18 localhost nova_compute[237281]: 2025-12-06 10:12:18.434 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Triggering sync for uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Dec 6 05:12:18 localhost nova_compute[237281]: 2025-12-06 10:12:18.435 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:12:18 localhost nova_compute[237281]: 2025-12-06 10:12:18.435 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:12:18 localhost nova_compute[237281]: 2025-12-06 10:12:18.458 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:12:18 localhost nova_compute[237281]: 2025-12-06 10:12:18.844 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:18 localhost nova_compute[237281]: 2025-12-06 10:12:18.846 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:18 localhost nova_compute[237281]: 2025-12-06 10:12:18.847 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:12:18 localhost nova_compute[237281]: 2025-12-06 10:12:18.847 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:18 localhost nova_compute[237281]: 2025-12-06 10:12:18.880 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:18 localhost nova_compute[237281]: 2025-12-06 10:12:18.881 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:12:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:12:19 localhost podman[244147]: 2025-12-06 10:12:19.52483707 +0000 UTC m=+0.063120213 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:12:19 localhost podman[244147]: 2025-12-06 10:12:19.537362856 +0000 UTC m=+0.075646039 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125) Dec 6 05:12:19 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:12:19 localhost systemd[1]: tmp-crun.Tg6qhv.mount: Deactivated successfully. 
Dec 6 05:12:19 localhost podman[244146]: 2025-12-06 10:12:19.583402873 +0000 UTC m=+0.122659586 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:12:19 localhost podman[244146]: 2025-12-06 10:12:19.612357985 +0000 UTC m=+0.151614688 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:12:19 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.990 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.991 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.994 12 DEBUG ceilometer.compute.pollsters [-] 
a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51f1fb88-874f-42ab-bdd5-98db5800c09e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:12:22.991360', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0e8e67fa-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.201479936, 'message_signature': '80dc91f425dac1cda250d509ad13dbfb66f5688369856edbe948a9b188376a6a'}]}, 'timestamp': '2025-12-06 
10:12:22.995681', '_unique_id': 'f2d5683f458d4c3bbc75ef96a062b649'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 
6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:12:22.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.997 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.998 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:22.999 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94babf25-749a-4c06-91c6-30f7829408ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:12:22.999105', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0e8f0764-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.201479936, 'message_signature': '8301241d75765a18df834c2fd4864bfe0b6ee548a239baf2a29c927403af84b9'}]}, 'timestamp': '2025-12-06 10:12:22.999719', '_unique_id': '712cdda15d1d48ff972e1a0d45c4a086'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.000 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.002 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.046 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.047 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c0ce060-4a5c-4bb9-ba68-0a4ba186d0dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:12:23.002658', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0e965276-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.212775404, 'message_signature': '75187962cf99d3f66e8fffdad353cb8dc324e58fbf962b5caec123c215a6fdad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:12:23.002658', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0e967242-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.212775404, 'message_signature': '715fbd307acc840e83f89445f17eddfe7aaac9e8b5e5882f560c09b403e6122c'}]}, 'timestamp': '2025-12-06 10:12:23.048280', '_unique_id': '9c532192d0494873b155179ee87ff6e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.049 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.051 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.051 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.051 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '632017b1-6068-454a-8795-523b277c5d1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:12:23.051747', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0e971134-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.201479936, 'message_signature': 'eaeec39a73531516ff44a1bb87ffde96399c40255c5bbbf21d4d3388570368e2'}]}, 'timestamp': '2025-12-06 10:12:23.052368', '_unique_id': '31e491586e6f4d14b7fa32ba3baf1417'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6
05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.053 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:12:23.055 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.055 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.055 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.056 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6cb5724b-65ca-48b9-9703-fbe0fe325e5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:12:23.055605', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0e97a702-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.212775404, 'message_signature': 'd33bb97b3155154e0558f7ceb03a851d7416a7d7f97bccd6b64cfd1919334031'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:12:23.055605', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0e97bb20-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.212775404, 'message_signature': '5ae482c26385a999763d980b73aadc981fedf483cab04b6fc8898892183f732a'}]}, 'timestamp': '2025-12-06 10:12:23.056683', '_unique_id': 'c344ccabc74a450fb6f991d1302370e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.057 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.059 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.076 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.077 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '42924440-a411-4424-9203-ab7fee636199', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:12:23.059616', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0e9ae1a6-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.269752407, 'message_signature': 'f728f96ee95ef71f7bd57b08c64616568d1e8078dd21f6bfc6822eb37e70dbfa'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:12:23.059616', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0e9af97a-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.269752407, 'message_signature': '78b042214bed500a55dfb434e754d883bd4c9182aa086ef1b1fb485d2ef17ed6'}]}, 'timestamp': '2025-12-06 10:12:23.078043', '_unique_id': '06bf20ec50b7445f8adb23da2b380d9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.079 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.080 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.081 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.081 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': 'c5154510-2c07-402f-8ac5-f1441bf9d66e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:12:23.081036', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0e9b894e-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.269752407, 'message_signature': 'f052c3b8068ce712bb85b2de8cfd80f63070ece5eb2620cf6ada019f79c5ffb8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:12:23.081036', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0e9ba3de-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.269752407, 'message_signature': '5a9133d4af88f23d75d88c00bf65ce6578ef6c5f8bdb7dcbcf49ec2676357a48'}]}, 'timestamp': '2025-12-06 10:12:23.082493', '_unique_id': 'ff49c388e3e24194a34ec0636a74f664'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.084 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.085 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.086 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.086 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
notification to notifications. Payload={'message_id': 'eb8da7dd-ec04-4993-8a73-2fec48bd7b5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:12:23.085989', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0e9c4aaa-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.269752407, 'message_signature': 'be78c77e276f94c7faef256de0c6a9e55b3e9808a727b12ebfc5376f6f9da610'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:12:23.085989', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0e9c6576-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.269752407, 'message_signature': '3039569b9fb19ce1680c25059ef0242e0e38048a756e33e182162b38e2d38611'}]}, 'timestamp': '2025-12-06 10:12:23.087365', '_unique_id': '09aa9976df844371b705a8309afd15b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.088 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.090 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.090 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.091 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': 'bc053180-359f-41bf-ad40-861b2334c1f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:12:23.090726', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0e9d0576-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.212775404, 'message_signature': '96dce7c8f8257364e75c565ff8d695e3120d75d9f96c2d8bf84e1fed06f55872'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:12:23.090726', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0e9d1fd4-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.212775404, 'message_signature': '60a28f9d03381db7a631e310d83cc65dab5628364d5f1d1e2fb0d67e2471c6e7'}]}, 'timestamp': '2025-12-06 10:12:23.092170', '_unique_id': '0a7bdb15457d473b81a952c2cdfb3af7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.093 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.095 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.113 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '552a0b1c-84c2-498c-b465-2252ab196dd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:12:23.095775', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '0ea0826e-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.323597424, 'message_signature': 'f9fe1b4570dba3715b489ed57a6310395c395d9be92106b965c170ac4d93682a'}]}, 'timestamp': '2025-12-06 10:12:23.114296', '_unique_id': 'd514fe7dea1d48b29d181ee47fcb1501'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors 
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging 
conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.115 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.116 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:12:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1df8d905-feb7-408a-97da-b3f66298a28c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:12:23.116977', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 
'disk_name': 'vda'}, 'message_id': '0ea10568-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.212775404, 'message_signature': '4f60d1028e0954a2abd46f7afaed53075a44befbc96a26408987d68a3b3c0bd0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:12:23.116977', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0ea11ff8-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.212775404, 'message_signature': '23a75718e0a3090ba38bbbb67287bfa41e26a40482adfab21b611c7ab346df15'}]}, 'timestamp': '2025-12-06 10:12:23.118350', '_unique_id': 'f48676f96af545a1a80a296d8ea43b48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.121 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.121 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb59342a-1854-43c8-903a-c55d649d5ecb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:12:23.121760', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0ea1c228-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.201479936, 'message_signature': 'f9c42b9c90a6e12621d3ed9efd7f69e2a9c3875ff93370c764291ec652bf6756'}]}, 'timestamp': '2025-12-06 10:12:23.122548', '_unique_id': 'f7d1cd128a08425386b2b261ab45e19a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.124 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.125 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e342c5c9-e27a-4ad4-8a4e-59c3e0bd419a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:12:23.125923', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0ea26886-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.201479936, 'message_signature': '65ea8406fdd288fae1f1815a4c23faa368dc9a27b7d8cef8f9c8d5cb76feefbf'}]}, 'timestamp': '2025-12-06 10:12:23.126801', '_unique_id': '564a79d8dfaf4028a273e1778b683955'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.128 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.130 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72b630f8-bb5b-4fda-8dcd-ac0dc422b9f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:12:23.130492', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0ea3163c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.201479936, 'message_signature': 'a31585c119cb9938d37a6209004574b60c629ba54afef4031ef847bab636937d'}]}, 'timestamp': '2025-12-06 10:12:23.131244', '_unique_id': 'd95aaf23ea7a4edeb497a4eee26cd1b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06
10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.134 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9610f9e6-a473-4a39-8dbb-f2d3191b57e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:12:23.134560', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0ea3b614-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.201479936, 'message_signature': '7be01df8f2940b43d6946bb870339730c54af8929b8ee8562eebe10a85ebbf28'}]}, 'timestamp': '2025-12-06 10:12:23.135356', '_unique_id': '1524b1358f764a69995a8f566ad1479f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.136 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:12:23.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.137 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b335eeb-2d9c-4d85-85a6-4939cf563835', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:12:23.137838', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0ea42f86-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.201479936, 'message_signature': 'd5064e07447e57c571f0105f4df534b1f901704981df53a7cbef1c9a43cbd412'}]}, 'timestamp': '2025-12-06 10:12:23.138317', '_unique_id': 'd98e4a793e144c81959a77ef8cd1ca5c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.139 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.140 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.141 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '267a9a6d-9183-4af6-81dc-8d31089d74c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:12:23.140583', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0ea4999e-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.212775404, 'message_signature': '295195cfe178c0b97c9ac4fd00a0165353882b59fa28d8504aa16ca0f5ba51d4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:12:23.140583', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0ea4aa24-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.212775404, 'message_signature': '077e80002a94bf569c86fe220f99cd948f9fccb83fac39732fb2fbf220cbf0df'}]}, 'timestamp': '2025-12-06 10:12:23.141424', '_unique_id': '58b4a4bfb0c2449eb2b016b9d77ffed3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.142 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.143 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1c5d3668-b52d-4e0f-b28b-59c63cd3dfc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:12:23.143667', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0ea512e8-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.201479936, 'message_signature': '2630c26eb40e8587716245c3251059073d764952689973d4d222eab310adaeed'}]}, 'timestamp': '2025-12-06 10:12:23.144130', '_unique_id': '9ca53f7f60ba4f2c9133ccc192a8871d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.144 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:12:23.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.146 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '999e8ba5-6c78-4c84-a62d-a635f75dad52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:12:23.146347', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': 
None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0ea57a80-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.201479936, 'message_signature': 'd5f34ab1b93a55c93cb33d978908d9b7b66673f6ceb821d6bc20b5012885f596'}]}, 'timestamp': '2025-12-06 10:12:23.146781', '_unique_id': 'b6c58e28b66f43b9b7e648c2779053eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.147 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.148 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.148 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.149 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '335f0f47-c565-45de-b056-66dd7fcf1956', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:12:23.148752', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0ea5d976-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.212775404, 'message_signature': 'fa5f748eede7a5c617eb3d7f3616086f46b11ec0d7ae5b72f1a3ed7efdab93f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:12:23.148752', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0ea5e8f8-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.212775404, 'message_signature': 'e2cfe291e4df24855316af0940af92159c90ebd671889f94f674968f2f53dd64'}]}, 'timestamp': '2025-12-06 10:12:23.149590', '_unique_id': 'c9f4de317ee14fdb98d311aa97bf13b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.150 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.151 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.151 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 16060000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ede2216a-1878-44f9-be2b-ac587b267b95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16060000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:12:23.151603', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '0ea647da-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12330.323597424, 'message_signature': '6d80e1ec817a94b63cf912907472c5dc16f936768386cd0e9980869cb34dda0e'}]}, 'timestamp': '2025-12-06 10:12:23.152054', '_unique_id': '2b531ed4cda34b078ef79b5d1940200f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:12:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.152 12 ERROR oslo_messaging.notify.messaging Dec 6 05:12:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:12:23.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify 
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:12:23 localhost podman[197801]: time="2025-12-06T10:12:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:12:23 localhost podman[197801]: @ - - [06/Dec/2025:10:12:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:12:23 localhost podman[197801]: @ - - [06/Dec/2025:10:12:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15470 "" "Go-http-client/1.1" Dec 6 05:12:23 localhost nova_compute[237281]: 2025-12-06 10:12:23.882 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:23 localhost nova_compute[237281]: 2025-12-06 10:12:23.884 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:23 localhost nova_compute[237281]: 2025-12-06 10:12:23.884 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:12:23 localhost nova_compute[237281]: 2025-12-06 10:12:23.884 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:23 localhost nova_compute[237281]: 2025-12-06 10:12:23.913 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:23 localhost nova_compute[237281]: 2025-12-06 10:12:23.914 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:23 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=585 DF PROTO=TCP SPT=54974 DPT=9102 SEQ=2130583086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD43B490000000001030307) Dec 6 05:12:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=586 DF PROTO=TCP SPT=54974 DPT=9102 SEQ=2130583086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD43F470000000001030307) Dec 6 05:12:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34462 DF PROTO=TCP SPT=40126 DPT=9102 SEQ=3508394433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD441870000000001030307) Dec 6 05:12:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=587 DF PROTO=TCP SPT=54974 DPT=9102 SEQ=2130583086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD447470000000001030307) Dec 6 05:12:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:12:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:12:27 localhost systemd[1]: tmp-crun.Ard3eO.mount: Deactivated successfully. 
Dec 6 05:12:27 localhost podman[244188]: 2025-12-06 10:12:27.178927242 +0000 UTC m=+0.085960377 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:12:27 localhost podman[244188]: 2025-12-06 10:12:27.188410823 +0000 UTC m=+0.095443958 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:12:27 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:12:27 localhost podman[244189]: 2025-12-06 10:12:27.240627881 +0000 UTC m=+0.140505567 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3) Dec 6 05:12:27 localhost podman[244189]: 2025-12-06 10:12:27.253266249 +0000 UTC m=+0.153143885 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125) Dec 6 05:12:27 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:12:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3320 DF PROTO=TCP SPT=41142 DPT=9102 SEQ=3855813475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD44B870000000001030307) Dec 6 05:12:28 localhost nova_compute[237281]: 2025-12-06 10:12:28.914 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:28 localhost nova_compute[237281]: 2025-12-06 10:12:28.916 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:28 localhost nova_compute[237281]: 2025-12-06 10:12:28.916 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:12:28 localhost nova_compute[237281]: 2025-12-06 10:12:28.917 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:28 localhost nova_compute[237281]: 2025-12-06 10:12:28.917 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:28 localhost nova_compute[237281]: 2025-12-06 10:12:28.918 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=588 DF PROTO=TCP SPT=54974 DPT=9102 SEQ=2130583086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD457080000000001030307) Dec 6 05:12:33 localhost nova_compute[237281]: 2025-12-06 10:12:33.919 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:33 localhost nova_compute[237281]: 2025-12-06 10:12:33.921 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:33 localhost nova_compute[237281]: 2025-12-06 10:12:33.921 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:12:33 localhost nova_compute[237281]: 2025-12-06 10:12:33.922 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:33 localhost nova_compute[237281]: 2025-12-06 10:12:33.951 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:33 
localhost nova_compute[237281]: 2025-12-06 10:12:33.952 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:12:34 localhost podman[244226]: 2025-12-06 10:12:34.542065757 +0000 UTC m=+0.078304711 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9) Dec 6 05:12:34 localhost podman[244226]: 2025-12-06 10:12:34.557135721 +0000 UTC m=+0.093374675 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7) Dec 6 05:12:34 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated 
successfully. Dec 6 05:12:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:12:36 localhost podman[244244]: 2025-12-06 10:12:36.561829333 +0000 UTC m=+0.091442215 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:12:36 localhost podman[244244]: 2025-12-06 10:12:36.572331477 +0000 UTC 
m=+0.101944389 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:12:36 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:12:37 localhost nova_compute[237281]: 2025-12-06 10:12:37.907 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:12:38 localhost nova_compute[237281]: 2025-12-06 10:12:38.953 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=589 DF PROTO=TCP SPT=54974 DPT=9102 SEQ=2130583086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD477870000000001030307) Dec 6 05:12:39 localhost nova_compute[237281]: 2025-12-06 10:12:39.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:12:41 localhost nova_compute[237281]: 2025-12-06 10:12:41.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:12:41 localhost nova_compute[237281]: 2025-12-06 10:12:41.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:12:42 localhost nova_compute[237281]: 2025-12-06 10:12:42.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:12:42 localhost nova_compute[237281]: 2025-12-06 10:12:42.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:12:42 localhost nova_compute[237281]: 2025-12-06 10:12:42.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:12:43 localhost nova_compute[237281]: 2025-12-06 10:12:43.515 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:12:43 localhost nova_compute[237281]: 2025-12-06 10:12:43.516 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:12:43 localhost nova_compute[237281]: 2025-12-06 10:12:43.516 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:12:43 localhost nova_compute[237281]: 2025-12-06 10:12:43.516 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:12:43 localhost nova_compute[237281]: 2025-12-06 10:12:43.955 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:43 localhost nova_compute[237281]: 2025-12-06 10:12:43.957 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:43 localhost nova_compute[237281]: 2025-12-06 10:12:43.957 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:12:43 localhost nova_compute[237281]: 2025-12-06 10:12:43.958 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:43 localhost nova_compute[237281]: 2025-12-06 10:12:43.959 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:43 localhost nova_compute[237281]: 2025-12-06 10:12:43.963 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:46 localhost nova_compute[237281]: 2025-12-06 10:12:46.181 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: 
[{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:12:46 localhost openstack_network_exporter[199751]: ERROR 10:12:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:12:46 localhost openstack_network_exporter[199751]: ERROR 10:12:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:12:46 localhost openstack_network_exporter[199751]: ERROR 10:12:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:12:46 localhost openstack_network_exporter[199751]: ERROR 10:12:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:12:46 localhost openstack_network_exporter[199751]: Dec 6 05:12:46 localhost 
openstack_network_exporter[199751]: ERROR 10:12:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:12:46 localhost openstack_network_exporter[199751]: Dec 6 05:12:46 localhost nova_compute[237281]: 2025-12-06 10:12:46.204 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:12:46 localhost nova_compute[237281]: 2025-12-06 10:12:46.205 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:12:46 localhost nova_compute[237281]: 2025-12-06 10:12:46.206 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:12:46 localhost nova_compute[237281]: 2025-12-06 10:12:46.207 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:12:47 localhost nova_compute[237281]: 2025-12-06 10:12:47.202 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:12:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:12:47 localhost podman[244266]: 2025-12-06 10:12:47.550709246 +0000 UTC m=+0.079256041 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Dec 6 05:12:47 localhost podman[244266]: 2025-12-06 10:12:47.588183739 +0000 UTC m=+0.116730554 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:12:47 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:12:47 localhost nova_compute[237281]: 2025-12-06 10:12:47.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:12:47 localhost nova_compute[237281]: 2025-12-06 10:12:47.912 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:12:47 localhost nova_compute[237281]: 2025-12-06 10:12:47.913 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:12:47 localhost nova_compute[237281]: 2025-12-06 10:12:47.913 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:12:47 localhost nova_compute[237281]: 2025-12-06 10:12:47.914 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:12:47 localhost nova_compute[237281]: 2025-12-06 10:12:47.984 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - 
-] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.058 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.060 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.131 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.132 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 
-m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.203 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.204 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.261 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.471 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.472 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12669MB free_disk=387.30889892578125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.473 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.474 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.552 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.552 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.553 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.601 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.619 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:12:48 localhost 
nova_compute[237281]: 2025-12-06 10:12:48.622 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.622 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:12:48 localhost nova_compute[237281]: 2025-12-06 10:12:48.960 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:49 localhost nova_compute[237281]: 2025-12-06 10:12:49.623 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:12:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:12:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:12:50 localhost podman[244303]: 2025-12-06 10:12:50.549112693 +0000 UTC m=+0.079610622 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:12:50 localhost podman[244303]: 2025-12-06 10:12:50.560242466 +0000 UTC m=+0.090740395 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:12:50 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:12:50 localhost systemd[1]: tmp-crun.z7imjb.mount: Deactivated successfully. Dec 6 05:12:50 localhost podman[244304]: 2025-12-06 10:12:50.607557172 +0000 UTC m=+0.133679776 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:12:50 localhost podman[244304]: 2025-12-06 10:12:50.619237491 +0000 UTC m=+0.145360085 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:12:50 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:12:53 localhost podman[197801]: time="2025-12-06T10:12:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:12:53 localhost podman[197801]: @ - - [06/Dec/2025:10:12:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:12:53 localhost podman[197801]: @ - - [06/Dec/2025:10:12:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15466 "" "Go-http-client/1.1" Dec 6 05:12:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37601 DF PROTO=TCP SPT=34804 DPT=9102 SEQ=3853356129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD4B07A0000000001030307) Dec 6 05:12:54 localhost nova_compute[237281]: 2025-12-06 10:12:54.003 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:12:54 localhost nova_compute[237281]: 2025-12-06 10:12:54.004 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 
05:12:54 localhost nova_compute[237281]: 2025-12-06 10:12:54.005 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:12:54 localhost nova_compute[237281]: 2025-12-06 10:12:54.005 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:54 localhost nova_compute[237281]: 2025-12-06 10:12:54.006 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:12:54 localhost nova_compute[237281]: 2025-12-06 10:12:54.009 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:12:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37602 DF PROTO=TCP SPT=34804 DPT=9102 SEQ=3853356129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD4B4870000000001030307) Dec 6 05:12:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=590 DF PROTO=TCP SPT=54974 DPT=9102 SEQ=2130583086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD4B7880000000001030307) Dec 6 05:12:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37603 DF PROTO=TCP SPT=34804 DPT=9102 SEQ=3853356129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD4BC880000000001030307) Dec 6 05:12:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:12:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:12:57 localhost podman[244345]: 2025-12-06 10:12:57.554441437 +0000 UTC m=+0.084771941 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 05:12:57 localhost podman[244344]: 2025-12-06 10:12:57.614383471 +0000 UTC m=+0.146558441 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:12:57 localhost podman[244345]: 2025-12-06 10:12:57.641683532 +0000 UTC m=+0.172013976 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:12:57 localhost podman[244344]: 2025-12-06 10:12:57.649359857 +0000 UTC m=+0.181534777 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 05:12:57 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:12:57 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:12:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34463 DF PROTO=TCP SPT=40126 DPT=9102 SEQ=3508394433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD4BF870000000001030307) Dec 6 05:12:59 localhost nova_compute[237281]: 2025-12-06 10:12:59.008 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37604 DF PROTO=TCP SPT=34804 DPT=9102 SEQ=3853356129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD4CC480000000001030307) Dec 6 05:13:04 localhost nova_compute[237281]: 2025-12-06 10:13:04.012 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:04 localhost nova_compute[237281]: 2025-12-06 10:13:04.014 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:04 localhost nova_compute[237281]: 2025-12-06 10:13:04.015 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:13:04 localhost nova_compute[237281]: 2025-12-06 10:13:04.015 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:04 localhost nova_compute[237281]: 2025-12-06 10:13:04.036 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:04 
localhost nova_compute[237281]: 2025-12-06 10:13:04.037 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:13:05 localhost systemd[1]: tmp-crun.reGgFc.mount: Deactivated successfully. Dec 6 05:13:05 localhost podman[244381]: 2025-12-06 10:13:05.540224717 +0000 UTC m=+0.069142229 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal) Dec 6 05:13:05 localhost podman[244381]: 2025-12-06 10:13:05.557443347 +0000 UTC m=+0.086360819 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the 
minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9) Dec 6 05:13:05 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:13:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:13:06.696 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:13:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:13:06.697 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:13:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:13:06.698 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:13:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:13:07 localhost systemd[1]: tmp-crun.hGTQsY.mount: Deactivated successfully. 
Dec 6 05:13:07 localhost podman[244403]: 2025-12-06 10:13:07.552531133 +0000 UTC m=+0.086195553 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:13:07 localhost podman[244403]: 2025-12-06 10:13:07.56057427 +0000 UTC m=+0.094238760 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:13:07 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:13:08 localhost sshd[244426]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:13:09 localhost nova_compute[237281]: 2025-12-06 10:13:09.037 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:09 localhost nova_compute[237281]: 2025-12-06 10:13:09.039 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37605 DF PROTO=TCP SPT=34804 DPT=9102 SEQ=3853356129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD4ED870000000001030307) Dec 6 05:13:14 localhost nova_compute[237281]: 2025-12-06 10:13:14.041 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:14 localhost nova_compute[237281]: 2025-12-06 10:13:14.043 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:14 localhost nova_compute[237281]: 2025-12-06 10:13:14.043 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:13:14 localhost nova_compute[237281]: 2025-12-06 10:13:14.044 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:14 localhost nova_compute[237281]: 2025-12-06 10:13:14.068 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:14 localhost nova_compute[237281]: 2025-12-06 
10:13:14.069 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:16 localhost openstack_network_exporter[199751]: ERROR 10:13:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:13:16 localhost openstack_network_exporter[199751]: ERROR 10:13:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:13:16 localhost openstack_network_exporter[199751]: ERROR 10:13:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:13:16 localhost openstack_network_exporter[199751]: ERROR 10:13:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:13:16 localhost openstack_network_exporter[199751]: Dec 6 05:13:16 localhost openstack_network_exporter[199751]: ERROR 10:13:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:13:16 localhost openstack_network_exporter[199751]: Dec 6 05:13:17 localhost sshd[244428]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:13:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:13:18 localhost podman[244430]: 2025-12-06 10:13:18.53349241 +0000 UTC m=+0.072394079 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:13:18 localhost podman[244430]: 2025-12-06 10:13:18.628497344 +0000 UTC m=+0.167399043 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2) Dec 6 05:13:18 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:13:19 localhost nova_compute[237281]: 2025-12-06 10:13:19.069 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:19 localhost nova_compute[237281]: 2025-12-06 10:13:19.071 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:13:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:13:21 localhost podman[244456]: 2025-12-06 10:13:21.562371394 +0000 UTC m=+0.085415490 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 05:13:21 localhost podman[244456]: 2025-12-06 10:13:21.577215681 +0000 UTC m=+0.100259817 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:13:21 localhost podman[244455]: 2025-12-06 10:13:21.616359456 +0000 UTC m=+0.143251010 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:13:21 localhost podman[244455]: 2025-12-06 10:13:21.623835525 +0000 UTC m=+0.150727069 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:13:21 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:13:21 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:13:23 localhost podman[197801]: time="2025-12-06T10:13:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:13:23 localhost podman[197801]: @ - - [06/Dec/2025:10:13:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:13:23 localhost podman[197801]: @ - - [06/Dec/2025:10:13:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15469 "" "Go-http-client/1.1" Dec 6 05:13:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16636 DF PROTO=TCP SPT=60888 DPT=9102 SEQ=2387719586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD525A90000000001030307) Dec 6 05:13:24 localhost nova_compute[237281]: 2025-12-06 10:13:24.092 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:24 localhost nova_compute[237281]: 2025-12-06 10:13:24.094 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:24 localhost nova_compute[237281]: 2025-12-06 10:13:24.095 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5023 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:13:24 localhost nova_compute[237281]: 2025-12-06 10:13:24.095 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:24 localhost nova_compute[237281]: 2025-12-06 10:13:24.095 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:24 localhost nova_compute[237281]: 2025-12-06 10:13:24.098 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16637 DF PROTO=TCP SPT=60888 DPT=9102 SEQ=2387719586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD529C80000000001030307) Dec 6 05:13:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37606 DF PROTO=TCP SPT=34804 DPT=9102 SEQ=3853356129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD52D870000000001030307) Dec 6 05:13:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16638 DF PROTO=TCP SPT=60888 DPT=9102 SEQ=2387719586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD531C70000000001030307) Dec 6 05:13:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=591 DF PROTO=TCP SPT=54974 DPT=9102 SEQ=2130583086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD535880000000001030307) Dec 6 05:13:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:13:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:13:28 localhost podman[244494]: 2025-12-06 10:13:28.554109313 +0000 UTC m=+0.087223582 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 6 05:13:28 localhost podman[244494]: 2025-12-06 10:13:28.587168033 +0000 UTC m=+0.120282332 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS) Dec 6 05:13:28 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:13:28 localhost podman[244495]: 2025-12-06 10:13:28.589715691 +0000 UTC m=+0.118046762 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true) Dec 6 05:13:28 localhost podman[244495]: 2025-12-06 10:13:28.672508046 +0000 UTC m=+0.200839147 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Dec 6 05:13:28 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:13:29 localhost nova_compute[237281]: 2025-12-06 10:13:29.099 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16639 DF PROTO=TCP SPT=60888 DPT=9102 SEQ=2387719586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD541870000000001030307) Dec 6 05:13:34 localhost nova_compute[237281]: 2025-12-06 10:13:34.102 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:34 localhost nova_compute[237281]: 2025-12-06 10:13:34.104 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:34 localhost nova_compute[237281]: 2025-12-06 10:13:34.105 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 
05:13:34 localhost nova_compute[237281]: 2025-12-06 10:13:34.105 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:34 localhost nova_compute[237281]: 2025-12-06 10:13:34.128 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:34 localhost nova_compute[237281]: 2025-12-06 10:13:34.129 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:13:36 localhost podman[244534]: 2025-12-06 10:13:36.55257448 +0000 UTC m=+0.083781185 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7) Dec 6 05:13:36 localhost podman[244534]: 2025-12-06 10:13:36.564629643 +0000 UTC m=+0.095836378 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.expose-services=, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, 
vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 6 05:13:36 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:13:37 localhost nova_compute[237281]: 2025-12-06 10:13:37.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:13:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:13:38 localhost podman[244554]: 2025-12-06 10:13:38.546959258 +0000 UTC m=+0.083487986 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:13:38 localhost podman[244554]: 2025-12-06 10:13:38.557438091 +0000 UTC m=+0.093966819 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:13:38 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:13:39 localhost nova_compute[237281]: 2025-12-06 10:13:39.130 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:39 localhost nova_compute[237281]: 2025-12-06 10:13:39.132 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:39 localhost nova_compute[237281]: 2025-12-06 10:13:39.133 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:13:39 localhost nova_compute[237281]: 2025-12-06 10:13:39.133 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:39 localhost nova_compute[237281]: 2025-12-06 10:13:39.163 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:39 localhost nova_compute[237281]: 2025-12-06 10:13:39.164 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16640 DF PROTO=TCP SPT=60888 DPT=9102 SEQ=2387719586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD561870000000001030307) Dec 6 05:13:39 localhost nova_compute[237281]: 2025-12-06 10:13:39.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:13:42 
localhost nova_compute[237281]: 2025-12-06 10:13:42.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:13:43 localhost nova_compute[237281]: 2025-12-06 10:13:43.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:13:43 localhost nova_compute[237281]: 2025-12-06 10:13:43.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:13:44 localhost nova_compute[237281]: 2025-12-06 10:13:44.165 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:44 localhost nova_compute[237281]: 2025-12-06 10:13:44.168 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:44 localhost nova_compute[237281]: 2025-12-06 10:13:44.168 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:13:44 localhost nova_compute[237281]: 2025-12-06 10:13:44.169 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:44 localhost nova_compute[237281]: 2025-12-06 10:13:44.202 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:44 localhost nova_compute[237281]: 2025-12-06 10:13:44.202 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:44 localhost nova_compute[237281]: 2025-12-06 10:13:44.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:13:44 localhost nova_compute[237281]: 2025-12-06 10:13:44.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:13:44 localhost nova_compute[237281]: 2025-12-06 10:13:44.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:13:46 localhost nova_compute[237281]: 2025-12-06 10:13:46.035 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:13:46 localhost nova_compute[237281]: 2025-12-06 10:13:46.036 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:13:46 localhost nova_compute[237281]: 2025-12-06 10:13:46.036 237285 DEBUG nova.network.neutron [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:13:46 localhost nova_compute[237281]: 2025-12-06 10:13:46.037 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:13:46 localhost openstack_network_exporter[199751]: ERROR 10:13:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:13:46 localhost openstack_network_exporter[199751]: ERROR 10:13:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:13:46 localhost openstack_network_exporter[199751]: ERROR 10:13:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:13:46 localhost openstack_network_exporter[199751]: ERROR 10:13:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:13:46 localhost openstack_network_exporter[199751]: Dec 6 05:13:46 localhost openstack_network_exporter[199751]: ERROR 10:13:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:13:46 localhost openstack_network_exporter[199751]: Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.203 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.204 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:49 localhost 
nova_compute[237281]: 2025-12-06 10:13:49.205 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.205 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.206 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.208 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:13:49 localhost podman[244577]: 2025-12-06 10:13:49.545126477 +0000 UTC m=+0.081613868 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.570 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": 
"227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:13:49 localhost podman[244577]: 2025-12-06 10:13:49.589267959 +0000 UTC m=+0.125755370 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0) Dec 6 05:13:49 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.604 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.604 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.605 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.606 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.606 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.630 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.630 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.631 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.631 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.705 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.782 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.783 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.858 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.859 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.918 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.920 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:13:49 localhost nova_compute[237281]: 2025-12-06 10:13:49.975 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:13:50 localhost nova_compute[237281]: 2025-12-06 10:13:50.190 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:13:50 localhost nova_compute[237281]: 2025-12-06 10:13:50.192 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12655MB free_disk=387.3087043762207GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:13:50 localhost nova_compute[237281]: 2025-12-06 10:13:50.192 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:13:50 localhost nova_compute[237281]: 2025-12-06 10:13:50.193 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:13:50 localhost nova_compute[237281]: 2025-12-06 10:13:50.277 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:13:50 localhost nova_compute[237281]: 2025-12-06 10:13:50.277 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:13:50 localhost nova_compute[237281]: 2025-12-06 10:13:50.278 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:13:50 localhost nova_compute[237281]: 2025-12-06 10:13:50.345 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:13:50 localhost nova_compute[237281]: 2025-12-06 10:13:50.365 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:13:50 localhost 
nova_compute[237281]: 2025-12-06 10:13:50.368 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:13:50 localhost nova_compute[237281]: 2025-12-06 10:13:50.368 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:13:51 localhost nova_compute[237281]: 2025-12-06 10:13:51.364 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:13:51 localhost nova_compute[237281]: 2025-12-06 10:13:51.365 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:13:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:13:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:13:52 localhost systemd[1]: tmp-crun.hGut2y.mount: Deactivated successfully. 
Dec 6 05:13:52 localhost podman[244615]: 2025-12-06 10:13:52.557221383 +0000 UTC m=+0.083661882 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:13:52 localhost podman[244615]: 2025-12-06 10:13:52.569253234 +0000 UTC m=+0.095693753 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:13:52 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:13:52 localhost podman[244616]: 2025-12-06 10:13:52.658010942 +0000 UTC m=+0.181977375 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Dec 6 05:13:52 localhost podman[244616]: 2025-12-06 10:13:52.671604821 +0000 UTC m=+0.195571284 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2) Dec 6 05:13:52 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:13:53 localhost podman[197801]: time="2025-12-06T10:13:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:13:53 localhost podman[197801]: @ - - [06/Dec/2025:10:13:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142324 "" "Go-http-client/1.1" Dec 6 05:13:53 localhost podman[197801]: @ - - [06/Dec/2025:10:13:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15473 "" "Go-http-client/1.1" Dec 6 05:13:53 localhost nova_compute[237281]: 2025-12-06 10:13:53.449 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:13:53.452 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 
'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:13:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:13:53.453 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:13:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55227 DF PROTO=TCP SPT=55466 DPT=9102 SEQ=2585461267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD59ADA0000000001030307) Dec 6 05:13:54 localhost nova_compute[237281]: 2025-12-06 10:13:54.243 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55228 DF PROTO=TCP SPT=55466 DPT=9102 SEQ=2585461267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD59EC70000000001030307) Dec 6 05:13:55 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:55.179 219384 INFO oslo.privsep.daemon [None req-8b958208-7638-4807-ae8e-4a99c5d1f094 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmps3tjflrp/privsep.sock']#033[00m Dec 6 05:13:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 
PREC=0x00 TTL=62 ID=16641 DF PROTO=TCP SPT=60888 DPT=9102 SEQ=2387719586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD5A1870000000001030307) Dec 6 05:13:55 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:55.793 219384 INFO oslo.privsep.daemon [None req-8b958208-7638-4807-ae8e-4a99c5d1f094 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 6 05:13:55 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:55.685 244660 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 6 05:13:55 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:55.691 244660 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 6 05:13:55 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:55.694 244660 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Dec 6 05:13:55 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:55.695 244660 INFO oslo.privsep.daemon [-] privsep daemon running as pid 244660#033[00m Dec 6 05:13:56 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:56.336 219384 INFO oslo.privsep.daemon [None req-8b958208-7638-4807-ae8e-4a99c5d1f094 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpzhoa2njc/privsep.sock']#033[00m Dec 6 05:13:56 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:56.936 219384 INFO oslo.privsep.daemon [None req-8b958208-7638-4807-ae8e-4a99c5d1f094 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 6 05:13:56 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:56.827 244669 INFO oslo.privsep.daemon [-] privsep 
daemon starting#033[00m Dec 6 05:13:56 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:56.832 244669 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 6 05:13:56 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:56.835 244669 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Dec 6 05:13:56 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:56.835 244669 INFO oslo.privsep.daemon [-] privsep daemon running as pid 244669#033[00m Dec 6 05:13:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55229 DF PROTO=TCP SPT=55466 DPT=9102 SEQ=2585461267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD5A6C70000000001030307) Dec 6 05:13:57 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:57.832 219384 INFO oslo.privsep.daemon [None req-8b958208-7638-4807-ae8e-4a99c5d1f094 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp33pblz9j/privsep.sock']#033[00m Dec 6 05:13:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37607 DF PROTO=TCP SPT=34804 DPT=9102 SEQ=3853356129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD5AB880000000001030307) Dec 6 05:13:58 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:58.431 219384 INFO oslo.privsep.daemon [None req-8b958208-7638-4807-ae8e-4a99c5d1f094 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 6 05:13:58 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:58.327 244681 
INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 6 05:13:58 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:58.331 244681 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 6 05:13:58 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:58.335 244681 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Dec 6 05:13:58 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:58.335 244681 INFO oslo.privsep.daemon [-] privsep daemon running as pid 244681#033[00m Dec 6 05:13:59 localhost nova_compute[237281]: 2025-12-06 10:13:59.245 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:13:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:13:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:13:59 localhost podman[244692]: 2025-12-06 10:13:59.558219407 +0000 UTC m=+0.085205771 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 05:13:59 localhost podman[244692]: 2025-12-06 10:13:59.567229094 +0000 UTC m=+0.094215458 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 05:13:59 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:13:59 localhost podman[244691]: 2025-12-06 10:13:59.65361717 +0000 UTC m=+0.185339149 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:13:59 localhost podman[244691]: 2025-12-06 10:13:59.659281914 +0000 UTC m=+0.191003923 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 05:13:59 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:13:59 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:13:59.796 219384 INFO neutron.agent.linux.ip_lib [None req-8b958208-7638-4807-ae8e-4a99c5d1f094 - - - - - -] Device tap665356bf-af cannot be used as it has no MAC address#033[00m Dec 6 05:13:59 localhost nova_compute[237281]: 2025-12-06 10:13:59.906 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:59 localhost kernel: device tap665356bf-af entered promiscuous mode Dec 6 05:13:59 localhost NetworkManager[5965]: [1765016039.9155] manager: (tap665356bf-af): new Generic device (/org/freedesktop/NetworkManager/Devices/17) Dec 6 05:13:59 localhost nova_compute[237281]: 2025-12-06 10:13:59.917 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:59 localhost ovn_controller[131684]: 2025-12-06T10:13:59Z|00072|binding|INFO|Claiming lport 665356bf-af5f-43b3-abc5-5e56049a6607 for this chassis. 
Dec 6 05:13:59 localhost ovn_controller[131684]: 2025-12-06T10:13:59Z|00073|binding|INFO|665356bf-af5f-43b3-abc5-5e56049a6607: Claiming unknown Dec 6 05:13:59 localhost systemd-udevd[244734]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:13:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:13:59.932 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.199.3/24', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-980ca774-8741-41be-ba41-098f07f53254', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-980ca774-8741-41be-ba41-098f07f53254', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54bdb71984e64cf09cb4ca41a372818b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2069790-6534-4708-b367-979eef9a3920, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=665356bf-af5f-43b3-abc5-5e56049a6607) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:13:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:13:59.934 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 665356bf-af5f-43b3-abc5-5e56049a6607 in datapath 980ca774-8741-41be-ba41-098f07f53254 bound to our chassis#033[00m Dec 6 05:13:59 localhost ovn_metadata_agent[137254]: 2025-12-06 
10:13:59.936 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4066d799-2a5c-4f59-a3d3-56e51a829a6e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:13:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:13:59.936 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 980ca774-8741-41be-ba41-098f07f53254, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:13:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:13:59.937 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[38203e09-a660-437f-b544-8ec85f237cd9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:13:59 localhost journal[186952]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, ) Dec 6 05:13:59 localhost journal[186952]: hostname: np0005548798.ooo.test Dec 6 05:13:59 localhost journal[186952]: ethtool ioctl error on tap665356bf-af: No such device Dec 6 05:13:59 localhost ovn_controller[131684]: 2025-12-06T10:13:59Z|00074|binding|INFO|Setting lport 665356bf-af5f-43b3-abc5-5e56049a6607 ovn-installed in OVS Dec 6 05:13:59 localhost ovn_controller[131684]: 2025-12-06T10:13:59Z|00075|binding|INFO|Setting lport 665356bf-af5f-43b3-abc5-5e56049a6607 up in Southbound Dec 6 05:13:59 localhost journal[186952]: ethtool ioctl error on tap665356bf-af: No such device Dec 6 05:13:59 localhost nova_compute[237281]: 2025-12-06 10:13:59.951 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:13:59 localhost journal[186952]: ethtool ioctl error on tap665356bf-af: No such device Dec 6 05:13:59 localhost journal[186952]: ethtool ioctl error on tap665356bf-af: No such device Dec 6 
05:13:59 localhost journal[186952]: ethtool ioctl error on tap665356bf-af: No such device Dec 6 05:13:59 localhost journal[186952]: ethtool ioctl error on tap665356bf-af: No such device Dec 6 05:13:59 localhost journal[186952]: ethtool ioctl error on tap665356bf-af: No such device Dec 6 05:13:59 localhost journal[186952]: ethtool ioctl error on tap665356bf-af: No such device Dec 6 05:13:59 localhost nova_compute[237281]: 2025-12-06 10:13:59.982 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:00 localhost nova_compute[237281]: 2025-12-06 10:14:00.010 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:00 localhost ovn_metadata_agent[137254]: 2025-12-06 10:14:00.456 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:14:00 localhost podman[244806]: Dec 6 05:14:00 localhost podman[244806]: 2025-12-06 10:14:00.866359193 +0000 UTC m=+0.103547275 container create 59579191936cc0810cabf42afb61498518f0bcbf0b7c30f259cdf3d32491243f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980ca774-8741-41be-ba41-098f07f53254, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 05:14:00 localhost podman[244806]: 2025-12-06 
10:14:00.811334525 +0000 UTC m=+0.048522647 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:14:00 localhost systemd[1]: Started libpod-conmon-59579191936cc0810cabf42afb61498518f0bcbf0b7c30f259cdf3d32491243f.scope. Dec 6 05:14:00 localhost systemd[1]: Started libcrun container. Dec 6 05:14:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2892d5a4f278bd4a51a2ec51d7ce5f78e17480b179642f7184f3d1fd7cf7f0e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:14:00 localhost podman[244806]: 2025-12-06 10:14:00.952572092 +0000 UTC m=+0.189760174 container init 59579191936cc0810cabf42afb61498518f0bcbf0b7c30f259cdf3d32491243f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980ca774-8741-41be-ba41-098f07f53254, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Dec 6 05:14:00 localhost podman[244806]: 2025-12-06 10:14:00.963014665 +0000 UTC m=+0.200202757 container start 59579191936cc0810cabf42afb61498518f0bcbf0b7c30f259cdf3d32491243f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-980ca774-8741-41be-ba41-098f07f53254, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:14:00 localhost dnsmasq[244824]: started, version 2.85 cachesize 150 Dec 6 05:14:00 localhost 
dnsmasq[244824]: DNS service limited to local subnets Dec 6 05:14:00 localhost dnsmasq[244824]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:14:00 localhost dnsmasq[244824]: warning: no upstream servers configured Dec 6 05:14:00 localhost dnsmasq-dhcp[244824]: DHCP, static leases only on 192.168.199.0, lease time 1d Dec 6 05:14:00 localhost dnsmasq[244824]: read /var/lib/neutron/dhcp/980ca774-8741-41be-ba41-098f07f53254/addn_hosts - 0 addresses Dec 6 05:14:00 localhost dnsmasq-dhcp[244824]: read /var/lib/neutron/dhcp/980ca774-8741-41be-ba41-098f07f53254/host Dec 6 05:14:00 localhost dnsmasq-dhcp[244824]: read /var/lib/neutron/dhcp/980ca774-8741-41be-ba41-098f07f53254/opts Dec 6 05:14:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55230 DF PROTO=TCP SPT=55466 DPT=9102 SEQ=2585461267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD5B6870000000001030307) Dec 6 05:14:01 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:14:01.134 219384 INFO neutron.agent.dhcp.agent [None req-b5f371b6-11f9-4159-bbfb-4735bf23d51c - - - - - -] DHCP configuration for ports {'9d1a96f4-2967-474c-bca7-d973d0956e0e'} is completed#033[00m Dec 6 05:14:04 localhost nova_compute[237281]: 2025-12-06 10:14:04.278 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:14:06.698 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:14:06 localhost ovn_metadata_agent[137254]: 2025-12-06 
10:14:06.698 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:14:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:14:06.699 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:14:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:14:07 localhost podman[244825]: 2025-12-06 10:14:07.551791462 +0000 UTC m=+0.084905190 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public) Dec 6 05:14:07 localhost podman[244825]: 2025-12-06 10:14:07.56438211 +0000 UTC m=+0.097495848 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm) Dec 6 05:14:07 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:14:09 localhost nova_compute[237281]: 2025-12-06 10:14:09.279 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:14:09 localhost nova_compute[237281]: 2025-12-06 10:14:09.281 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:09 localhost nova_compute[237281]: 2025-12-06 10:14:09.281 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:14:09 localhost nova_compute[237281]: 2025-12-06 10:14:09.282 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:14:09 localhost nova_compute[237281]: 2025-12-06 10:14:09.283 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:14:09 localhost nova_compute[237281]: 2025-12-06 10:14:09.287 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:14:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55231 DF PROTO=TCP SPT=55466 DPT=9102 SEQ=2585461267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD5D7870000000001030307) Dec 6 05:14:09 localhost podman[244845]: 2025-12-06 10:14:09.547491651 +0000 UTC m=+0.079261347 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', 
'/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:14:09 localhost podman[244845]: 2025-12-06 10:14:09.554141555 +0000 UTC m=+0.085911291 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:14:09 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated 
successfully. Dec 6 05:14:14 localhost nova_compute[237281]: 2025-12-06 10:14:14.284 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:16 localhost openstack_network_exporter[199751]: ERROR 10:14:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:14:16 localhost openstack_network_exporter[199751]: ERROR 10:14:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:14:16 localhost openstack_network_exporter[199751]: ERROR 10:14:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:14:16 localhost openstack_network_exporter[199751]: ERROR 10:14:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:14:16 localhost openstack_network_exporter[199751]: Dec 6 05:14:16 localhost openstack_network_exporter[199751]: ERROR 10:14:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:14:16 localhost openstack_network_exporter[199751]: Dec 6 05:14:19 localhost nova_compute[237281]: 2025-12-06 10:14:19.288 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:14:19 localhost nova_compute[237281]: 2025-12-06 10:14:19.290 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:14:19 localhost nova_compute[237281]: 2025-12-06 10:14:19.290 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:14:19 localhost nova_compute[237281]: 2025-12-06 10:14:19.290 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE 
_transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:14:19 localhost nova_compute[237281]: 2025-12-06 10:14:19.315 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:19 localhost nova_compute[237281]: 2025-12-06 10:14:19.316 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:14:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:14:20 localhost podman[244869]: 2025-12-06 10:14:20.55178169 +0000 UTC m=+0.080097102 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2) Dec 6 05:14:20 localhost podman[244869]: 2025-12-06 10:14:20.656599664 +0000 UTC m=+0.184915076 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:14:20 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:14:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:22.992 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:14:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:22.993 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 05:14:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:22.997 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8f9b8200-4f20-4f12-9d01-0fea43a6c0a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:14:22.993373', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '561564e8-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.203478835, 'message_signature': '6bd4f1a82c97fb994532a494513767c055bbd2e8826dc2f35f7cd7bf431e6deb'}]}, 'timestamp': '2025-12-06 10:14:22.998534', '_unique_id': '287af5b649344e7490de3657c427b3c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.000 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:14:23.001 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.001 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a09741b2-f825-467c-a5f9-5bc729d725d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:14:23.001587', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '5615f084-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.203478835, 'message_signature': 'b8183248f3aa550193344e3da6b0a1b46c945ca1a92e24c1ec9478897271ed47'}]}, 'timestamp': '2025-12-06 10:14:23.002087', '_unique_id': 'e71ae55e75014a048f8b9633af322464'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.003 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.004 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.046 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.047 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd4607944-a171-4b70-ae81-66efee0a275e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:14:23.004232', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '561cdf66-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.21433912, 'message_signature': '562b50b1ed95bafc66f29a852bec7dabedcb2fa703daf8ed6425abe1038b4487'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:14:23.004232', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '561cfae6-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.21433912, 'message_signature': '0388b6ae8d0c7db66de47c6b6f785712b6be527b22fad7626d81809dd1f9b0e2'}]}, 'timestamp': '2025-12-06 10:14:23.048220', '_unique_id': 'd8640d689edc49c8a129bc6f10060326'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.049 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.051 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.052 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '4460a7cc-75f7-4a3a-b6b2-b8ce5a354679', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:14:23.051473', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '561d915e-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.21433912, 'message_signature': '6a793d1f1b0c100c979092d05e7d6d0b1041bacfbe76a1ec6f712e67370b0c67'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:14:23.051473', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '561da892-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.21433912, 'message_signature': 'ee26f53b6a9f01169a5eed2e5feef949fda8a4d5c2f6864f0ca9c7ec14ca38fc'}]}, 'timestamp': '2025-12-06 10:14:23.052627', '_unique_id': '618bc27b6dad4c6a9bf817e23000b0e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:14:23.053 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.053 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.055 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.072 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.073 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7aedd578-03d2-48dd-a95d-e778e065ca24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:14:23.055581', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5620c8ce-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.265739616, 'message_signature': '1a7fd89bd21a276d5873d9a21d5e7254fef47f8205b017e52f10d73d70fd9e7b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:14:23.055581', 'resource_metadata': {'display_name': 'test', 'name': 
'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56210316-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.265739616, 'message_signature': 'a3e1de66380ec6ca6207e72bc3395777b4648b85358029450e48923f1b3d9b17'}]}, 'timestamp': '2025-12-06 10:14:23.074643', '_unique_id': '9f1b51673c494121bb4a14b621fe915e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:14:23.076 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.077 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.077 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53385055-ee11-4f7a-a21b-0500e22f81df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:14:23.077081', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5621738c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.265739616, 'message_signature': '5997753b503a30d0ced5550d9a847eb06e73520807d7941ae7de3f8d845ce9f8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:14:23.077081', 'resource_metadata': {'display_name': 'test', 'name': 
'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56217ecc-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.265739616, 'message_signature': '3b1d14363cc9e2043d9707f998d40c8ad2f6df3d9795a3869a66016c11bd3814'}]}, 'timestamp': '2025-12-06 10:14:23.077682', '_unique_id': '3bdff98f1f1e44479a7592942780d960'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:14:23.078 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.078 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.079 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.095 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 16720000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1170c92c-bc20-4bb4-b092-c326c08cc4c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16720000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:14:23.079154', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '56243d10-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.30506852, 'message_signature': 'c116e49cae43ad63b9a870bfdd3ac49a65ef50d5b6415af8d5a9b4e5ec90fc99'}]}, 'timestamp': '2025-12-06 10:14:23.095832', '_unique_id': '3258eb8567554ea2820aef43b7757fd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.097 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify 
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.098 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.099 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad5167d5-c9b8-485f-8e6d-8509004ca2ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:14:23.098919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 
'5624d3d8-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.30506852, 'message_signature': '2bfbc57a670736f6b9ab319e2291a428b3dba5f43ac662ce6bafd690f89d7559'}]}, 'timestamp': '2025-12-06 10:14:23.099631', '_unique_id': '090280bf06534c0bb59d60382f932bb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused 
Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.100 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.102 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b57ea889-8191-42ad-bd1c-2e8a54331b3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:14:23.102315', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '56255128-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.203478835, 'message_signature': '1f3d4bb38d8cf18fb2f1c8f8b0eba96cc330f25a22fdb432534ac1cc011c202e'}]}, 'timestamp': '2025-12-06 10:14:23.102886', '_unique_id': '9fb44f7331df4a3090f2bff6372ed4c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.103 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:14:23.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.105 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2dc9510-3914-481a-8f9a-e60798f0291b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:14:23.105344', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '5625c6c6-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.203478835, 'message_signature': '8341f93d84f12c4471c7b7cda990699bbc1601ea424b32263438ac0d2597d365'}]}, 'timestamp': '2025-12-06 10:14:23.105903', '_unique_id': '59e22d1351d74f46aa3fb191b958f85c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.106 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.108 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ec3624bf-a4c9-41ac-b3ab-5b8837433155', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:14:23.108616', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5626477c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.21433912, 'message_signature': '0b15a7f38b340233ceeec3c5ba12befe47045915564ab9891015d6d3bd944698'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:14:23.108616', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56265c3a-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.21433912, 'message_signature': 'da2e06066a7d33da4e03b3d8fb91377fe2a0650ec81d52f9d51316c2b99dd036'}]}, 'timestamp': '2025-12-06 10:14:23.109677', '_unique_id': 'a3736c4dcfc342a1b6c6ef375c0a42fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.112 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.113 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging [-] Could 
not send notification to notifications. Payload={'message_id': '3dba8675-535f-46c5-8bf1-3a3455230b8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:14:23.112471', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5626df20-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.21433912, 'message_signature': 'e7200a2a71dcbc43d14349341175095c4b0ec071c630a77d48c020d86ce1fb2e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:14:23.112471', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5626f83e-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.21433912, 'message_signature': '7dae262247e9748c122f6136ea20eb326a360a7934a4762117c14a695b3db4ca'}]}, 'timestamp': '2025-12-06 10:14:23.113736', '_unique_id': '3c1c7a783a9f47afaa6f2b95d0ed3221'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:14:23.115 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.115 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a51424c6-63da-44de-a491-0b3e500dd554', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:14:23.117167', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '56279992-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.203478835, 'message_signature': '330440e687472d7897201e65b7534fcc6e5492d73adcf83fea44809d732e48d7'}]}, 'timestamp': '2025-12-06 10:14:23.117967', '_unique_id': '469e11914010414297ad4661d80a1060'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:14:23.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.121 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f5b99b8-0532-46f0-8dd2-2dad629a1dca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:14:23.120940', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': 
None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '56282d6c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.203478835, 'message_signature': '2e25ffa415d233f7e5fcc6957b460ab4e5539bfe61bcb57e6bb164e3e0e03889'}]}, 'timestamp': '2025-12-06 10:14:23.121600', '_unique_id': '6c7dabe917a940139c15eca19c9e0ee2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.122 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.124 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a85ddb61-b7b7-4a97-a8ae-040c20f5a6e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:14:23.124289', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '5628aa9e-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.203478835, 'message_signature': 'e6ed824fe746cd2f87264181ad758209e6aa6e8ac5a4f63c253d7740de955f86'}]}, 'timestamp': '2025-12-06 10:14:23.124801', '_unique_id': '6604b73155a74c978900e95c8a78517f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.126 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:14:23.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.127 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b5eed5a-0b0e-4e67-8168-5eca09819a66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:14:23.127629', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '56292e4c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.203478835, 'message_signature': 'd1f15822b1cb6448fb7e46b315a7467755d50fc3631940d75865aaf6f6ad18a9'}]}, 'timestamp': '2025-12-06 10:14:23.128214', '_unique_id': 'ea2c7bfeb0d34d37808a823fbc0d7993'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.129 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.130 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.130 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.130 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'eca81844-8f3e-4ad9-bbef-123b56638fa6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:14:23.130417', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '562995a8-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.21433912, 'message_signature': 'd37ac11d97c3bc91e4b5e425fc68acac4e50d5a484237a8f35bcf15fa004de87'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:14:23.130417', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5629a00c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.21433912, 'message_signature': '25fe76bb15b7c64ef7ed571c996f88b37a76e37fad0641728c6ff1029b7c4b53'}]}, 'timestamp': '2025-12-06 10:14:23.130981', '_unique_id': '8587d4b23ce7454689aa3cebce162504'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.132 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.132 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.132 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging [-] 
Could not send notification to notifications. Payload={'message_id': '7c9ba935-e539-468e-9582-3e12b1cb30fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:14:23.132392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5629e2c4-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.21433912, 'message_signature': '2197266010af180bb687e2efa12eca795efeffe4dade06d97bd4c84cb22f9d2e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:14:23.132392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5629ed1e-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.21433912, 'message_signature': '89e1b5f15a1cb683e676ef9c0c0b3a5654b38d946d9bddccf95bbdc3a88daff6'}]}, 'timestamp': '2025-12-06 10:14:23.132953', '_unique_id': '793679a49b28414394723e71c38ba8a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:14:23.133 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.133 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.134 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '98f59287-d783-4825-9b16-651789ee996a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:14:23.134612', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '562a3990-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.203478835, 'message_signature': '4244f1d8778de0a909e2b3efb38278775b337064460dc073f3e57d3a5919d7a2'}]}, 'timestamp': '2025-12-06 10:14:23.134931', '_unique_id': '7624c112a92d4b11a1d47276485594fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.135 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:14:23.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.136 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.136 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0a85f8b-02fe-4545-b2b3-007c722a0ecf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:14:23.136293', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '562a7b44-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.265739616, 'message_signature': 'b21e0f4e14df1ec482329f23e6ca7031c091ec2502e2c242b326cf9c44126377'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:14:23.136293', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '562a85bc-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.265739616, 'message_signature': 'c85cea8401524d07a3ccd620a97d02b1f24ee7a25929e485e1c2174466290a1a'}]}, 'timestamp': '2025-12-06 10:14:23.136838', '_unique_id': '3bc3d133b7734395bc3d9d830949055b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.137 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:14:23.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.138 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bdc7da3e-4504-43f7-9efd-93c448966c82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:14:23.138398', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '562acd6a-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12450.203478835, 'message_signature': '87fcabea821ef86651cd87e85fc5fedf7d125baaca7309e7790701a98e40bf35'}]}, 'timestamp': '2025-12-06 10:14:23.138690', '_unique_id': '729fb6e1178d4721abc115514e7e14ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:14:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:14:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:14:23.139 12 ERROR oslo_messaging.notify.messaging Dec 6 05:14:23 localhost podman[197801]: time="2025-12-06T10:14:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:14:23 localhost podman[197801]: @ - - [06/Dec/2025:10:14:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:14:23 localhost podman[197801]: @ - - [06/Dec/2025:10:14:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15941 "" "Go-http-client/1.1" Dec 6 05:14:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:14:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:14:23 localhost podman[244894]: 2025-12-06 10:14:23.517252406 +0000 UTC m=+0.055106520 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:14:23 localhost podman[244894]: 2025-12-06 10:14:23.55302162 +0000 UTC m=+0.090875734 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:14:23 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:14:23 localhost podman[244895]: 2025-12-06 10:14:23.60359158 +0000 UTC m=+0.135551842 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible) Dec 6 05:14:23 localhost podman[244895]: 2025-12-06 10:14:23.612160204 +0000 UTC m=+0.144120496 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 6 05:14:23 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:14:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60090 DF PROTO=TCP SPT=47542 DPT=9102 SEQ=271983354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD610090000000001030307) Dec 6 05:14:24 localhost nova_compute[237281]: 2025-12-06 10:14:24.317 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:14:24 localhost nova_compute[237281]: 2025-12-06 10:14:24.319 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:14:24 localhost nova_compute[237281]: 2025-12-06 10:14:24.319 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:14:24 localhost nova_compute[237281]: 2025-12-06 10:14:24.320 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:14:24 localhost nova_compute[237281]: 2025-12-06 10:14:24.339 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:24 localhost nova_compute[237281]: 2025-12-06 10:14:24.340 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:14:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60091 DF PROTO=TCP SPT=47542 DPT=9102 SEQ=271983354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD614070000000001030307) Dec 6 05:14:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55232 DF PROTO=TCP SPT=55466 DPT=9102 SEQ=2585461267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD617870000000001030307) Dec 6 05:14:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60092 DF PROTO=TCP SPT=47542 DPT=9102 SEQ=271983354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD61C070000000001030307) Dec 6 05:14:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16642 DF PROTO=TCP SPT=60888 DPT=9102 SEQ=2387719586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD61F870000000001030307) Dec 6 05:14:29 localhost nova_compute[237281]: 2025-12-06 10:14:29.341 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:14:29 localhost nova_compute[237281]: 2025-12-06 10:14:29.342 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:14:29 localhost nova_compute[237281]: 2025-12-06 10:14:29.343 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:14:29 localhost nova_compute[237281]: 2025-12-06 10:14:29.343 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:14:29 localhost nova_compute[237281]: 2025-12-06 10:14:29.343 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:14:29 localhost nova_compute[237281]: 2025-12-06 10:14:29.346 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:14:29 localhost ovn_controller[131684]: 2025-12-06T10:14:29Z|00076|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Dec 6 05:14:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:14:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:14:30 localhost podman[244934]: 2025-12-06 10:14:30.547628696 +0000 UTC m=+0.081116133 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:14:30 localhost podman[244935]: 2025-12-06 10:14:30.605366307 +0000 UTC m=+0.135374567 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 6 05:14:30 localhost podman[244935]: 2025-12-06 10:14:30.620671739 +0000 UTC m=+0.150679969 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 6 05:14:30 localhost podman[244934]: 2025-12-06 10:14:30.63302333 +0000 UTC m=+0.166510777 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 6 05:14:30 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:14:30 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:14:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60093 DF PROTO=TCP SPT=47542 DPT=9102 SEQ=271983354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD62BC80000000001030307) Dec 6 05:14:34 localhost nova_compute[237281]: 2025-12-06 10:14:34.343 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:36 localhost nova_compute[237281]: 2025-12-06 10:14:36.751 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:14:36.753 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], 
options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:14:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:14:36.754 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:14:37 localhost nova_compute[237281]: 2025-12-06 10:14:37.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:14:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:14:38 localhost podman[244971]: 2025-12-06 10:14:38.546697191 +0000 UTC m=+0.080494575 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 6 05:14:38 localhost podman[244971]: 2025-12-06 10:14:38.584525189 +0000 UTC m=+0.118322533 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., version=9.6, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 
'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 6 05:14:38 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:14:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60094 DF PROTO=TCP SPT=47542 DPT=9102 SEQ=271983354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD64B870000000001030307) Dec 6 05:14:39 localhost nova_compute[237281]: 2025-12-06 10:14:39.375 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:14:40 localhost podman[244992]: 2025-12-06 10:14:40.55281334 +0000 UTC m=+0.081712592 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 05:14:40 localhost podman[244992]: 2025-12-06 10:14:40.588325156 +0000 UTC m=+0.117224368 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:14:40 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:14:41 localhost nova_compute[237281]: 2025-12-06 10:14:41.884 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:14:42 localhost nova_compute[237281]: 2025-12-06 10:14:42.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:14:43 localhost nova_compute[237281]: 2025-12-06 10:14:43.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:14:43 localhost nova_compute[237281]: 2025-12-06 10:14:43.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:14:44 localhost nova_compute[237281]: 2025-12-06 10:14:44.376 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:14:45 localhost nova_compute[237281]: 2025-12-06 10:14:45.882 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:14:45 localhost nova_compute[237281]: 2025-12-06 10:14:45.884 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:14:46 localhost openstack_network_exporter[199751]: ERROR 10:14:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:14:46 localhost openstack_network_exporter[199751]: ERROR 10:14:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:14:46 localhost openstack_network_exporter[199751]: ERROR 10:14:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:14:46 localhost openstack_network_exporter[199751]: ERROR 10:14:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:14:46 localhost openstack_network_exporter[199751]: Dec 6 05:14:46 localhost openstack_network_exporter[199751]: ERROR 10:14:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:14:46 localhost openstack_network_exporter[199751]: Dec 6 05:14:46 localhost ovn_metadata_agent[137254]: 2025-12-06 10:14:46.756 
137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:14:46 localhost nova_compute[237281]: 2025-12-06 10:14:46.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:14:46 localhost nova_compute[237281]: 2025-12-06 10:14:46.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:14:46 localhost nova_compute[237281]: 2025-12-06 10:14:46.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:14:47 localhost nova_compute[237281]: 2025-12-06 10:14:47.399 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:14:47 localhost nova_compute[237281]: 2025-12-06 10:14:47.400 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:14:47 localhost nova_compute[237281]: 2025-12-06 10:14:47.400 
237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:14:47 localhost nova_compute[237281]: 2025-12-06 10:14:47.401 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:14:49 localhost nova_compute[237281]: 2025-12-06 10:14:49.380 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:14:49 localhost nova_compute[237281]: 2025-12-06 10:14:49.382 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:14:49 localhost nova_compute[237281]: 2025-12-06 10:14:49.382 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:14:49 localhost nova_compute[237281]: 2025-12-06 10:14:49.383 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:14:49 localhost nova_compute[237281]: 2025-12-06 10:14:49.389 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:49 localhost nova_compute[237281]: 2025-12-06 10:14:49.390 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:14:51 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:14:51 localhost systemd[1]: tmp-crun.SlGgKK.mount: Deactivated successfully. Dec 6 05:14:51 localhost podman[245017]: 2025-12-06 10:14:51.555090096 +0000 UTC m=+0.090005167 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 05:14:51 localhost podman[245017]: 2025-12-06 10:14:51.623342212 +0000 UTC 
m=+0.158257283 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:14:51 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:14:52 localhost nova_compute[237281]: 2025-12-06 10:14:52.823 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:53 localhost podman[197801]: time="2025-12-06T10:14:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:14:53 localhost podman[197801]: @ - - [06/Dec/2025:10:14:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:14:53 localhost podman[197801]: @ - - [06/Dec/2025:10:14:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15956 "" "Go-http-client/1.1" Dec 6 05:14:53 localhost nova_compute[237281]: 2025-12-06 10:14:53.730 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, 
"qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:14:53 localhost nova_compute[237281]: 2025-12-06 10:14:53.751 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:14:53 localhost nova_compute[237281]: 2025-12-06 10:14:53.751 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:14:53 localhost nova_compute[237281]: 2025-12-06 10:14:53.752 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:14:53 localhost nova_compute[237281]: 2025-12-06 10:14:53.753 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:14:53 localhost nova_compute[237281]: 2025-12-06 10:14:53.778 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:14:53 localhost nova_compute[237281]: 2025-12-06 
10:14:53.778 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:14:53 localhost nova_compute[237281]: 2025-12-06 10:14:53.779 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:14:53 localhost nova_compute[237281]: 2025-12-06 10:14:53.779 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:14:53 localhost nova_compute[237281]: 2025-12-06 10:14:53.875 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:14:53 localhost nova_compute[237281]: 2025-12-06 10:14:53.948 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.074s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:14:53 localhost nova_compute[237281]: 2025-12-06 10:14:53.950 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:14:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27580 DF PROTO=TCP SPT=57222 DPT=9102 SEQ=3674109737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD685390000000001030307) Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.028 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.029 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.102 237285 DEBUG oslo_concurrency.processutils 
[None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.103 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.160 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.428 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.441 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.443 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12379MB free_disk=387.300724029541GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.443 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.444 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:14:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:14:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:14:54 localhost podman[245053]: 2025-12-06 10:14:54.558507213 +0000 UTC m=+0.091808193 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:14:54 localhost podman[245053]: 2025-12-06 10:14:54.593734149 +0000 UTC m=+0.127035089 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 
'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:14:54 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:14:54 localhost podman[245054]: 2025-12-06 10:14:54.616777701 +0000 UTC m=+0.141692573 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible) Dec 6 05:14:54 localhost podman[245054]: 2025-12-06 10:14:54.654627708 +0000 UTC m=+0.179542590 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:14:54 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.714 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.715 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.716 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.813 237285 DEBUG nova.compute.provider_tree [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.838 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.841 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:14:54 localhost nova_compute[237281]: 2025-12-06 10:14:54.842 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.398s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:14:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27581 DF PROTO=TCP SPT=57222 DPT=9102 SEQ=3674109737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1DD689480000000001030307) Dec 6 05:14:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60095 DF PROTO=TCP SPT=47542 DPT=9102 SEQ=271983354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD68B870000000001030307) Dec 6 05:14:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27582 DF PROTO=TCP SPT=57222 DPT=9102 SEQ=3674109737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD691470000000001030307) Dec 6 05:14:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55233 DF PROTO=TCP SPT=55466 DPT=9102 SEQ=2585461267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD695870000000001030307) Dec 6 05:14:59 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:14:59.256 219384 INFO neutron.agent.linux.ip_lib [None req-54fb1d34-5408-4a87-8c08-f94ad5bcffe5 - - - - - -] Device tape48b7399-3b cannot be used as it has no MAC address#033[00m Dec 6 05:14:59 localhost nova_compute[237281]: 2025-12-06 10:14:59.312 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:59 localhost kernel: device tape48b7399-3b entered promiscuous mode Dec 6 05:14:59 localhost ovn_controller[131684]: 2025-12-06T10:14:59Z|00077|binding|INFO|Claiming lport e48b7399-3b56-4775-9776-caf16d42ea9c for this chassis. 
Dec 6 05:14:59 localhost ovn_controller[131684]: 2025-12-06T10:14:59Z|00078|binding|INFO|e48b7399-3b56-4775-9776-caf16d42ea9c: Claiming unknown Dec 6 05:14:59 localhost NetworkManager[5965]: [1765016099.3219] manager: (tape48b7399-3b): new Generic device (/org/freedesktop/NetworkManager/Devices/18) Dec 6 05:14:59 localhost nova_compute[237281]: 2025-12-06 10:14:59.320 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:59 localhost systemd-udevd[245106]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:14:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:14:59.342 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4ffab9068e64ee89c49785c5f76ecd3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28349eaf-1dbf-4bc7-ae61-616696dcb1a3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e48b7399-3b56-4775-9776-caf16d42ea9c) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:14:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:14:59.344 137259 INFO neutron.agent.ovn.metadata.agent [-] Port e48b7399-3b56-4775-9776-caf16d42ea9c in datapath 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9 bound to our chassis#033[00m Dec 6 05:14:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:14:59.346 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port ed911aa4-8580-4197-82cf-abe034f34c56 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:14:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:14:59.346 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:14:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:14:59.348 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[70ef1cb2-ac77-4fec-a8e6-27b42908ae9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:14:59 localhost journal[186952]: ethtool ioctl error on tape48b7399-3b: No such device Dec 6 05:14:59 localhost ovn_controller[131684]: 2025-12-06T10:14:59Z|00079|binding|INFO|Setting lport e48b7399-3b56-4775-9776-caf16d42ea9c ovn-installed in OVS Dec 6 05:14:59 localhost ovn_controller[131684]: 2025-12-06T10:14:59Z|00080|binding|INFO|Setting lport e48b7399-3b56-4775-9776-caf16d42ea9c up in Southbound Dec 6 05:14:59 localhost journal[186952]: ethtool ioctl error on tape48b7399-3b: No such device Dec 6 05:14:59 localhost nova_compute[237281]: 2025-12-06 10:14:59.359 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 
05:14:59 localhost journal[186952]: ethtool ioctl error on tape48b7399-3b: No such device Dec 6 05:14:59 localhost journal[186952]: ethtool ioctl error on tape48b7399-3b: No such device Dec 6 05:14:59 localhost journal[186952]: ethtool ioctl error on tape48b7399-3b: No such device Dec 6 05:14:59 localhost journal[186952]: ethtool ioctl error on tape48b7399-3b: No such device Dec 6 05:14:59 localhost journal[186952]: ethtool ioctl error on tape48b7399-3b: No such device Dec 6 05:14:59 localhost journal[186952]: ethtool ioctl error on tape48b7399-3b: No such device Dec 6 05:14:59 localhost nova_compute[237281]: 2025-12-06 10:14:59.399 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:59 localhost nova_compute[237281]: 2025-12-06 10:14:59.425 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:59 localhost nova_compute[237281]: 2025-12-06 10:14:59.430 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:59 localhost nova_compute[237281]: 2025-12-06 10:14:59.433 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:14:59 localhost nova_compute[237281]: 2025-12-06 10:14:59.965 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:00 localhost podman[245179]: Dec 6 05:15:00 localhost podman[245179]: 2025-12-06 10:15:00.415356421 +0000 UTC m=+0.095108565 container create 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:15:00 localhost systemd[1]: Started libpod-conmon-0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0.scope. Dec 6 05:15:00 localhost podman[245179]: 2025-12-06 10:15:00.367364609 +0000 UTC m=+0.047116773 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:15:00 localhost systemd[1]: Started libcrun container. Dec 6 05:15:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05a6bbf6a87d11a772fb4b237ab8126c3b1907e544497438af13e31f78927959/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:15:00 localhost podman[245179]: 2025-12-06 10:15:00.499142635 +0000 UTC m=+0.178894769 container init 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:15:00 localhost podman[245179]: 2025-12-06 10:15:00.510129894 +0000 UTC m=+0.189882018 container start 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, org.label-schema.license=GPLv2, 
org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:15:00 localhost dnsmasq[245197]: started, version 2.85 cachesize 150 Dec 6 05:15:00 localhost dnsmasq[245197]: DNS service limited to local subnets Dec 6 05:15:00 localhost dnsmasq[245197]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:15:00 localhost dnsmasq[245197]: warning: no upstream servers configured Dec 6 05:15:00 localhost dnsmasq-dhcp[245197]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:15:00 localhost dnsmasq[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/addn_hosts - 0 addresses Dec 6 05:15:00 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/host Dec 6 05:15:00 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/opts Dec 6 05:15:00 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:00.678 219384 INFO neutron.agent.dhcp.agent [None req-08c3ca68-a591-45df-8b01-328c8fbfe793 - - - - - -] DHCP configuration for ports {'004362b4-c291-41f5-bfb9-f0d2dbd76d4a'} is completed#033[00m Dec 6 05:15:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27583 DF PROTO=TCP SPT=57222 DPT=9102 SEQ=3674109737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD6A1070000000001030307) Dec 6 05:15:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 05:15:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:15:01 localhost podman[245198]: 2025-12-06 10:15:01.550424018 +0000 UTC m=+0.082647751 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Dec 6 05:15:01 localhost podman[245198]: 2025-12-06 10:15:01.584330224 +0000 UTC m=+0.116553967 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 05:15:01 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:15:01 localhost systemd[1]: tmp-crun.RVqySX.mount: Deactivated successfully. Dec 6 05:15:01 localhost podman[245199]: 2025-12-06 10:15:01.619265851 +0000 UTC m=+0.147859912 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute) Dec 6 05:15:01 localhost podman[245199]: 2025-12-06 10:15:01.630343693 +0000 UTC m=+0.158937724 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 6 05:15:01 localhost nova_compute[237281]: 2025-12-06 10:15:01.638 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:01 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:15:03 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:03.904 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:03Z, description=, device_id=6acc443d-d6f0-4d6d-ae69-6473d2e6b47f, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7ddd7a97-9ea8-483e-a9e6-e3d44e77f02e, ip_allocation=immediate, mac_address=fa:16:3e:c9:57:1f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:54Z, description=, dns_domain=, id=9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-1133610127-network, port_security_enabled=True, project_id=d4ffab9068e64ee89c49785c5f76ecd3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39737, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=215, status=ACTIVE, subnets=['4a44c615-5dac-4975-8269-16f9f546494f'], tags=[], tenant_id=d4ffab9068e64ee89c49785c5f76ecd3, updated_at=2025-12-06T10:14:56Z, vlan_transparent=None, network_id=9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, port_security_enabled=False, project_id=d4ffab9068e64ee89c49785c5f76ecd3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=258, status=DOWN, tags=[], tenant_id=d4ffab9068e64ee89c49785c5f76ecd3, updated_at=2025-12-06T10:15:03Z on network 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9#033[00m Dec 6 05:15:03 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=167.94.138.204 DST=38.129.56.147 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=49415 DF 
PROTO=TCP SPT=64326 DPT=19885 SEQ=1422269289 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A577CDECA000000000103030A) Dec 6 05:15:04 localhost dnsmasq[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/addn_hosts - 1 addresses Dec 6 05:15:04 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/host Dec 6 05:15:04 localhost podman[245254]: 2025-12-06 10:15:04.18872639 +0000 UTC m=+0.060904310 container kill 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:15:04 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/opts Dec 6 05:15:04 localhost nova_compute[237281]: 2025-12-06 10:15:04.432 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:04 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:04.445 219384 INFO neutron.agent.dhcp.agent [None req-05ae3f14-5ef8-4685-9bd2-7c5a66593ac1 - - - - - -] DHCP configuration for ports {'7ddd7a97-9ea8-483e-a9e6-e3d44e77f02e'} is completed#033[00m Dec 6 05:15:04 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=167.94.138.204 DST=38.129.56.147 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=14819 DF PROTO=TCP SPT=64336 DPT=19885 SEQ=3900258559 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A577CE2B6000000000103030A) Dec 6 05:15:05 localhost kernel: 
DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=167.94.138.204 DST=38.129.56.147 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=14820 DF PROTO=TCP SPT=64336 DPT=19885 SEQ=3900258559 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A577CE6B9000000000103030A) Dec 6 05:15:05 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:05.983 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:03Z, description=, device_id=6acc443d-d6f0-4d6d-ae69-6473d2e6b47f, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7ddd7a97-9ea8-483e-a9e6-e3d44e77f02e, ip_allocation=immediate, mac_address=fa:16:3e:c9:57:1f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:54Z, description=, dns_domain=, id=9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-1133610127-network, port_security_enabled=True, project_id=d4ffab9068e64ee89c49785c5f76ecd3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39737, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=215, status=ACTIVE, subnets=['4a44c615-5dac-4975-8269-16f9f546494f'], tags=[], tenant_id=d4ffab9068e64ee89c49785c5f76ecd3, updated_at=2025-12-06T10:14:56Z, vlan_transparent=None, network_id=9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, port_security_enabled=False, project_id=d4ffab9068e64ee89c49785c5f76ecd3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=258, status=DOWN, tags=[], 
tenant_id=d4ffab9068e64ee89c49785c5f76ecd3, updated_at=2025-12-06T10:15:03Z on network 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9#033[00m Dec 6 05:15:06 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=167.94.138.204 DST=38.129.56.147 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=36508 DF PROTO=TCP SPT=64366 DPT=19885 SEQ=3923594642 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A577CE70F000000000103030A) Dec 6 05:15:06 localhost systemd[1]: tmp-crun.Pzy4GG.mount: Deactivated successfully. Dec 6 05:15:06 localhost dnsmasq[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/addn_hosts - 1 addresses Dec 6 05:15:06 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/host Dec 6 05:15:06 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/opts Dec 6 05:15:06 localhost podman[245292]: 2025-12-06 10:15:06.236208495 +0000 UTC m=+0.073560050 container kill 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 6 05:15:06 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:06.512 219384 INFO neutron.agent.dhcp.agent [None req-8d0303e2-bef5-42fe-86ce-ff0c6117ac17 - - - - - -] DHCP configuration for ports {'7ddd7a97-9ea8-483e-a9e6-e3d44e77f02e'} is completed#033[00m Dec 6 05:15:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:06.698 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" 
by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:15:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:06.699 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:15:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:06.700 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:15:07 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=167.94.138.204 DST=38.129.56.147 LEN=60 TOS=0x08 PREC=0x40 TTL=52 ID=36509 DF PROTO=TCP SPT=64366 DPT=19885 SEQ=3923594642 ACK=0 WINDOW=21900 RES=0x00 SYN URGP=0 OPT (020405B40402080A577CEAF9000000000103030A) Dec 6 05:15:07 localhost sshd[245312]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:15:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27584 DF PROTO=TCP SPT=57222 DPT=9102 SEQ=3674109737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD6C1870000000001030307) Dec 6 05:15:09 localhost nova_compute[237281]: 2025-12-06 10:15:09.435 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:15:09 localhost podman[245314]: 2025-12-06 10:15:09.553895268 +0000 UTC m=+0.084965823 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Dec 6 05:15:09 localhost podman[245314]: 2025-12-06 10:15:09.571238353 +0000 UTC m=+0.102308868 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=) Dec 6 05:15:09 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:15:10 localhost nova_compute[237281]: 2025-12-06 10:15:10.458 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:15:11 localhost podman[245335]: 2025-12-06 10:15:11.557434508 +0000 UTC m=+0.093999432 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:15:11 localhost podman[245335]: 2025-12-06 10:15:11.570128409 +0000 UTC m=+0.106693293 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:15:11 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:15:12 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:12.899 219384 INFO neutron.agent.linux.ip_lib [None req-135005e9-4fbf-4414-8c3b-e73ef5c699b4 - - - - - -] Device tapc4f64f5d-e8 cannot be used as it has no MAC address#033[00m Dec 6 05:15:12 localhost nova_compute[237281]: 2025-12-06 10:15:12.923 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:12 localhost kernel: device tapc4f64f5d-e8 entered promiscuous mode Dec 6 05:15:12 localhost nova_compute[237281]: 2025-12-06 10:15:12.937 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:12 localhost ovn_controller[131684]: 2025-12-06T10:15:12Z|00081|binding|INFO|Claiming lport c4f64f5d-e8fe-4f9a-8027-e9ea7228536e for this chassis. Dec 6 05:15:12 localhost ovn_controller[131684]: 2025-12-06T10:15:12Z|00082|binding|INFO|c4f64f5d-e8fe-4f9a-8027-e9ea7228536e: Claiming unknown Dec 6 05:15:12 localhost NetworkManager[5965]: [1765016112.9399] manager: (tapc4f64f5d-e8): new Generic device (/org/freedesktop/NetworkManager/Devices/19) Dec 6 05:15:12 localhost systemd-udevd[245368]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:15:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:12.956 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-9e292c11-f020-42d8-9e52-a14ca36d70ab', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e292c11-f020-42d8-9e52-a14ca36d70ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f32feb968b74693a394964324f981bf', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=21e98fa4-639b-44b6-8829-27eb9ee109d4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c4f64f5d-e8fe-4f9a-8027-e9ea7228536e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:15:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:12.958 137259 INFO neutron.agent.ovn.metadata.agent [-] Port c4f64f5d-e8fe-4f9a-8027-e9ea7228536e in datapath 9e292c11-f020-42d8-9e52-a14ca36d70ab bound to our chassis#033[00m Dec 6 05:15:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:12.960 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9e292c11-f020-42d8-9e52-a14ca36d70ab or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:15:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:12.961 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[7da1c533-13db-4dde-8ce8-e22e2bd0a39c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:12 localhost journal[186952]: ethtool ioctl error on tapc4f64f5d-e8: No such device Dec 6 05:15:12 localhost journal[186952]: ethtool ioctl error on tapc4f64f5d-e8: No such device Dec 6 05:15:12 localhost ovn_controller[131684]: 2025-12-06T10:15:12Z|00083|binding|INFO|Setting lport c4f64f5d-e8fe-4f9a-8027-e9ea7228536e ovn-installed in OVS Dec 6 05:15:12 localhost ovn_controller[131684]: 2025-12-06T10:15:12Z|00084|binding|INFO|Setting lport c4f64f5d-e8fe-4f9a-8027-e9ea7228536e up in Southbound Dec 6 05:15:12 localhost nova_compute[237281]: 2025-12-06 10:15:12.971 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:12 localhost nova_compute[237281]: 2025-12-06 10:15:12.972 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:12 localhost journal[186952]: ethtool ioctl error on tapc4f64f5d-e8: No such device Dec 6 05:15:12 localhost journal[186952]: ethtool ioctl error on tapc4f64f5d-e8: No such device Dec 6 05:15:12 localhost journal[186952]: ethtool ioctl error on tapc4f64f5d-e8: No such device Dec 6 05:15:12 localhost journal[186952]: ethtool ioctl error on tapc4f64f5d-e8: No such device Dec 6 05:15:12 localhost journal[186952]: ethtool ioctl error on tapc4f64f5d-e8: No such device Dec 6 05:15:12 localhost journal[186952]: ethtool ioctl error on tapc4f64f5d-e8: No such device Dec 6 05:15:13 localhost nova_compute[237281]: 2025-12-06 10:15:13.012 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:13 localhost nova_compute[237281]: 2025-12-06 10:15:13.037 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:13 localhost podman[245439]: Dec 6 05:15:13 localhost podman[245439]: 2025-12-06 10:15:13.869110093 +0000 UTC m=+0.091401681 container create e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e292c11-f020-42d8-9e52-a14ca36d70ab, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 6 05:15:13 localhost systemd[1]: Started libpod-conmon-e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e.scope. Dec 6 05:15:13 localhost podman[245439]: 2025-12-06 10:15:13.825170357 +0000 UTC m=+0.047462025 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:15:13 localhost systemd[1]: Started libcrun container. 
Dec 6 05:15:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43cf96b82e564114a6015e338ea9afa82422eaed58aefc2b2331a1d87142e145/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:15:13 localhost podman[245439]: 2025-12-06 10:15:13.94876001 +0000 UTC m=+0.171051588 container init e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e292c11-f020-42d8-9e52-a14ca36d70ab, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 6 05:15:13 localhost podman[245439]: 2025-12-06 10:15:13.954400495 +0000 UTC m=+0.176692073 container start e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e292c11-f020-42d8-9e52-a14ca36d70ab, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Dec 6 05:15:13 localhost dnsmasq[245457]: started, version 2.85 cachesize 150 Dec 6 05:15:13 localhost dnsmasq[245457]: DNS service limited to local subnets Dec 6 05:15:13 localhost dnsmasq[245457]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:15:13 localhost dnsmasq[245457]: warning: no upstream servers configured Dec 
6 05:15:13 localhost dnsmasq-dhcp[245457]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:15:13 localhost dnsmasq[245457]: read /var/lib/neutron/dhcp/9e292c11-f020-42d8-9e52-a14ca36d70ab/addn_hosts - 0 addresses Dec 6 05:15:13 localhost dnsmasq-dhcp[245457]: read /var/lib/neutron/dhcp/9e292c11-f020-42d8-9e52-a14ca36d70ab/host Dec 6 05:15:13 localhost dnsmasq-dhcp[245457]: read /var/lib/neutron/dhcp/9e292c11-f020-42d8-9e52-a14ca36d70ab/opts Dec 6 05:15:14 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:14.144 219384 INFO neutron.agent.dhcp.agent [None req-ed30019c-dd57-4e96-b883-75e87e4a9a20 - - - - - -] DHCP configuration for ports {'9e0536f3-86e8-49c0-9c2f-050b47a8e620'} is completed#033[00m Dec 6 05:15:14 localhost nova_compute[237281]: 2025-12-06 10:15:14.443 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:15 localhost nova_compute[237281]: 2025-12-06 10:15:15.581 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:16 localhost openstack_network_exporter[199751]: ERROR 10:15:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:15:16 localhost openstack_network_exporter[199751]: ERROR 10:15:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:15:16 localhost openstack_network_exporter[199751]: ERROR 10:15:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:15:16 localhost openstack_network_exporter[199751]: ERROR 10:15:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:15:16 localhost openstack_network_exporter[199751]: Dec 6 05:15:16 localhost openstack_network_exporter[199751]: ERROR 10:15:16 appctl.go:174: 
call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:15:16 localhost openstack_network_exporter[199751]: Dec 6 05:15:17 localhost nova_compute[237281]: 2025-12-06 10:15:17.073 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:18 localhost neutron_sriov_agent[212548]: 2025-12-06 10:15:18.149 2 INFO neutron.agent.securitygroups_rpc [None req-b73fa1d0-170b-440c-b77e-b62b5460078d 383605419be14147bc2ebf82a90ac1b9 caceedbf61904a5eaba72910f7a24db1 - - default default] Security group member updated ['eb2751e0-e02b-43b7-b38b-b824ce1a45d2']#033[00m Dec 6 05:15:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:18.487 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:18Z, description=, device_id=dcec9cd0-f1dd-413b-8b63-4e541e0b290c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=eb2fbf01-5339-42cf-8b38-7debd967c876, ip_allocation=immediate, mac_address=fa:16:3e:1d:97:4f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:15:10Z, description=, dns_domain=, id=9e292c11-f020-42d8-9e52-a14ca36d70ab, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-1067330353-network, port_security_enabled=True, project_id=2f32feb968b74693a394964324f981bf, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3424, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=286, status=ACTIVE, subnets=['9e2e59a8-2d3f-4f41-857b-a3972f102c72'], tags=[], 
tenant_id=2f32feb968b74693a394964324f981bf, updated_at=2025-12-06T10:15:11Z, vlan_transparent=None, network_id=9e292c11-f020-42d8-9e52-a14ca36d70ab, port_security_enabled=False, project_id=2f32feb968b74693a394964324f981bf, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=335, status=DOWN, tags=[], tenant_id=2f32feb968b74693a394964324f981bf, updated_at=2025-12-06T10:15:18Z on network 9e292c11-f020-42d8-9e52-a14ca36d70ab#033[00m Dec 6 05:15:18 localhost podman[245476]: 2025-12-06 10:15:18.713566156 +0000 UTC m=+0.063227551 container kill e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e292c11-f020-42d8-9e52-a14ca36d70ab, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:15:18 localhost dnsmasq[245457]: read /var/lib/neutron/dhcp/9e292c11-f020-42d8-9e52-a14ca36d70ab/addn_hosts - 1 addresses Dec 6 05:15:18 localhost dnsmasq-dhcp[245457]: read /var/lib/neutron/dhcp/9e292c11-f020-42d8-9e52-a14ca36d70ab/host Dec 6 05:15:18 localhost dnsmasq-dhcp[245457]: read /var/lib/neutron/dhcp/9e292c11-f020-42d8-9e52-a14ca36d70ab/opts Dec 6 05:15:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:18.949 219384 INFO neutron.agent.dhcp.agent [None req-681078ef-add4-4417-9d59-c91b617bddbe - - - - - -] DHCP configuration for ports {'eb2fbf01-5339-42cf-8b38-7debd967c876'} is completed#033[00m Dec 6 05:15:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:19.422 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], 
binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:18Z, description=, device_id=dcec9cd0-f1dd-413b-8b63-4e541e0b290c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=eb2fbf01-5339-42cf-8b38-7debd967c876, ip_allocation=immediate, mac_address=fa:16:3e:1d:97:4f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:15:10Z, description=, dns_domain=, id=9e292c11-f020-42d8-9e52-a14ca36d70ab, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-1067330353-network, port_security_enabled=True, project_id=2f32feb968b74693a394964324f981bf, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3424, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=286, status=ACTIVE, subnets=['9e2e59a8-2d3f-4f41-857b-a3972f102c72'], tags=[], tenant_id=2f32feb968b74693a394964324f981bf, updated_at=2025-12-06T10:15:11Z, vlan_transparent=None, network_id=9e292c11-f020-42d8-9e52-a14ca36d70ab, port_security_enabled=False, project_id=2f32feb968b74693a394964324f981bf, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=335, status=DOWN, tags=[], tenant_id=2f32feb968b74693a394964324f981bf, updated_at=2025-12-06T10:15:18Z on network 9e292c11-f020-42d8-9e52-a14ca36d70ab#033[00m Dec 6 05:15:19 localhost nova_compute[237281]: 2025-12-06 10:15:19.523 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:19 localhost dnsmasq[245457]: read /var/lib/neutron/dhcp/9e292c11-f020-42d8-9e52-a14ca36d70ab/addn_hosts - 1 addresses Dec 6 05:15:19 localhost dnsmasq-dhcp[245457]: read 
/var/lib/neutron/dhcp/9e292c11-f020-42d8-9e52-a14ca36d70ab/host Dec 6 05:15:19 localhost dnsmasq-dhcp[245457]: read /var/lib/neutron/dhcp/9e292c11-f020-42d8-9e52-a14ca36d70ab/opts Dec 6 05:15:19 localhost podman[245515]: 2025-12-06 10:15:19.630785004 +0000 UTC m=+0.063147170 container kill e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e292c11-f020-42d8-9e52-a14ca36d70ab, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:15:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:19.918 219384 INFO neutron.agent.dhcp.agent [None req-5c542bba-fdec-48ec-aaf6-3e5671c51d06 - - - - - -] DHCP configuration for ports {'eb2fbf01-5339-42cf-8b38-7debd967c876'} is completed#033[00m Dec 6 05:15:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:15:22 localhost podman[245535]: 2025-12-06 10:15:22.533377951 +0000 UTC m=+0.072904910 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2) Dec 6 05:15:22 localhost podman[245535]: 2025-12-06 10:15:22.572065085 +0000 UTC m=+0.111592034 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:15:22 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:15:23 localhost podman[197801]: time="2025-12-06T10:15:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:15:23 localhost podman[197801]: @ - - [06/Dec/2025:10:15:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147798 "" "Go-http-client/1.1" Dec 6 05:15:23 localhost podman[197801]: @ - - [06/Dec/2025:10:15:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16900 "" "Go-http-client/1.1" Dec 6 05:15:23 localhost neutron_sriov_agent[212548]: 2025-12-06 10:15:23.701 2 INFO neutron.agent.securitygroups_rpc [None req-6d621e09-b913-4cc0-b82f-8da13f9c1f0d 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Security group member updated ['5d02d56e-c6b2-47de-b5f5-53b4d437a7c2']#033[00m Dec 6 05:15:23 localhost neutron_sriov_agent[212548]: 2025-12-06 10:15:23.732 2 INFO neutron.agent.securitygroups_rpc [None req-2d17dc4c-6035-49cd-80ea-e69ff552b523 383605419be14147bc2ebf82a90ac1b9 caceedbf61904a5eaba72910f7a24db1 - - default default] Security group member updated ['eb2751e0-e02b-43b7-b38b-b824ce1a45d2']#033[00m Dec 6 05:15:23 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:23.739 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:23Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=51683506-5f1b-40d6-b898-12a44611e923, ip_allocation=immediate, mac_address=fa:16:3e:4d:48:58, name=tempest-parent-745716054, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:54Z, description=, dns_domain=, 
id=9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-1133610127-network, port_security_enabled=True, project_id=d4ffab9068e64ee89c49785c5f76ecd3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39737, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=215, status=ACTIVE, subnets=['4a44c615-5dac-4975-8269-16f9f546494f'], tags=[], tenant_id=d4ffab9068e64ee89c49785c5f76ecd3, updated_at=2025-12-06T10:14:56Z, vlan_transparent=None, network_id=9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, port_security_enabled=True, project_id=d4ffab9068e64ee89c49785c5f76ecd3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5d02d56e-c6b2-47de-b5f5-53b4d437a7c2'], standard_attr_id=367, status=DOWN, tags=[], tenant_id=d4ffab9068e64ee89c49785c5f76ecd3, updated_at=2025-12-06T10:15:23Z on network 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9#033[00m Dec 6 05:15:23 localhost dnsmasq[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/addn_hosts - 2 addresses Dec 6 05:15:23 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/host Dec 6 05:15:23 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/opts Dec 6 05:15:23 localhost podman[245575]: 2025-12-06 10:15:23.967790893 +0000 UTC m=+0.061459327 container kill 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:15:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59678 DF PROTO=TCP SPT=58538 DPT=9102 SEQ=160817535 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD6FA690000000001030307) Dec 6 05:15:24 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:24.234 219384 INFO neutron.agent.dhcp.agent [None req-c3f1d6a8-0b45-4883-abbc-3ecaca616200 - - - - - -] DHCP configuration for ports {'51683506-5f1b-40d6-b898-12a44611e923'} is completed#033[00m Dec 6 05:15:24 localhost nova_compute[237281]: 2025-12-06 10:15:24.524 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:15:24 localhost nova_compute[237281]: 2025-12-06 10:15:24.526 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:15:24 localhost nova_compute[237281]: 2025-12-06 10:15:24.527 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 6 05:15:24 localhost nova_compute[237281]: 2025-12-06 10:15:24.527 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:15:24 localhost nova_compute[237281]: 2025-12-06 10:15:24.562 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:24 localhost nova_compute[237281]: 2025-12-06 10:15:24.563 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 6 05:15:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59679 DF PROTO=TCP SPT=58538 DPT=9102 SEQ=160817535 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD6FE870000000001030307) Dec 6 05:15:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:15:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:15:25 localhost podman[245595]: 2025-12-06 10:15:25.559658453 +0000 UTC m=+0.088725557 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , 
managed_by=edpm_ansible) Dec 6 05:15:25 localhost podman[245596]: 2025-12-06 10:15:25.59098847 +0000 UTC m=+0.116748783 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:15:25 localhost podman[245595]: 2025-12-06 10:15:25.594710145 +0000 UTC m=+0.123777189 container exec_died 
4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:15:25 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:15:25 localhost podman[245596]: 2025-12-06 10:15:25.625197016 +0000 UTC m=+0.150957359 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 05:15:25 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:15:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27585 DF PROTO=TCP SPT=57222 DPT=9102 SEQ=3674109737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD701870000000001030307) Dec 6 05:15:26 localhost nova_compute[237281]: 2025-12-06 10:15:26.436 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59680 DF PROTO=TCP SPT=58538 DPT=9102 SEQ=160817535 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD706870000000001030307) Dec 6 05:15:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60096 DF PROTO=TCP SPT=47542 DPT=9102 SEQ=271983354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD709870000000001030307) Dec 6 05:15:28 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:28.058 219384 INFO neutron.agent.linux.ip_lib [None req-45685f14-82c3-4840-ad5b-72830acfe2cf - - - - - -] Device tapae282abb-34 cannot be used as it has no MAC address#033[00m Dec 6 05:15:28 localhost nova_compute[237281]: 2025-12-06 10:15:28.076 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:28 localhost kernel: device tapae282abb-34 entered promiscuous mode Dec 6 05:15:28 localhost nova_compute[237281]: 2025-12-06 10:15:28.083 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:28 localhost 
NetworkManager[5965]: [1765016128.0844] manager: (tapae282abb-34): new Generic device (/org/freedesktop/NetworkManager/Devices/20) Dec 6 05:15:28 localhost ovn_controller[131684]: 2025-12-06T10:15:28Z|00085|binding|INFO|Claiming lport ae282abb-349f-41f0-995b-0dbe9188b00f for this chassis. Dec 6 05:15:28 localhost ovn_controller[131684]: 2025-12-06T10:15:28Z|00086|binding|INFO|ae282abb-349f-41f0-995b-0dbe9188b00f: Claiming unknown Dec 6 05:15:28 localhost systemd-udevd[245649]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:15:28 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:28.101 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-ea5020da-f370-47b0-b6c4-a1f36329a7ad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea5020da-f370-47b0-b6c4-a1f36329a7ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4ffab9068e64ee89c49785c5f76ecd3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39fd89f7-b191-45fe-9a9a-73b9abccf180, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ae282abb-349f-41f0-995b-0dbe9188b00f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:15:28 localhost 
ovn_metadata_agent[137254]: 2025-12-06 10:15:28.103 137259 INFO neutron.agent.ovn.metadata.agent [-] Port ae282abb-349f-41f0-995b-0dbe9188b00f in datapath ea5020da-f370-47b0-b6c4-a1f36329a7ad bound to our chassis#033[00m Dec 6 05:15:28 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:28.105 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ea5020da-f370-47b0-b6c4-a1f36329a7ad or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:15:28 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:28.106 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[e3d45321-f7c6-4d1b-84f0-54f1cb0c9585]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:28 localhost journal[186952]: ethtool ioctl error on tapae282abb-34: No such device Dec 6 05:15:28 localhost journal[186952]: ethtool ioctl error on tapae282abb-34: No such device Dec 6 05:15:28 localhost ovn_controller[131684]: 2025-12-06T10:15:28Z|00087|binding|INFO|Setting lport ae282abb-349f-41f0-995b-0dbe9188b00f ovn-installed in OVS Dec 6 05:15:28 localhost ovn_controller[131684]: 2025-12-06T10:15:28Z|00088|binding|INFO|Setting lport ae282abb-349f-41f0-995b-0dbe9188b00f up in Southbound Dec 6 05:15:28 localhost nova_compute[237281]: 2025-12-06 10:15:28.116 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:28 localhost journal[186952]: ethtool ioctl error on tapae282abb-34: No such device Dec 6 05:15:28 localhost journal[186952]: ethtool ioctl error on tapae282abb-34: No such device Dec 6 05:15:28 localhost journal[186952]: ethtool ioctl error on tapae282abb-34: No such device Dec 6 05:15:28 localhost journal[186952]: ethtool ioctl error on tapae282abb-34: No such device Dec 6 05:15:28 localhost 
journal[186952]: ethtool ioctl error on tapae282abb-34: No such device Dec 6 05:15:28 localhost journal[186952]: ethtool ioctl error on tapae282abb-34: No such device Dec 6 05:15:28 localhost nova_compute[237281]: 2025-12-06 10:15:28.148 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:28 localhost nova_compute[237281]: 2025-12-06 10:15:28.176 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:29 localhost podman[245721]: Dec 6 05:15:29 localhost podman[245721]: 2025-12-06 10:15:29.031277964 +0000 UTC m=+0.092154683 container create dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ea5020da-f370-47b0-b6c4-a1f36329a7ad, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true) Dec 6 05:15:29 localhost systemd[1]: Started libpod-conmon-dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f.scope. Dec 6 05:15:29 localhost podman[245721]: 2025-12-06 10:15:28.987953188 +0000 UTC m=+0.048829967 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:15:29 localhost systemd[1]: Started libcrun container. 
Dec 6 05:15:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d12c313be28d53f2adece6783f26a7003d69099d210059d75e16e9fa0a9f1bc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:15:29 localhost podman[245721]: 2025-12-06 10:15:29.111444038 +0000 UTC m=+0.172320797 container init dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ea5020da-f370-47b0-b6c4-a1f36329a7ad, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125) Dec 6 05:15:29 localhost podman[245721]: 2025-12-06 10:15:29.120011342 +0000 UTC m=+0.180888091 container start dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ea5020da-f370-47b0-b6c4-a1f36329a7ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:15:29 localhost dnsmasq[245739]: started, version 2.85 cachesize 150 Dec 6 05:15:29 localhost dnsmasq[245739]: DNS service limited to local subnets Dec 6 05:15:29 localhost dnsmasq[245739]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:15:29 localhost dnsmasq[245739]: warning: no upstream servers configured Dec 
6 05:15:29 localhost dnsmasq-dhcp[245739]: DHCP, static leases only on 19.80.0.0, lease time 1d Dec 6 05:15:29 localhost dnsmasq[245739]: read /var/lib/neutron/dhcp/ea5020da-f370-47b0-b6c4-a1f36329a7ad/addn_hosts - 0 addresses Dec 6 05:15:29 localhost dnsmasq-dhcp[245739]: read /var/lib/neutron/dhcp/ea5020da-f370-47b0-b6c4-a1f36329a7ad/host Dec 6 05:15:29 localhost dnsmasq-dhcp[245739]: read /var/lib/neutron/dhcp/ea5020da-f370-47b0-b6c4-a1f36329a7ad/opts Dec 6 05:15:29 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:29.325 219384 INFO neutron.agent.dhcp.agent [None req-156ee92c-d817-48e8-bcf1-27eef7ec593f - - - - - -] DHCP configuration for ports {'70c45e60-e7ea-4bdb-926b-73d2cccf5054'} is completed#033[00m Dec 6 05:15:29 localhost nova_compute[237281]: 2025-12-06 10:15:29.564 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:30 localhost neutron_sriov_agent[212548]: 2025-12-06 10:15:30.076 2 INFO neutron.agent.securitygroups_rpc [None req-e6db8502-a86b-4fc8-9212-b4d38685d9bc 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Security group member updated ['5d02d56e-c6b2-47de-b5f5-53b4d437a7c2']#033[00m Dec 6 05:15:30 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:30.491 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:29Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=306c6369-38db-4281-af43-1db942f1ab64, ip_allocation=immediate, mac_address=fa:16:3e:6b:db:20, name=tempest-subport-1468155593, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:15:24Z, description=, dns_domain=, 
id=ea5020da-f370-47b0-b6c4-a1f36329a7ad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-672914903, port_security_enabled=True, project_id=d4ffab9068e64ee89c49785c5f76ecd3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=5432, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=376, status=ACTIVE, subnets=['d9a2b2fa-deaa-49f5-abfa-32103cdfd1c7'], tags=[], tenant_id=d4ffab9068e64ee89c49785c5f76ecd3, updated_at=2025-12-06T10:15:26Z, vlan_transparent=None, network_id=ea5020da-f370-47b0-b6c4-a1f36329a7ad, port_security_enabled=True, project_id=d4ffab9068e64ee89c49785c5f76ecd3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5d02d56e-c6b2-47de-b5f5-53b4d437a7c2'], standard_attr_id=393, status=DOWN, tags=[], tenant_id=d4ffab9068e64ee89c49785c5f76ecd3, updated_at=2025-12-06T10:15:29Z on network ea5020da-f370-47b0-b6c4-a1f36329a7ad#033[00m Dec 6 05:15:30 localhost dnsmasq[245739]: read /var/lib/neutron/dhcp/ea5020da-f370-47b0-b6c4-a1f36329a7ad/addn_hosts - 1 addresses Dec 6 05:15:30 localhost dnsmasq-dhcp[245739]: read /var/lib/neutron/dhcp/ea5020da-f370-47b0-b6c4-a1f36329a7ad/host Dec 6 05:15:30 localhost podman[245756]: 2025-12-06 10:15:30.718187016 +0000 UTC m=+0.058765624 container kill dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ea5020da-f370-47b0-b6c4-a1f36329a7ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:15:30 localhost dnsmasq-dhcp[245739]: read 
/var/lib/neutron/dhcp/ea5020da-f370-47b0-b6c4-a1f36329a7ad/opts Dec 6 05:15:30 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:30.995 219384 INFO neutron.agent.dhcp.agent [None req-d6b8947b-38f9-4dec-827d-f2410c77a666 - - - - - -] DHCP configuration for ports {'306c6369-38db-4281-af43-1db942f1ab64'} is completed#033[00m Dec 6 05:15:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59681 DF PROTO=TCP SPT=58538 DPT=9102 SEQ=160817535 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD716470000000001030307) Dec 6 05:15:31 localhost nova_compute[237281]: 2025-12-06 10:15:31.593 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:15:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:15:32 localhost systemd[1]: tmp-crun.XtvXeV.mount: Deactivated successfully. 
Dec 6 05:15:32 localhost podman[245777]: 2025-12-06 10:15:32.61976335 +0000 UTC m=+0.083953441 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125) Dec 6 05:15:32 localhost podman[245778]: 2025-12-06 10:15:32.681568077 +0000 UTC m=+0.142091424 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 6 05:15:32 localhost podman[245778]: 2025-12-06 10:15:32.697259831 +0000 UTC m=+0.157783178 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Dec 6 05:15:32 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:15:32 localhost podman[245777]: 2025-12-06 10:15:32.749668968 +0000 UTC m=+0.213859059 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 05:15:32 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:15:34 localhost nova_compute[237281]: 2025-12-06 10:15:34.600 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:39 localhost nova_compute[237281]: 2025-12-06 10:15:39.603 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:15:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59682 DF PROTO=TCP SPT=58538 DPT=9102 SEQ=160817535 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD737880000000001030307) Dec 6 05:15:39 localhost nova_compute[237281]: 2025-12-06 10:15:39.859 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:39 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:39.861 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', 
conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:15:39 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:39.862 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:15:39 localhost nova_compute[237281]: 2025-12-06 10:15:39.974 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:15:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:15:40 localhost podman[245815]: 2025-12-06 10:15:40.548525086 +0000 UTC m=+0.083850237 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41) Dec 6 05:15:40 localhost podman[245815]: 2025-12-06 10:15:40.559116642 +0000 UTC m=+0.094441883 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, managed_by=edpm_ansible) Dec 6 05:15:40 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.088 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Acquiring lock "83f60b2f-0c03-4557-a484-b92761cfecce" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.089 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.110 237285 DEBUG nova.compute.manager [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Starting instance... 
_do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.186 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.187 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.192 237285 DEBUG nova.virt.hardware [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.193 237285 INFO nova.compute.claims [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Claim successful on node np0005548798.ooo.test
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.345 237285 DEBUG nova.compute.provider_tree [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.365 237285 DEBUG nova.scheduler.client.report [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.395 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.396 237285 DEBUG nova.compute.manager [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.472 237285 DEBUG nova.compute.manager [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.473 237285 DEBUG nova.network.neutron [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.505 237285 INFO nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.526 237285 DEBUG nova.compute.manager [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.623 237285 DEBUG nova.compute.manager [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.625 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.625 237285 INFO nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Creating image(s)
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.626 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Acquiring lock "/var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.627 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "/var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.628 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "/var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.628 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Acquiring lock "7ed36996b83444bfa83969c1e5caf9794500f5d3" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 6 05:15:41 localhost nova_compute[237281]: 2025-12-06 10:15:41.629 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "7ed36996b83444bfa83969c1e5caf9794500f5d3" acquired by
"nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:15:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:41.864 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:15:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:15:42 localhost podman[245835]: 2025-12-06 10:15:42.55476201 +0000 UTC m=+0.087261303 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 
'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:15:42 localhost podman[245835]: 2025-12-06 10:15:42.561514538 +0000 UTC m=+0.094013861 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', 
'/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:15:42 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:15:42 localhost nova_compute[237281]: 2025-12-06 10:15:42.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:15:42 localhost nova_compute[237281]: 2025-12-06 10:15:42.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:15:43 localhost nova_compute[237281]: 2025-12-06 10:15:43.416 237285 WARNING oslo_policy.policy [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m Dec 6 05:15:43 localhost nova_compute[237281]: 2025-12-06 10:15:43.416 237285 WARNING oslo_policy.policy [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] JSON formatted policy_file support is deprecated since Victoria release. 
You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m Dec 6 05:15:43 localhost nova_compute[237281]: 2025-12-06 10:15:43.420 237285 DEBUG nova.policy [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21240df215a544878bd6d4c5ec47594a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd4ffab9068e64ee89c49785c5f76ecd3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m Dec 6 05:15:43 localhost nova_compute[237281]: 2025-12-06 10:15:43.835 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:15:43 localhost nova_compute[237281]: 2025-12-06 10:15:43.908 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3.part --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:15:43 localhost nova_compute[237281]: 2025-12-06 10:15:43.910 237285 DEBUG nova.virt.images [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] 8eeec8d4-c6be-4c95-9cb2-1a047e96c028 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m Dec 6 05:15:43 localhost nova_compute[237281]: 2025-12-06 10:15:43.911 237285 DEBUG nova.privsep.utils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Dec 6 05:15:43 localhost nova_compute[237281]: 2025-12-06 10:15:43.912 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3.part /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.096 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3.part 
/var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3.converted" returned: 0 in 0.184s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.101 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.175 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3.converted --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.177 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "7ed36996b83444bfa83969c1e5caf9794500f5d3" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 2.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.200 237285 INFO oslo.privsep.daemon [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp5i9j_2if/privsep.sock']
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.606 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.814 237285 INFO oslo.privsep.daemon [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Spawned new privsep daemon via rootwrap
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.712 245875 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.718 245875 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.722 245875 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.722 245875 INFO oslo.privsep.daemon [-] privsep daemon running as pid 245875
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.892 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 6 05:15:44 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:44.953 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005548798.ooo.test, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:23Z, description=, device_id=83f60b2f-0c03-4557-a484-b92761cfecce, device_owner=compute:nova, dns_assignment=[], dns_domain=, dns_name=tempest-liveautoblockmigrationv225test-server-1908771692, extra_dhcp_opts=[], fixed_ips=[], id=51683506-5f1b-40d6-b898-12a44611e923, ip_allocation=immediate, mac_address=fa:16:3e:4d:48:58, name=tempest-parent-745716054, network_id=9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, port_security_enabled=True, project_id=d4ffab9068e64ee89c49785c5f76ecd3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['5d02d56e-c6b2-47de-b5f5-53b4d437a7c2'], standard_attr_id=367, status=DOWN, tags=[], tenant_id=d4ffab9068e64ee89c49785c5f76ecd3, trunk_details=sub_ports=[], trunk_id=dc442437-9f4f-446d-b004-588e5c07d3e0, updated_at=2025-12-06T10:15:43Z on network 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.966 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.967 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Acquiring lock "7ed36996b83444bfa83969c1e5caf9794500f5d3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.968 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "7ed36996b83444bfa83969c1e5caf9794500f5d3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 6 05:15:44 localhost nova_compute[237281]: 2025-12-06 10:15:44.982 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.037 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.039 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3,backing_fmt=raw /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.072 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3,backing_fmt=raw /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk 1073741824" returned: 0 in 0.033s execute
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.074 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "7ed36996b83444bfa83969c1e5caf9794500f5d3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.075 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.128 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.129 237285 DEBUG nova.virt.disk.api [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Checking if we can resize image /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.130 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 6 05:15:45 localhost systemd[1]: tmp-crun.BafvUR.mount: Deactivated successfully.
Dec 6 05:15:45 localhost dnsmasq[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/addn_hosts - 2 addresses
Dec 6 05:15:45 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/host
Dec 6 05:15:45 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/opts
Dec 6 05:15:45 localhost podman[245905]: 2025-12-06 10:15:45.188195913 +0000 UTC m=+0.071412434 container kill 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.189 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.191 237285 DEBUG nova.virt.disk.api [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Cannot resize image /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.193 237285 DEBUG nova.objects.instance [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lazy-loading 'migration_context' on Instance uuid 83f60b2f-0c03-4557-a484-b92761cfecce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.216 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.217 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Ensure instance console log exists: /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.217 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.217 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.218 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 6 05:15:45 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:15:45.376 219384 INFO neutron.agent.dhcp.agent [None req-ec7c01fd-78b9-4213-b97b-79df702c3c25 - - - - - -] DHCP configuration for ports {'51683506-5f1b-40d6-b898-12a44611e923'} is completed
Dec 6 05:15:45 localhost nova_compute[237281]: 2025-12-06 10:15:45.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 6 05:15:46 localhost nova_compute[237281]: 2025-12-06 10:15:46.005 237285 DEBUG nova.network.neutron [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Successfully updated port: 51683506-5f1b-40d6-b898-12a44611e923 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 6 05:15:46 localhost nova_compute[237281]: 2025-12-06 10:15:46.025 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Acquiring lock "refresh_cache-83f60b2f-0c03-4557-a484-b92761cfecce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 6 05:15:46 localhost nova_compute[237281]: 2025-12-06 10:15:46.025 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Acquired lock "refresh_cache-83f60b2f-0c03-4557-a484-b92761cfecce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 6 05:15:46 localhost nova_compute[237281]: 2025-12-06 10:15:46.026 237285 DEBUG nova.network.neutron [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 6 05:15:46 localhost openstack_network_exporter[199751]: ERROR 10:15:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 6 05:15:46 localhost openstack_network_exporter[199751]: ERROR 10:15:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 6 05:15:46 localhost openstack_network_exporter[199751]: ERROR 10:15:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 6 05:15:46 localhost openstack_network_exporter[199751]: ERROR 10:15:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 6 05:15:46 localhost openstack_network_exporter[199751]:
Dec 6 05:15:46 localhost openstack_network_exporter[199751]: ERROR 10:15:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 6 05:15:46 localhost openstack_network_exporter[199751]:
Dec 6 05:15:46 localhost nova_compute[237281]: 2025-12-06 10:15:46.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 6 05:15:47 localhost nova_compute[237281]: 2025-12-06 10:15:47.118 237285 DEBUG nova.network.neutron [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Instance cache missing network info.
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Dec 6 05:15:47 localhost nova_compute[237281]: 2025-12-06 10:15:47.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:15:47 localhost nova_compute[237281]: 2025-12-06 10:15:47.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:15:47 localhost nova_compute[237281]: 2025-12-06 10:15:47.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:15:47 localhost nova_compute[237281]: 2025-12-06 10:15:47.920 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.016 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.017 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.018 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.018 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.273 237285 DEBUG nova.compute.manager [req-c3b573c1-01e6-4169-a942-d7e15f3fc9e6 req-0166c595-7172-4683-b68b-2006245d4cf0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-changed-51683506-5f1b-40d6-b898-12a44611e923 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.273 237285 DEBUG 
nova.compute.manager [req-c3b573c1-01e6-4169-a942-d7e15f3fc9e6 req-0166c595-7172-4683-b68b-2006245d4cf0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Refreshing instance network info cache due to event network-changed-51683506-5f1b-40d6-b898-12a44611e923. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.274 237285 DEBUG oslo_concurrency.lockutils [req-c3b573c1-01e6-4169-a942-d7e15f3fc9e6 req-0166c595-7172-4683-b68b-2006245d4cf0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "refresh_cache-83f60b2f-0c03-4557-a484-b92761cfecce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.609 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.788 237285 DEBUG nova.network.neutron [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Updating instance_info_cache with network_info: [{"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, 
"tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.827 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Releasing lock "refresh_cache-83f60b2f-0c03-4557-a484-b92761cfecce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.828 237285 DEBUG nova.compute.manager [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Instance network_info: |[{"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.845 237285 DEBUG oslo_concurrency.lockutils [req-c3b573c1-01e6-4169-a942-d7e15f3fc9e6 req-0166c595-7172-4683-b68b-2006245d4cf0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquired lock "refresh_cache-83f60b2f-0c03-4557-a484-b92761cfecce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.846 237285 DEBUG nova.network.neutron [req-c3b573c1-01e6-4169-a942-d7e15f3fc9e6 req-0166c595-7172-4683-b68b-2006245d4cf0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Refreshing network info cache for port 51683506-5f1b-40d6-b898-12a44611e923 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.852 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Start _get_guest_xml network_info=[{"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:13:57Z,direct_url=,disk_format='qcow2',id=8eeec8d4-c6be-4c95-9cb2-1a047e96c028,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='47835b89168945138751a4b216280589',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2025-12-06T10:13:59Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '8eeec8d4-c6be-4c95-9cb2-1a047e96c028'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.862 237285 WARNING nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.866 237285 DEBUG nova.virt.libvirt.host [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Searching host: 'np0005548798.ooo.test' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.867 237285 DEBUG nova.virt.libvirt.host [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.875 237285 DEBUG nova.virt.libvirt.host [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Searching host: 'np0005548798.ooo.test' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.875 237285 DEBUG nova.virt.libvirt.host [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] CPU controller found on host. 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.876 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.876 237285 DEBUG nova.virt.hardware [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-06T10:13:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='de034496-40b7-4669-ab81-19110fbda990',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:13:57Z,direct_url=,disk_format='qcow2',id=8eeec8d4-c6be-4c95-9cb2-1a047e96c028,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='47835b89168945138751a4b216280589',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2025-12-06T10:13:59Z,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.877 237285 DEBUG nova.virt.hardware [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.877 237285 DEBUG nova.virt.hardware [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.878 237285 DEBUG nova.virt.hardware [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.878 237285 DEBUG nova.virt.hardware [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.878 237285 DEBUG nova.virt.hardware [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.879 237285 DEBUG nova.virt.hardware [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.879 237285 DEBUG nova.virt.hardware [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.880 237285 DEBUG nova.virt.hardware [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.880 237285 DEBUG nova.virt.hardware [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.880 237285 DEBUG nova.virt.hardware [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.886 237285 DEBUG nova.virt.libvirt.vif [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:15:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1908771692',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005548798.ooo.test',hostname='tempest-liveautoblockmigrationv225test-server-1908771692',id=8,image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005548798.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4ffab9068e64ee89c49785c5f76ecd3',ramdisk_id='',reservation_id='r-w9ig3u00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1195199977',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1195199977-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:15:41Z,user_data=None,user_id='21240df215a544878bd6d4c5ec47594a',uuid=83f60b2f-0c03-4557-a484-b92761cfecce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') 
vif={"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.887 237285 DEBUG nova.network.os_vif_util [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Converting VIF {"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], 
"meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.888 237285 DEBUG nova.network.os_vif_util [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:48:58,bridge_name='br-int',has_traffic_filtering=True,id=51683506-5f1b-40d6-b898-12a44611e923,network=Network(9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap51683506-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.889 237285 DEBUG nova.objects.instance [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 83f60b2f-0c03-4557-a484-b92761cfecce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.908 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] End _get_guest_xml 
xml= [libvirt guest domain XML elided: markup was stripped during log capture, leaving only text nodes; recoverable values include uuid 83f60b2f-0c03-4557-a484-b92761cfecce, name instance-00000008, display name tempest-LiveAutoBlockMigrationV225Test-server-1908771692, memory 131072, 1 vcpu, os type hvm, sysinfo RDO / OpenStack Compute / 27.5.2-0.20250829104910.6f8decf.el9, project tempest-LiveAutoBlockMigrationV225Test-1195199977, rng backend /dev/urandom] _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Dec 6 05:15:49 localhost 
nova_compute[237281]: 2025-12-06 10:15:49.909 237285 DEBUG nova.compute.manager [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Preparing to wait for external event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.909 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Acquiring lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.910 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.910 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.911 237285 DEBUG 
nova.virt.libvirt.vif [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:15:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1908771692',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005548798.ooo.test',hostname='tempest-liveautoblockmigrationv225test-server-1908771692',id=8,image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005548798.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4ffab9068e64ee89c49785c5f76ecd3',ramdisk_id='',reservation_id='r-w9ig3u00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1195199977',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1195199977-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-1
2-06T10:15:41Z,user_data=None,user_id='21240df215a544878bd6d4c5ec47594a',uuid=83f60b2f-0c03-4557-a484-b92761cfecce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.912 237285 DEBUG nova.network.os_vif_util [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Converting VIF {"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.912 237285 DEBUG nova.network.os_vif_util [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:48:58,bridge_name='br-int',has_traffic_filtering=True,id=51683506-5f1b-40d6-b898-12a44611e923,network=Network(9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap51683506-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.913 237285 DEBUG os_vif [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:48:58,bridge_name='br-int',has_traffic_filtering=True,id=51683506-5f1b-40d6-b898-12a44611e923,network=Network(9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap51683506-5f') plug 
/usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.914 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.914 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.915 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.918 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.919 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51683506-5f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.919 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap51683506-5f, col_values=(('external_ids', {'iface-id': '51683506-5f1b-40d6-b898-12a44611e923', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4d:48:58', 'vm-uuid': '83f60b2f-0c03-4557-a484-b92761cfecce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:15:49 localhost nova_compute[237281]: 2025-12-06 10:15:49.996 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:50 localhost nova_compute[237281]: 2025-12-06 10:15:50.001 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:15:50 localhost nova_compute[237281]: 2025-12-06 10:15:50.004 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:50 localhost nova_compute[237281]: 2025-12-06 10:15:50.005 237285 INFO os_vif [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:48:58,bridge_name='br-int',has_traffic_filtering=True,id=51683506-5f1b-40d6-b898-12a44611e923,network=Network(9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap51683506-5f')#033[00m Dec 6 05:15:50 localhost nova_compute[237281]: 2025-12-06 10:15:50.273 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Dec 6 05:15:50 localhost nova_compute[237281]: 2025-12-06 10:15:50.274 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] No BDM found with device name sda, not building metadata. 
_build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Dec 6 05:15:50 localhost nova_compute[237281]: 2025-12-06 10:15:50.275 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] No VIF found with MAC fa:16:3e:4d:48:58, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m Dec 6 05:15:50 localhost nova_compute[237281]: 2025-12-06 10:15:50.276 237285 INFO nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Using config drive#033[00m Dec 6 05:15:51 localhost nova_compute[237281]: 2025-12-06 10:15:51.570 237285 INFO nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Creating config drive at /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk.config#033[00m Dec 6 05:15:51 localhost nova_compute[237281]: 2025-12-06 10:15:51.577 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8hz7k9ul execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:15:51 localhost nova_compute[237281]: 2025-12-06 10:15:51.711 237285 DEBUG oslo_concurrency.processutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 
21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8hz7k9ul" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:15:51 localhost kernel: device tap51683506-5f entered promiscuous mode Dec 6 05:15:51 localhost NetworkManager[5965]: [1765016151.7844] manager: (tap51683506-5f): new Tun device (/org/freedesktop/NetworkManager/Devices/21) Dec 6 05:15:51 localhost nova_compute[237281]: 2025-12-06 10:15:51.781 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:51 localhost ovn_controller[131684]: 2025-12-06T10:15:51Z|00089|binding|INFO|Claiming lport 51683506-5f1b-40d6-b898-12a44611e923 for this chassis. Dec 6 05:15:51 localhost ovn_controller[131684]: 2025-12-06T10:15:51Z|00090|binding|INFO|51683506-5f1b-40d6-b898-12a44611e923: Claiming fa:16:3e:4d:48:58 10.100.0.13 Dec 6 05:15:51 localhost ovn_controller[131684]: 2025-12-06T10:15:51Z|00091|binding|INFO|Claiming lport 306c6369-38db-4281-af43-1db942f1ab64 for this chassis. Dec 6 05:15:51 localhost ovn_controller[131684]: 2025-12-06T10:15:51Z|00092|binding|INFO|306c6369-38db-4281-af43-1db942f1ab64: Claiming fa:16:3e:6b:db:20 19.80.0.222 Dec 6 05:15:51 localhost systemd-udevd[245948]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:15:51 localhost nova_compute[237281]: 2025-12-06 10:15:51.789 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:51 localhost NetworkManager[5965]: [1765016151.8070] device (tap51683506-5f): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Dec 6 05:15:51 localhost NetworkManager[5965]: [1765016151.8077] device (tap51683506-5f): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.803 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:48:58 10.100.0.13'], port_security=['fa:16:3e:4d:48:58 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-745716054', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '83f60b2f-0c03-4557-a484-b92761cfecce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-745716054', 'neutron:project_id': 'd4ffab9068e64ee89c49785c5f76ecd3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5d02d56e-c6b2-47de-b5f5-53b4d437a7c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28349eaf-1dbf-4bc7-ae61-616696dcb1a3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=51683506-5f1b-40d6-b898-12a44611e923) 
old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.807 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:db:20 19.80.0.222'], port_security=['fa:16:3e:6b:db:20 19.80.0.222'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['51683506-5f1b-40d6-b898-12a44611e923'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1468155593', 'neutron:cidrs': '19.80.0.222/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea5020da-f370-47b0-b6c4-a1f36329a7ad', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1468155593', 'neutron:project_id': 'd4ffab9068e64ee89c49785c5f76ecd3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5d02d56e-c6b2-47de-b5f5-53b4d437a7c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=39fd89f7-b191-45fe-9a9a-73b9abccf180, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=306c6369-38db-4281-af43-1db942f1ab64) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:15:51 localhost ovn_controller[131684]: 2025-12-06T10:15:51Z|00093|binding|INFO|Setting lport 51683506-5f1b-40d6-b898-12a44611e923 ovn-installed in OVS Dec 6 05:15:51 localhost ovn_controller[131684]: 2025-12-06T10:15:51Z|00094|binding|INFO|Setting lport 51683506-5f1b-40d6-b898-12a44611e923 up in Southbound Dec 6 05:15:51 localhost 
ovn_controller[131684]: 2025-12-06T10:15:51Z|00095|binding|INFO|Setting lport 306c6369-38db-4281-af43-1db942f1ab64 up in Southbound Dec 6 05:15:51 localhost nova_compute[237281]: 2025-12-06 10:15:51.812 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.809 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 51683506-5f1b-40d6-b898-12a44611e923 in datapath 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9 bound to our chassis#033[00m Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.814 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port ed911aa4-8580-4197-82cf-abe034f34c56 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.815 137259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9#033[00m Dec 6 05:15:51 localhost systemd-machined[68273]: New machine qemu-3-instance-00000008. 
Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.824 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[15d2ad97-3a4d-4d73-a04d-ae7a17dba25d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.826 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9d3dc0c9-c1 in ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.829 137360 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9d3dc0c9-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.829 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[b51f9b0c-2209-4c88-87e7-68f71eb085a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.831 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[630e3bca-ea03-4b56-93fe-5da4211871b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:51 localhost systemd[1]: Started Virtual Machine qemu-3-instance-00000008. 
Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.847 137391 DEBUG oslo.privsep.daemon [-] privsep: reply[2c5d810a-91bb-4e0f-89dd-4993ecacd766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.861 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[53972e0b-9948-4cb9-8ecf-a9a2c146022d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.891 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[23546338-059c-432e-81e6-80c9ca8c77e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:51 localhost systemd-udevd[245952]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.899 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[414ec2a9-3137-4e1b-a7e5-0bfd26ec6752]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:51 localhost NetworkManager[5965]: [1765016151.9005] manager: (tap9d3dc0c9-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/22) Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.930 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[d9a39707-6998-4958-ae2f-b5436a55ac5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.934 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[63d2e637-8b27-48e6-b5c0-efdc023efe12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:51 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9d3dc0c9-c1: link becomes ready Dec 6 05:15:51 localhost kernel: IPv6: 
ADDRCONF(NETDEV_CHANGE): tap9d3dc0c9-c0: link becomes ready Dec 6 05:15:51 localhost NetworkManager[5965]: [1765016151.9542] device (tap9d3dc0c9-c0): carrier: link connected Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.959 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[95b0c04f-c641-415d-a495-c825f9ee4df0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:51 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:51.981 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[86d8e4d4-9e61-4b1a-8d24-621b0beabe59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9d3dc0c9-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:86:22:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 
'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1253909, 'reachable_time': 19819, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 
'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245984, 'error': None, 'target': 'ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.000 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[b57cae50-bb6e-4617-b924-e4e754afe1c6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe86:22c2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1253909, 'tstamp': 1253909}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245985, 'error': None, 'target': 'ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost 
ovn_metadata_agent[137254]: 2025-12-06 10:15:52.020 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[64fd2ee9-e455-499a-9eef-ae96e6f1b325]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9d3dc0c9-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:86:22:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', 
None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1253909, 'reachable_time': 19819, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 
'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245987, 'error': None, 'target': 'ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.053 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[4398eb8e-de23-49b2-873a-ed734177d7cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.118 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[6608b2bd-6720-42b2-a021-841ae5bf37d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.120 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d3dc0c9-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.121 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.121 137259 DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9d3dc0c9-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.124 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:52 localhost kernel: device tap9d3dc0c9-c0 entered promiscuous mode Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.133 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.133 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9d3dc0c9-c0, col_values=(('external_ids', {'iface-id': '004362b4-c291-41f5-bfb9-f0d2dbd76d4a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:15:52 localhost ovn_controller[131684]: 2025-12-06T10:15:52Z|00096|binding|INFO|Releasing lport 004362b4-c291-41f5-bfb9-f0d2dbd76d4a from this chassis (sb_readonly=0) Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.135 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.146 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.147 137259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9.pid.haproxy; Error: [Errno 2] No such 
file or directory: '/var/lib/neutron/external/pids/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.148 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[e35c59a1-48ab-4467-be78-934cf5dbfb6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.149 137259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: global Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: log /dev/log local0 debug Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: log-tag haproxy-metadata-proxy-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9 Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: user root Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: group root Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: maxconn 1024 Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: pidfile /var/lib/neutron/external/pids/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9.pid.haproxy Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: daemon Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: defaults Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: log global Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: mode http Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: option httplog Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: option dontlognull Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: option http-server-close Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: option forwardfor Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: retries 3 Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: timeout http-request 30s Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: timeout connect 30s Dec 6 
05:15:52 localhost ovn_metadata_agent[137254]: timeout client 32s Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: timeout server 32s Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: timeout http-keep-alive 30s Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: listen listener Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: bind 169.254.169.254:80 Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: server metadata /var/lib/neutron/metadata_proxy Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: http-request add-header X-OVN-Network-ID 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9 Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.151 137259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9', 'env', 'PROCESS_TAG=haproxy-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.302 237285 DEBUG nova.virt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.303 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] VM Started (Lifecycle Event)#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.330 237285 DEBUG nova.compute.manager [None 
req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.336 237285 DEBUG nova.virt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.336 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] VM Paused (Lifecycle Event)#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.360 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.365 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.395 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] During sync_power_state the instance has a pending task (spawning). 
Skip.#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.407 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.442 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.443 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: 
a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.445 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.446 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.476 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.477 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.478 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:15:52 localhost 
nova_compute[237281]: 2025-12-06 10:15:52.478 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:15:52 localhost podman[246027]: Dec 6 05:15:52 localhost podman[246027]: 2025-12-06 10:15:52.60935364 +0000 UTC m=+0.108021694 container create d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:15:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:15:52 localhost systemd[1]: Started libpod-conmon-d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d.scope. Dec 6 05:15:52 localhost podman[246027]: 2025-12-06 10:15:52.552050382 +0000 UTC m=+0.050718496 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 6 05:15:52 localhost systemd[1]: Started libcrun container. 
Dec 6 05:15:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41180d9d2ca10a29d35cd2145c9cc8a84576960f37cc926d13c7e67a91590e5b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:15:52 localhost podman[246027]: 2025-12-06 10:15:52.686594653 +0000 UTC m=+0.185262717 container init d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.690 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:15:52 localhost neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9[246044]: [NOTICE] (246058) : New worker (246064) forked Dec 6 05:15:52 localhost neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9[246044]: [NOTICE] (246058) : Loading success. 
Dec 6 05:15:52 localhost podman[246043]: 2025-12-06 10:15:52.748383619 +0000 UTC m=+0.089163092 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible) Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.763 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C 
qemu-img info /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:15:52 localhost podman[246027]: 2025-12-06 10:15:52.765057234 +0000 UTC m=+0.263725288 container start d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.765 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.815 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 306c6369-38db-4281-af43-1db942f1ab64 in datapath ea5020da-f370-47b0-b6c4-a1f36329a7ad unbound from our chassis#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.821 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 82f6eb90-0ed7-4b33-86de-18f1c6d81c78 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:15:52 localhost 
ovn_metadata_agent[137254]: 2025-12-06 10:15:52.822 137259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ea5020da-f370-47b0-b6c4-a1f36329a7ad#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.831 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[710e4170-29c7-47b5-813a-4d0f9ec5c389]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.832 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapea5020da-f1 in ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.835 137360 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapea5020da-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.835 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[d0b77260-5e68-41c3-86d9-a6c651c535ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.833 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.837 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[b722ab6b-e454-47f2-98a3-3dd032e16ffc]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.843 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.846 137391 DEBUG oslo.privsep.daemon [-] privsep: reply[468d81f7-a588-4560-beed-181ec70177a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost podman[246043]: 2025-12-06 10:15:52.853977816 +0000 UTC m=+0.194757299 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.858 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[b19ba8af-480d-4231-a93f-6c85962f5d74]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.886 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[bb9d8746-3cf1-4bd6-ba3e-52d8803daecc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.892 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[77f5c52b-89ea-49d1-8b03-baaa06d2eb9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost systemd-udevd[245971]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:15:52 localhost NetworkManager[5965]: [1765016152.8956] manager: (tapea5020da-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/23) Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.903 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.905 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.926 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[140e455c-4d35-4294-bf10-77d88d21b2e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.930 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[0fd21052-7894-4660-960f-92732aa5b86d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapea5020da-f1: link becomes ready Dec 6 05:15:52 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapea5020da-f0: link becomes ready Dec 6 05:15:52 localhost NetworkManager[5965]: [1765016152.9572] device (tapea5020da-f0): carrier: link connected Dec 6 05:15:52 localhost 
ovn_metadata_agent[137254]: 2025-12-06 10:15:52.966 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[defa3b87-3443-44ea-844f-8870e5622fed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.968 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:15:52 localhost nova_compute[237281]: 2025-12-06 10:15:52.969 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.980 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[37e1fb5e-ed25-4de0-b180-ae4c8483ffa3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea5020da-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], 
['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:c3:90:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], 
['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1254009, 'reachable_time': 30920, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246105, 'error': None, 'target': 'ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:52.993 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[dcaf49ab-3d58-4a6a-b200-f67506aa2306]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:909e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1254009, 'tstamp': 1254009}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246107, 'error': None, 'target': 'ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:53.010 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[f65b5534-1a5e-4e9d-925c-5d22ad91d3a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea5020da-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:c3:90:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 
0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1254009, 'reachable_time': 30920, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 
'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246108, 'error': None, 'target': 'ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:53.038 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[66f367ec-8c44-4418-b494-426ddea915ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.039 237285 DEBUG oslo_concurrency.processutils [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.041 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:53.097 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[1415d7f3-a6fd-4c6d-8df8-540c82002ce5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:53.099 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea5020da-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:53.100 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:53.100 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea5020da-f0, may_exist=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:15:53 localhost kernel: device tapea5020da-f0 entered promiscuous mode Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:53.108 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapea5020da-f0, col_values=(('external_ids', {'iface-id': '70c45e60-e7ea-4bdb-926b-73d2cccf5054'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:15:53 localhost ovn_controller[131684]: 2025-12-06T10:15:53Z|00097|binding|INFO|Releasing lport 70c45e60-e7ea-4bdb-926b-73d2cccf5054 from this chassis (sb_readonly=0) Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.110 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.120 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.121 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:53.123 137259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ea5020da-f370-47b0-b6c4-a1f36329a7ad.pid.haproxy; Error: [Errno 2] No such file or directory: 
'/var/lib/neutron/external/pids/ea5020da-f370-47b0-b6c4-a1f36329a7ad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:53.124 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[17cc83a9-ff6d-45b3-a24f-e93862485236]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:53.126 137259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: global Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: log /dev/log local0 debug Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: log-tag haproxy-metadata-proxy-ea5020da-f370-47b0-b6c4-a1f36329a7ad Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: user root Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: group root Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: maxconn 1024 Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: pidfile /var/lib/neutron/external/pids/ea5020da-f370-47b0-b6c4-a1f36329a7ad.pid.haproxy Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: daemon Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: defaults Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: log global Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: mode http Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: option httplog Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: option dontlognull Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: option http-server-close Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: option forwardfor Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: retries 3 Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: timeout http-request 30s Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: timeout connect 30s Dec 6 05:15:53 localhost 
ovn_metadata_agent[137254]: timeout client 32s Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: timeout server 32s Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: timeout http-keep-alive 30s Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: listen listener Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: bind 169.254.169.254:80 Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: server metadata /var/lib/neutron/metadata_proxy Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: http-request add-header X-OVN-Network-ID ea5020da-f370-47b0-b6c4-a1f36329a7ad Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Dec 6 05:15:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:15:53.126 137259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad', 'env', 'PROCESS_TAG=haproxy-ea5020da-f370-47b0-b6c4-a1f36329a7ad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ea5020da-f370-47b0-b6c4-a1f36329a7ad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Dec 6 05:15:53 localhost podman[197801]: time="2025-12-06T10:15:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:15:53 localhost podman[197801]: @ - - [06/Dec/2025:10:15:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150806 "" "Go-http-client/1.1" Dec 6 05:15:53 localhost podman[197801]: @ - - [06/Dec/2025:10:15:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17849 "" "Go-http-client/1.1" Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.384 
237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.385 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12270MB free_disk=387.2653999328613GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", 
"vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.386 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.386 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.478 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.478 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance 83f60b2f-0c03-4557-a484-b92761cfecce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.479 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.479 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1152MB phys_disk=399GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.568 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:15:53 localhost podman[246145]: Dec 6 05:15:53 localhost podman[246145]: 2025-12-06 10:15:53.583788252 +0000 UTC m=+0.090769372 container create 4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125) Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.598 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:15:53 localhost systemd[1]: Started libpod-conmon-4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974.scope. Dec 6 05:15:53 localhost systemd[1]: Started libcrun container. 
Dec 6 05:15:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/960f010e83747e6abb01bc3d66788d5ee2c9a9d161112eb5703eabf2b62fd431/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:15:53 localhost podman[246145]: 2025-12-06 10:15:53.542933981 +0000 UTC m=+0.049915151 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.643 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.644 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:15:53 localhost podman[246145]: 2025-12-06 10:15:53.647832678 +0000 UTC m=+0.154813788 container init 4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Dec 6 05:15:53 localhost podman[246145]: 2025-12-06 10:15:53.656884947 +0000 UTC m=+0.163866057 container start 
4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.684 237285 DEBUG nova.network.neutron [req-c3b573c1-01e6-4169-a942-d7e15f3fc9e6 req-0166c595-7172-4683-b68b-2006245d4cf0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Updated VIF entry in instance network info cache for port 51683506-5f1b-40d6-b898-12a44611e923. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.685 237285 DEBUG nova.network.neutron [req-c3b573c1-01e6-4169-a942-d7e15f3fc9e6 req-0166c595-7172-4683-b68b-2006245d4cf0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Updating instance_info_cache with network_info: [{"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], 
"meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:15:53 localhost neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad[246160]: [NOTICE] (246164) : New worker (246166) forked Dec 6 05:15:53 localhost neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad[246160]: [NOTICE] (246164) : Loading success. Dec 6 05:15:53 localhost nova_compute[237281]: 2025-12-06 10:15:53.700 237285 DEBUG oslo_concurrency.lockutils [req-c3b573c1-01e6-4169-a942-d7e15f3fc9e6 req-0166c595-7172-4683-b68b-2006245d4cf0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Releasing lock "refresh_cache-83f60b2f-0c03-4557-a484-b92761cfecce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:15:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51819 DF PROTO=TCP SPT=58764 DPT=9102 SEQ=4170017105 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD76F9A0000000001030307) Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.411 237285 DEBUG nova.compute.manager [req-47f74e30-2244-4c39-87d1-e4fd4546c890 req-e049f6b6-811c-40fa-93e8-48e8c00adcae 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 
83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.411 237285 DEBUG oslo_concurrency.lockutils [req-47f74e30-2244-4c39-87d1-e4fd4546c890 req-e049f6b6-811c-40fa-93e8-48e8c00adcae 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.412 237285 DEBUG oslo_concurrency.lockutils [req-47f74e30-2244-4c39-87d1-e4fd4546c890 req-e049f6b6-811c-40fa-93e8-48e8c00adcae 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.413 237285 DEBUG oslo_concurrency.lockutils [req-47f74e30-2244-4c39-87d1-e4fd4546c890 req-e049f6b6-811c-40fa-93e8-48e8c00adcae 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.413 237285 DEBUG nova.compute.manager [req-47f74e30-2244-4c39-87d1-e4fd4546c890 req-e049f6b6-811c-40fa-93e8-48e8c00adcae 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] 
[instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Processing event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.414 237285 DEBUG nova.compute.manager [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.419 237285 DEBUG nova.virt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.420 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] VM Resumed (Lifecycle Event)#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.422 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.427 237285 INFO nova.virt.libvirt.driver [-] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Instance spawned successfully.#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.428 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 
- - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.481 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.488 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.488 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.489 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Dec 6 
05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.490 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.490 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.491 237285 DEBUG nova.virt.libvirt.driver [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.498 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.537 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 
83f60b2f-0c03-4557-a484-b92761cfecce] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.593 237285 INFO nova.compute.manager [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Took 12.97 seconds to spawn the instance on the hypervisor.#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.594 237285 DEBUG nova.compute.manager [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.656 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.663 237285 INFO nova.compute.manager [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Took 13.51 seconds to build instance.#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.685 237285 DEBUG oslo_concurrency.lockutils [None req-673dae40-f6bf-4955-b7d8-4db0a0cf331f 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:15:54 localhost nova_compute[237281]: 2025-12-06 10:15:54.998 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:15:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51820 DF PROTO=TCP SPT=58764 DPT=9102 SEQ=4170017105 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD773870000000001030307) Dec 6 05:15:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59683 DF PROTO=TCP SPT=58538 DPT=9102 SEQ=160817535 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD777870000000001030307) Dec 6 05:15:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:15:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:15:56 localhost systemd[1]: tmp-crun.9y1pol.mount: Deactivated successfully. 
Dec 6 05:15:56 localhost podman[246177]: 2025-12-06 10:15:56.630586707 +0000 UTC m=+0.154233859 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Dec 6 05:15:56 localhost podman[246176]: 2025-12-06 10:15:56.590698747 +0000 UTC m=+0.116926269 container health_status 
4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:15:56 localhost podman[246177]: 2025-12-06 10:15:56.667774215 +0000 UTC m=+0.191421387 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:15:56 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:15:56 localhost podman[246176]: 2025-12-06 10:15:56.725836855 +0000 UTC m=+0.252064438 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:15:56 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:15:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51821 DF PROTO=TCP SPT=58764 DPT=9102 SEQ=4170017105 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD77B870000000001030307) Dec 6 05:15:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27586 DF PROTO=TCP SPT=57222 DPT=9102 SEQ=3674109737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD77FDA0000000001030307) Dec 6 05:15:58 localhost nova_compute[237281]: 2025-12-06 10:15:58.635 237285 DEBUG nova.compute.manager [req-88802187-655f-4174-ac15-3cab86d8834b req-96859c2e-1ecc-4b57-bc62-f57316b77528 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:15:58 localhost nova_compute[237281]: 2025-12-06 10:15:58.636 237285 DEBUG oslo_concurrency.lockutils [req-88802187-655f-4174-ac15-3cab86d8834b req-96859c2e-1ecc-4b57-bc62-f57316b77528 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:15:58 localhost nova_compute[237281]: 2025-12-06 10:15:58.636 237285 DEBUG oslo_concurrency.lockutils [req-88802187-655f-4174-ac15-3cab86d8834b req-96859c2e-1ecc-4b57-bc62-f57316b77528 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:15:58 localhost nova_compute[237281]: 2025-12-06 10:15:58.637 237285 DEBUG oslo_concurrency.lockutils [req-88802187-655f-4174-ac15-3cab86d8834b req-96859c2e-1ecc-4b57-bc62-f57316b77528 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:15:58 localhost nova_compute[237281]: 2025-12-06 10:15:58.637 237285 DEBUG nova.compute.manager [req-88802187-655f-4174-ac15-3cab86d8834b req-96859c2e-1ecc-4b57-bc62-f57316b77528 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] No waiting events found dispatching network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 05:15:58 localhost nova_compute[237281]: 2025-12-06 10:15:58.637 237285 WARNING nova.compute.manager [req-88802187-655f-4174-ac15-3cab86d8834b req-96859c2e-1ecc-4b57-bc62-f57316b77528 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received unexpected event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 for instance with vm_state active and task_state None.#033[00m Dec 6 05:15:58 localhost nova_compute[237281]: 2025-12-06 10:15:58.641 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 
05:15:59 localhost nova_compute[237281]: 2025-12-06 10:15:59.651 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:00 localhost nova_compute[237281]: 2025-12-06 10:16:00.000 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51822 DF PROTO=TCP SPT=58764 DPT=9102 SEQ=4170017105 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD78B470000000001030307) Dec 6 05:16:02 localhost nova_compute[237281]: 2025-12-06 10:16:02.855 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:16:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:16:03 localhost podman[246219]: 2025-12-06 10:16:03.26523184 +0000 UTC m=+0.106421294 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, 
org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:16:03 localhost systemd[1]: tmp-crun.OD6tRO.mount: Deactivated successfully. Dec 6 05:16:03 localhost podman[246218]: 2025-12-06 10:16:03.312668394 +0000 UTC m=+0.156520690 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125) Dec 6 05:16:03 localhost podman[246218]: 2025-12-06 10:16:03.34823404 +0000 UTC m=+0.192086356 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:16:03 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:16:03 localhost podman[246219]: 2025-12-06 10:16:03.397381537 +0000 UTC m=+0.238570751 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_id=edpm, tcib_managed=true) Dec 6 05:16:03 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:16:04 localhost nova_compute[237281]: 2025-12-06 10:16:04.653 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:05 localhost nova_compute[237281]: 2025-12-06 10:16:05.003 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:05 localhost nova_compute[237281]: 2025-12-06 10:16:05.848 237285 DEBUG nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Check if temp file /var/lib/nova/instances/tmpz6kieeuq exists to indicate shared storage is being used for migration. Exists? 
False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m Dec 6 05:16:05 localhost nova_compute[237281]: 2025-12-06 10:16:05.850 237285 DEBUG nova.compute.manager [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] source check data is LibvirtLiveMigrateData(bdms=,block_migration=True,disk_available_mb=395264,disk_over_commit=,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpz6kieeuq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='83f60b2f-0c03-4557-a484-b92761cfecce',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids=,serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m Dec 6 05:16:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:06.700 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:06.700 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:06.702 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:07 localhost ovn_controller[131684]: 2025-12-06T10:16:07Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4d:48:58 10.100.0.13 Dec 6 05:16:07 localhost ovn_controller[131684]: 2025-12-06T10:16:07Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4d:48:58 10.100.0.13 Dec 6 05:16:07 localhost nova_compute[237281]: 2025-12-06 10:16:07.674 237285 DEBUG oslo_concurrency.processutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:16:07 localhost nova_compute[237281]: 2025-12-06 10:16:07.747 237285 DEBUG oslo_concurrency.processutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:16:07 localhost nova_compute[237281]: 2025-12-06 10:16:07.749 237285 DEBUG oslo_concurrency.processutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:16:07 localhost nova_compute[237281]: 2025-12-06 10:16:07.823 237285 DEBUG oslo_concurrency.processutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:16:07 localhost nova_compute[237281]: 2025-12-06 10:16:07.827 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:16:07 localhost nova_compute[237281]: 2025-12-06 10:16:07.828 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:16:07 localhost nova_compute[237281]: 2025-12-06 10:16:07.846 237285 INFO nova.compute.rpcapi [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m Dec 6 05:16:07 localhost nova_compute[237281]: 2025-12-06 10:16:07.848 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 
2f32feb968b74693a394964324f981bf - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:16:08 localhost nova_compute[237281]: 2025-12-06 10:16:08.227 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51823 DF PROTO=TCP SPT=58764 DPT=9102 SEQ=4170017105 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD7AB870000000001030307) Dec 6 05:16:09 localhost nova_compute[237281]: 2025-12-06 10:16:09.691 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:10 localhost sshd[246273]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:16:10 localhost nova_compute[237281]: 2025-12-06 10:16:10.005 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:10 localhost systemd-logind[760]: New session 48 of user nova. Dec 6 05:16:10 localhost systemd[1]: Created slice User Slice of UID 42436. Dec 6 05:16:10 localhost systemd[1]: Starting User Runtime Directory /run/user/42436... Dec 6 05:16:10 localhost systemd[1]: Finished User Runtime Directory /run/user/42436. Dec 6 05:16:10 localhost systemd[1]: Starting User Manager for UID 42436... Dec 6 05:16:10 localhost systemd[246277]: Queued start job for default target Main User Target. Dec 6 05:16:10 localhost systemd[246277]: Created slice User Application Slice. Dec 6 05:16:10 localhost systemd[246277]: Started Mark boot as successful after the user session has run 2 minutes. 
Dec 6 05:16:10 localhost systemd-journald[38691]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Dec 6 05:16:10 localhost systemd-journald[38691]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 6 05:16:10 localhost systemd[246277]: Started Daily Cleanup of User's Temporary Directories. Dec 6 05:16:10 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 05:16:10 localhost systemd[246277]: Reached target Paths. Dec 6 05:16:10 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 05:16:10 localhost systemd[246277]: Reached target Timers. Dec 6 05:16:10 localhost systemd[246277]: Starting D-Bus User Message Bus Socket... Dec 6 05:16:10 localhost systemd[246277]: Starting Create User's Volatile Files and Directories... Dec 6 05:16:10 localhost systemd[246277]: Finished Create User's Volatile Files and Directories. Dec 6 05:16:10 localhost systemd[246277]: Listening on D-Bus User Message Bus Socket. Dec 6 05:16:10 localhost systemd[246277]: Reached target Sockets. Dec 6 05:16:10 localhost systemd[246277]: Reached target Basic System. Dec 6 05:16:10 localhost systemd[246277]: Reached target Main User Target. Dec 6 05:16:10 localhost systemd[246277]: Startup finished in 161ms. Dec 6 05:16:10 localhost systemd[1]: Started User Manager for UID 42436. Dec 6 05:16:10 localhost systemd[1]: Started Session 48 of User nova. Dec 6 05:16:10 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 6 05:16:10 localhost systemd[1]: session-48.scope: Deactivated successfully. Dec 6 05:16:10 localhost systemd-logind[760]: Session 48 logged out. Waiting for processes to exit. 
Dec 6 05:16:10 localhost systemd-logind[760]: Removed session 48. Dec 6 05:16:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:16:11 localhost podman[246296]: 2025-12-06 10:16:11.564869045 +0000 UTC m=+0.089759010 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, 
version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 6 05:16:11 localhost podman[246296]: 2025-12-06 10:16:11.579076855 +0000 UTC m=+0.103966790 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9) Dec 6 05:16:11 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:16:13 localhost neutron_sriov_agent[212548]: 2025-12-06 10:16:13.396 2 INFO neutron.agent.securitygroups_rpc [req-71fdcc25-8024-494d-bac1-d5cc729f25fb req-470f73dd-6820-4568-8d1c-98b5b018ba26 4dd0467bc60c4fcc8357de40a955e9e4 1b9a5cf38c7e4c52baa224d3c9813fb4 - - default default] Security group rule updated ['7b046070-ef91-4fdb-9bcf-aa71d6d8707c']#033[00m Dec 6 05:16:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:16:13 localhost podman[246316]: 2025-12-06 10:16:13.523935564 +0000 UTC m=+0.059172557 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 05:16:13 localhost podman[246316]: 2025-12-06 10:16:13.533203159 +0000 UTC m=+0.068440142 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, 
container_name=node_exporter) Dec 6 05:16:13 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.093 237285 INFO nova.compute.manager [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Took 6.27 seconds for pre_live_migration on destination host np0005548801.ooo.test.#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.094 237285 DEBUG nova.compute.manager [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.116 237285 DEBUG nova.compute.manager [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=395264,disk_over_commit=,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpz6kieeuq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='83f60b2f-0c03-4557-a484-b92761cfecce',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(d37c64e5-a5b0-4f07-89cf-68524547caa6),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) 
_do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.142 237285 DEBUG nova.objects.instance [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Lazy-loading 'migration_context' on Instance uuid 83f60b2f-0c03-4557-a484-b92761cfecce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.143 237285 DEBUG nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.145 237285 DEBUG nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.146 237285 DEBUG nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.162 237285 DEBUG nova.virt.libvirt.vif [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default 
default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-06T10:15:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1908771692',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005548798.ooo.test',hostname='tempest-liveautoblockmigrationv225test-server-1908771692',id=8,image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-06T10:15:54Z,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005548798.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d4ffab9068e64ee89c49785c5f76ecd3',ramdisk_id='',reservation_id='r-w9ig3u00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1195199977',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1195199977-project-member'},tags=,task_state='migrating',terminated_at=None,trusted_certs=,updated_at=2025-12-06T10:15:54Z,user_data=
None,user_id='21240df215a544878bd6d4c5ec47594a',uuid=83f60b2f-0c03-4557-a484-b92761cfecce,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.162 237285 DEBUG nova.network.os_vif_util [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Converting VIF {"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.164 237285 DEBUG nova.network.os_vif_util [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:48:58,bridge_name='br-int',has_traffic_filtering=True,id=51683506-5f1b-40d6-b898-12a44611e923,network=Network(9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap51683506-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.165 237285 DEBUG nova.virt.libvirt.migration [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Updating guest XML with vif config: Dec 6 05:16:14 localhost nova_compute[237281]: Dec 6 05:16:14 localhost nova_compute[237281]: Dec 6 05:16:14 localhost nova_compute[237281]: Dec 6 05:16:14 localhost nova_compute[237281]: Dec 6 05:16:14 localhost nova_compute[237281]: Dec 6 05:16:14 
localhost nova_compute[237281]: Dec 6 05:16:14 localhost nova_compute[237281]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.166 237285 DEBUG nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m Dec 6 05:16:14 localhost neutron_sriov_agent[212548]: 2025-12-06 10:16:14.523 2 INFO neutron.agent.securitygroups_rpc [None req-1549da0a-b9e0-467b-af7e-7917d7cc6330 383605419be14147bc2ebf82a90ac1b9 caceedbf61904a5eaba72910f7a24db1 - - default default] Security group member updated ['eb2751e0-e02b-43b7-b38b-b824ce1a45d2']#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.649 237285 DEBUG nova.virt.libvirt.migration [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.650 237285 INFO nova.virt.libvirt.migration [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.717 237285 INFO nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 
04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m Dec 6 05:16:14 localhost nova_compute[237281]: 2025-12-06 10:16:14.730 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:15 localhost nova_compute[237281]: 2025-12-06 10:16:15.007 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:15 localhost nova_compute[237281]: 2025-12-06 10:16:15.221 237285 DEBUG nova.virt.libvirt.migration [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m Dec 6 05:16:15 localhost nova_compute[237281]: 2025-12-06 10:16:15.222 237285 DEBUG nova.virt.libvirt.migration [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m Dec 6 05:16:15 localhost nova_compute[237281]: 2025-12-06 10:16:15.727 237285 DEBUG nova.virt.libvirt.migration [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Current 50 
elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m Dec 6 05:16:15 localhost nova_compute[237281]: 2025-12-06 10:16:15.727 237285 DEBUG nova.virt.libvirt.migration [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.064 237285 DEBUG nova.virt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.065 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] VM Paused (Lifecycle Event)#033[00m Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.089 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.095 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 6 05:16:16 localhost 
nova_compute[237281]: 2025-12-06 10:16:16.116 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m Dec 6 05:16:16 localhost openstack_network_exporter[199751]: ERROR 10:16:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:16:16 localhost openstack_network_exporter[199751]: ERROR 10:16:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:16:16 localhost openstack_network_exporter[199751]: ERROR 10:16:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:16:16 localhost openstack_network_exporter[199751]: ERROR 10:16:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:16:16 localhost openstack_network_exporter[199751]: Dec 6 05:16:16 localhost openstack_network_exporter[199751]: ERROR 10:16:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:16:16 localhost openstack_network_exporter[199751]: Dec 6 05:16:16 localhost kernel: device tap51683506-5f left promiscuous mode Dec 6 05:16:16 localhost NetworkManager[5965]: [1765016176.2415] device (tap51683506-5f): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Dec 6 05:16:16 localhost ovn_controller[131684]: 2025-12-06T10:16:16Z|00098|binding|INFO|Releasing lport 51683506-5f1b-40d6-b898-12a44611e923 from this chassis (sb_readonly=0) Dec 6 05:16:16 localhost ovn_controller[131684]: 2025-12-06T10:16:16Z|00099|binding|INFO|Setting lport 51683506-5f1b-40d6-b898-12a44611e923 down in Southbound Dec 6 05:16:16 localhost ovn_controller[131684]: 2025-12-06T10:16:16Z|00100|binding|INFO|Releasing lport 306c6369-38db-4281-af43-1db942f1ab64 from this chassis 
(sb_readonly=0) Dec 6 05:16:16 localhost ovn_controller[131684]: 2025-12-06T10:16:16Z|00101|binding|INFO|Setting lport 306c6369-38db-4281-af43-1db942f1ab64 down in Southbound Dec 6 05:16:16 localhost ovn_controller[131684]: 2025-12-06T10:16:16Z|00102|binding|INFO|Removing iface tap51683506-5f ovn-installed in OVS Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.255 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:16 localhost ovn_controller[131684]: 2025-12-06T10:16:16Z|00103|binding|INFO|Releasing lport 004362b4-c291-41f5-bfb9-f0d2dbd76d4a from this chassis (sb_readonly=0) Dec 6 05:16:16 localhost ovn_controller[131684]: 2025-12-06T10:16:16Z|00104|binding|INFO|Releasing lport 70c45e60-e7ea-4bdb-926b-73d2cccf5054 from this chassis (sb_readonly=0) Dec 6 05:16:16 localhost ovn_controller[131684]: 2025-12-06T10:16:16Z|00105|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.269 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:48:58 10.100.0.13'], port_security=['fa:16:3e:4d:48:58 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test,np0005548801.ooo.test', 'activation-strategy': 'rarp', 'additional-chassis-activated': '179bc2f0-15c6-4f13-97f9-10613dbf5d7c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-745716054', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '83f60b2f-0c03-4557-a484-b92761cfecce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-745716054', 'neutron:project_id': 'd4ffab9068e64ee89c49785c5f76ecd3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5d02d56e-c6b2-47de-b5f5-53b4d437a7c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28349eaf-1dbf-4bc7-ae61-616696dcb1a3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=51683506-5f1b-40d6-b898-12a44611e923) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.272 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:db:20 19.80.0.222'], port_security=['fa:16:3e:6b:db:20 19.80.0.222'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['51683506-5f1b-40d6-b898-12a44611e923'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1468155593', 'neutron:cidrs': '19.80.0.222/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea5020da-f370-47b0-b6c4-a1f36329a7ad', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1468155593', 'neutron:project_id': 'd4ffab9068e64ee89c49785c5f76ecd3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '5d02d56e-c6b2-47de-b5f5-53b4d437a7c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, 
additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=39fd89f7-b191-45fe-9a9a-73b9abccf180, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=306c6369-38db-4281-af43-1db942f1ab64) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.274 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 51683506-5f1b-40d6-b898-12a44611e923 in datapath 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9 unbound from our chassis#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.278 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port ed911aa4-8580-4197-82cf-abe034f34c56 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.278 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.279 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[e9c834d1-873b-4444-874e-342e77c6b219]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.280 137259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9 namespace which is not needed anymore#033[00m Dec 6 05:16:16 localhost systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Deactivated successfully. 
Dec 6 05:16:16 localhost systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Consumed 12.948s CPU time. Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.287 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:16 localhost systemd-machined[68273]: Machine qemu-3-instance-00000008 terminated. Dec 6 05:16:16 localhost neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9[246044]: [NOTICE] (246058) : haproxy version is 2.8.14-c23fe91 Dec 6 05:16:16 localhost neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9[246044]: [NOTICE] (246058) : path to executable is /usr/sbin/haproxy Dec 6 05:16:16 localhost neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9[246044]: [WARNING] (246058) : Exiting Master process... Dec 6 05:16:16 localhost neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9[246044]: [ALERT] (246058) : Current worker (246064) exited with code 143 (Terminated) Dec 6 05:16:16 localhost neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9[246044]: [WARNING] (246058) : All workers exited. Exiting... (0) Dec 6 05:16:16 localhost systemd[1]: libpod-d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d.scope: Deactivated successfully. 
Dec 6 05:16:16 localhost podman[246367]: 2025-12-06 10:16:16.439344965 +0000 UTC m=+0.056303138 container died d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:16:16 localhost systemd[1]: tmp-crun.aVXwPM.mount: Deactivated successfully. Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.475 237285 DEBUG nova.virt.libvirt.guest [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.477 237285 INFO nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Migration operation has completed#033[00m Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.477 237285 INFO nova.compute.manager [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] _post_live_migration() is started..#033[00m Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.494 237285 DEBUG nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 
04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.494 237285 DEBUG nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.495 237285 DEBUG nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m Dec 6 05:16:16 localhost podman[246367]: 2025-12-06 10:16:16.500976796 +0000 UTC m=+0.117934959 container cleanup d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:16:16 localhost podman[246386]: 2025-12-06 10:16:16.511655936 +0000 UTC m=+0.065600825 container cleanup d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:16:16 localhost systemd[1]: libpod-conmon-d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d.scope: Deactivated successfully. Dec 6 05:16:16 localhost podman[246413]: 2025-12-06 10:16:16.588297611 +0000 UTC m=+0.068239747 container remove d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.591 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[a707b256-f6ed-4397-8250-9f409946979b]: (4, ('Sat Dec 6 10:16:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9 (d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d)\nd642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d\nSat Dec 6 10:16:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9 (d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d)\nd642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d\n', '', 0)) 
_call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.593 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[f5299f1d-b8c9-4fee-9ac2-6edd238f8598]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.594 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d3dc0c9-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.596 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:16 localhost kernel: device tap9d3dc0c9-c0 left promiscuous mode Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.608 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.610 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[f275c676-946d-4375-84c9-50de4c333094]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.625 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[abc3402e-eaf4-415d-a5cb-7827b18f61aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.626 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c77b835a-8557-4556-9645-f129ef43bbce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 
2025-12-06 10:16:16.641 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[2f3cb40f-5674-4be3-94e2-a847bdfefb47]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': 
[['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1253902, 'reachable_time': 39735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 
'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246432, 'error': None, 'target': 'ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.643 137391 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.644 137391 DEBUG oslo.privsep.daemon [-] privsep: reply[8781340e-e022-4c24-8e09-14395dd69b7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.644 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 306c6369-38db-4281-af43-1db942f1ab64 in datapath ea5020da-f370-47b0-b6c4-a1f36329a7ad unbound from our chassis#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.646 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 82f6eb90-0ed7-4b33-86de-18f1c6d81c78 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.646 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 
ea5020da-f370-47b0-b6c4-a1f36329a7ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.647 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[df386071-adbd-44a6-91b2-28eed968a2bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.648 137259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad namespace which is not needed anymore#033[00m Dec 6 05:16:16 localhost neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad[246160]: [NOTICE] (246164) : haproxy version is 2.8.14-c23fe91 Dec 6 05:16:16 localhost neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad[246160]: [NOTICE] (246164) : path to executable is /usr/sbin/haproxy Dec 6 05:16:16 localhost neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad[246160]: [WARNING] (246164) : Exiting Master process... Dec 6 05:16:16 localhost neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad[246160]: [ALERT] (246164) : Current worker (246166) exited with code 143 (Terminated) Dec 6 05:16:16 localhost neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad[246160]: [WARNING] (246164) : All workers exited. Exiting... (0) Dec 6 05:16:16 localhost systemd[1]: libpod-4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974.scope: Deactivated successfully. 
Dec 6 05:16:16 localhost podman[246449]: 2025-12-06 10:16:16.824417045 +0000 UTC m=+0.071597930 container died 4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:16:16 localhost podman[246449]: 2025-12-06 10:16:16.866365489 +0000 UTC m=+0.113546284 container cleanup 4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:16:16 localhost podman[246462]: 2025-12-06 10:16:16.878038149 +0000 UTC m=+0.049864180 container cleanup 4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS) Dec 6 05:16:16 localhost systemd[1]: libpod-conmon-4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974.scope: Deactivated successfully. Dec 6 05:16:16 localhost neutron_sriov_agent[212548]: 2025-12-06 10:16:16.906 2 INFO neutron.agent.securitygroups_rpc [req-c3123177-2528-4257-8c27-db3c33f7eb3d req-48803639-0091-487e-8e58-5593e0c8a80e 4dd0467bc60c4fcc8357de40a955e9e4 1b9a5cf38c7e4c52baa224d3c9813fb4 - - default default] Security group rule updated ['a55a5eff-aebb-4270-9f26-2bb56f0d6d53']#033[00m Dec 6 05:16:16 localhost podman[246478]: 2025-12-06 10:16:16.948593535 +0000 UTC m=+0.066300845 container remove 4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.952 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[47e752f9-c2c8-4c49-8f4c-637c3ddca668]: (4, ('Sat Dec 6 10:16:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad (4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974)\n4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974\nSat Dec 6 10:16:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad (4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974)\n4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost 
ovn_metadata_agent[137254]: 2025-12-06 10:16:16.953 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[e45a6be9-f71a-467d-a7c5-f1af66606379]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.954 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea5020da-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.956 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:16 localhost kernel: device tapea5020da-f0 left promiscuous mode Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.968 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:16 localhost nova_compute[237281]: 2025-12-06 10:16:16.969 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.971 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[34fbc825-ffc9-45f2-a7cf-f12127dfdb3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.994 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8d9baa-ae76-4cb1-8732-15567884b83d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:16.995 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba33761-81ba-4a81-86c5-59ee19357070]: (4, True) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:17.007 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c7cf1641-be69-4cd6-86e1-a3507728038c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 
'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1254001, 'reachable_time': 40959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 
'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246499, 'error': None, 'target': 'ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:17.008 137391 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ea5020da-f370-47b0-b6c4-a1f36329a7ad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Dec 6 05:16:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:17.008 137391 DEBUG oslo.privsep.daemon [-] privsep: reply[051ade88-3124-4935-a981-93efa7fdaecc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:16:17 localhost systemd[1]: tmp-crun.C3GDY4.mount: Deactivated successfully. Dec 6 05:16:17 localhost systemd[1]: var-lib-containers-storage-overlay-960f010e83747e6abb01bc3d66788d5ee2c9a9d161112eb5703eabf2b62fd431-merged.mount: Deactivated successfully. Dec 6 05:16:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ac1f16c0e8d597940533d00004bb0d55828907f32478b751e67b46c78b6c974-userdata-shm.mount: Deactivated successfully. Dec 6 05:16:17 localhost systemd[1]: run-netns-ovnmeta\x2dea5020da\x2df370\x2d47b0\x2db6c4\x2da1f36329a7ad.mount: Deactivated successfully. 
Dec 6 05:16:17 localhost systemd[1]: var-lib-containers-storage-overlay-41180d9d2ca10a29d35cd2145c9cc8a84576960f37cc926d13c7e67a91590e5b-merged.mount: Deactivated successfully. Dec 6 05:16:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d642c34fd4cdbdc5330f9e7825002a00257a05873de0f6fe67390954e7775c9d-userdata-shm.mount: Deactivated successfully. Dec 6 05:16:17 localhost systemd[1]: run-netns-ovnmeta\x2d9d3dc0c9\x2dc5ba\x2d4357\x2d8485\x2d3e7e8def3fd9.mount: Deactivated successfully. Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.087 237285 DEBUG nova.compute.manager [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-unplugged-51683506-5f1b-40d6-b898-12a44611e923 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.088 237285 DEBUG oslo_concurrency.lockutils [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.088 237285 DEBUG oslo_concurrency.lockutils [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.089 237285 DEBUG oslo_concurrency.lockutils [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.090 237285 DEBUG nova.compute.manager [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] No waiting events found dispatching network-vif-unplugged-51683506-5f1b-40d6-b898-12a44611e923 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.090 237285 DEBUG nova.compute.manager [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-unplugged-51683506-5f1b-40d6-b898-12a44611e923 for instance with task_state migrating. 
_process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.090 237285 DEBUG nova.compute.manager [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.091 237285 DEBUG oslo_concurrency.lockutils [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.091 237285 DEBUG oslo_concurrency.lockutils [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.092 237285 DEBUG oslo_concurrency.lockutils [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.092 237285 DEBUG nova.compute.manager [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] No waiting events found dispatching network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.093 237285 WARNING nova.compute.manager [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received unexpected event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 for instance with vm_state active and task_state migrating.#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.093 237285 DEBUG nova.compute.manager [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-changed-51683506-5f1b-40d6-b898-12a44611e923 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.093 237285 DEBUG nova.compute.manager [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Refreshing instance network info cache 
due to event network-changed-51683506-5f1b-40d6-b898-12a44611e923. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.094 237285 DEBUG oslo_concurrency.lockutils [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "refresh_cache-83f60b2f-0c03-4557-a484-b92761cfecce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.094 237285 DEBUG oslo_concurrency.lockutils [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquired lock "refresh_cache-83f60b2f-0c03-4557-a484-b92761cfecce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.095 237285 DEBUG nova.network.neutron [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Refreshing network info cache for port 51683506-5f1b-40d6-b898-12a44611e923 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m Dec 6 05:16:18 localhost neutron_sriov_agent[212548]: 2025-12-06 10:16:18.105 2 INFO neutron.agent.securitygroups_rpc [None req-b1428d03-0724-47dc-9742-1c2a4921737c 383605419be14147bc2ebf82a90ac1b9 caceedbf61904a5eaba72910f7a24db1 - - default default] Security group member updated ['eb2751e0-e02b-43b7-b38b-b824ce1a45d2']#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.457 237285 DEBUG nova.network.neutron [None req-78745910-8f77-45c3-a489-1201218a2706 
04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Activated binding for port 51683506-5f1b-40d6-b898-12a44611e923 and host np0005548801.ooo.test migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.458 237285 DEBUG nova.compute.manager [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.460 237285 DEBUG nova.virt.libvirt.vif [None req-78745910-8f77-45c3-a489-1201218a2706 
04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-06T10:15:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1908771692',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005548798.ooo.test',hostname='tempest-liveautoblockmigrationv225test-server-1908771692',id=8,image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-06T10:15:54Z,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005548798.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d4ffab9068e64ee89c49785c5f76ecd3',ramdisk_id='',reservation_id='r-w9ig3u00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1195199977',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1195199977-project-member'},tags=,task_state='migrating
',terminated_at=None,trusted_certs=,updated_at=2025-12-06T10:16:05Z,user_data=None,user_id='21240df215a544878bd6d4c5ec47594a',uuid=83f60b2f-0c03-4557-a484-b92761cfecce,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.460 237285 DEBUG nova.network.os_vif_util [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Converting VIF {"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.462 237285 DEBUG nova.network.os_vif_util [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4d:48:58,bridge_name='br-int',has_traffic_filtering=True,id=51683506-5f1b-40d6-b898-12a44611e923,network=Network(9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap51683506-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.462 237285 DEBUG os_vif [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Unplugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:4d:48:58,bridge_name='br-int',has_traffic_filtering=True,id=51683506-5f1b-40d6-b898-12a44611e923,network=Network(9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap51683506-5f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.465 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.466 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51683506-5f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.500 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.502 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.504 237285 INFO os_vif [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4d:48:58,bridge_name='br-int',has_traffic_filtering=True,id=51683506-5f1b-40d6-b898-12a44611e923,network=Network(9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap51683506-5f')#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.505 237285 DEBUG 
oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.506 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.506 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.507 237285 DEBUG nova.compute.manager [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.508 237285 INFO nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 
83f60b2f-0c03-4557-a484-b92761cfecce] Deleting instance files /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce_del#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.509 237285 INFO nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Deletion of /var/lib/nova/instances/83f60b2f-0c03-4557-a484-b92761cfecce_del complete#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.568 237285 DEBUG nova.compute.manager [req-cf1e7808-8831-4e68-8734-01b114ea2d9f req-5f3c1825-d97e-4dae-a142-14dd584548db 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-unplugged-51683506-5f1b-40d6-b898-12a44611e923 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.569 237285 DEBUG oslo_concurrency.lockutils [req-cf1e7808-8831-4e68-8734-01b114ea2d9f req-5f3c1825-d97e-4dae-a142-14dd584548db 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.569 237285 DEBUG oslo_concurrency.lockutils [req-cf1e7808-8831-4e68-8734-01b114ea2d9f req-5f3c1825-d97e-4dae-a142-14dd584548db 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.570 237285 DEBUG oslo_concurrency.lockutils [req-cf1e7808-8831-4e68-8734-01b114ea2d9f req-5f3c1825-d97e-4dae-a142-14dd584548db 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.570 237285 DEBUG nova.compute.manager [req-cf1e7808-8831-4e68-8734-01b114ea2d9f req-5f3c1825-d97e-4dae-a142-14dd584548db 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] No waiting events found dispatching network-vif-unplugged-51683506-5f1b-40d6-b898-12a44611e923 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 05:16:18 localhost nova_compute[237281]: 2025-12-06 10:16:18.571 237285 DEBUG nova.compute.manager [req-cf1e7808-8831-4e68-8734-01b114ea2d9f req-5f3c1825-d97e-4dae-a142-14dd584548db 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-unplugged-51683506-5f1b-40d6-b898-12a44611e923 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m Dec 6 05:16:19 localhost nova_compute[237281]: 2025-12-06 10:16:19.733 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:20 localhost systemd[1]: Stopping User Manager for UID 42436... Dec 6 05:16:20 localhost systemd[246277]: Activating special unit Exit the Session... 
Dec 6 05:16:20 localhost systemd[246277]: Stopped target Main User Target. Dec 6 05:16:20 localhost systemd[246277]: Stopped target Basic System. Dec 6 05:16:20 localhost systemd[246277]: Stopped target Paths. Dec 6 05:16:20 localhost systemd[246277]: Stopped target Sockets. Dec 6 05:16:20 localhost systemd[246277]: Stopped target Timers. Dec 6 05:16:20 localhost systemd[246277]: Stopped Mark boot as successful after the user session has run 2 minutes. Dec 6 05:16:20 localhost systemd[246277]: Stopped Daily Cleanup of User's Temporary Directories. Dec 6 05:16:20 localhost systemd[246277]: Closed D-Bus User Message Bus Socket. Dec 6 05:16:20 localhost systemd[246277]: Stopped Create User's Volatile Files and Directories. Dec 6 05:16:20 localhost systemd[246277]: Removed slice User Application Slice. Dec 6 05:16:20 localhost systemd[246277]: Reached target Shutdown. Dec 6 05:16:20 localhost systemd[246277]: Finished Exit the Session. Dec 6 05:16:20 localhost systemd[246277]: Reached target Exit the Session. Dec 6 05:16:20 localhost systemd[1]: user@42436.service: Deactivated successfully. Dec 6 05:16:20 localhost systemd[1]: Stopped User Manager for UID 42436. Dec 6 05:16:20 localhost systemd[1]: Stopping User Runtime Directory /run/user/42436... Dec 6 05:16:20 localhost systemd[1]: run-user-42436.mount: Deactivated successfully. Dec 6 05:16:20 localhost systemd[1]: user-runtime-dir@42436.service: Deactivated successfully. Dec 6 05:16:20 localhost systemd[1]: Stopped User Runtime Directory /run/user/42436. Dec 6 05:16:20 localhost systemd[1]: Removed slice User Slice of UID 42436. 
Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.652 237285 DEBUG nova.compute.manager [req-bde402d6-fe98-4240-b232-3e4a04cc6e8c req-c5f8f863-5e9a-4c83-855d-74bb7bf3c2f4 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-unplugged-51683506-5f1b-40d6-b898-12a44611e923 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.654 237285 DEBUG oslo_concurrency.lockutils [req-bde402d6-fe98-4240-b232-3e4a04cc6e8c req-c5f8f863-5e9a-4c83-855d-74bb7bf3c2f4 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.655 237285 DEBUG oslo_concurrency.lockutils [req-bde402d6-fe98-4240-b232-3e4a04cc6e8c req-c5f8f863-5e9a-4c83-855d-74bb7bf3c2f4 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.655 237285 DEBUG oslo_concurrency.lockutils [req-bde402d6-fe98-4240-b232-3e4a04cc6e8c req-c5f8f863-5e9a-4c83-855d-74bb7bf3c2f4 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.655 237285 DEBUG nova.compute.manager [req-bde402d6-fe98-4240-b232-3e4a04cc6e8c req-c5f8f863-5e9a-4c83-855d-74bb7bf3c2f4 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] No waiting events found dispatching network-vif-unplugged-51683506-5f1b-40d6-b898-12a44611e923 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.656 237285 DEBUG nova.compute.manager [req-bde402d6-fe98-4240-b232-3e4a04cc6e8c req-c5f8f863-5e9a-4c83-855d-74bb7bf3c2f4 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-unplugged-51683506-5f1b-40d6-b898-12a44611e923 for instance with task_state migrating. 
_process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.957 237285 DEBUG nova.compute.manager [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.958 237285 DEBUG oslo_concurrency.lockutils [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.958 237285 DEBUG oslo_concurrency.lockutils [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.959 237285 DEBUG oslo_concurrency.lockutils [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.959 237285 DEBUG nova.compute.manager [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] No waiting events found dispatching network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.959 237285 WARNING nova.compute.manager [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received unexpected event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 for instance with vm_state active and task_state migrating.#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.960 237285 DEBUG nova.compute.manager [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.960 237285 DEBUG oslo_concurrency.lockutils [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.960 237285 DEBUG oslo_concurrency.lockutils [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.961 237285 DEBUG oslo_concurrency.lockutils [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.961 237285 DEBUG nova.compute.manager [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] No waiting events found dispatching network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.961 237285 WARNING nova.compute.manager [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 
83f60b2f-0c03-4557-a484-b92761cfecce] Received unexpected event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 for instance with vm_state active and task_state migrating.#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.962 237285 DEBUG nova.compute.manager [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.962 237285 DEBUG oslo_concurrency.lockutils [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.962 237285 DEBUG oslo_concurrency.lockutils [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.963 237285 DEBUG oslo_concurrency.lockutils [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock 
"83f60b2f-0c03-4557-a484-b92761cfecce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.963 237285 DEBUG nova.compute.manager [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] No waiting events found dispatching network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.963 237285 WARNING nova.compute.manager [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received unexpected event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 for instance with vm_state active and task_state migrating.#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.963 237285 DEBUG nova.compute.manager [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.964 237285 DEBUG oslo_concurrency.lockutils [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock 
"83f60b2f-0c03-4557-a484-b92761cfecce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.964 237285 DEBUG oslo_concurrency.lockutils [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.964 237285 DEBUG oslo_concurrency.lockutils [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.965 237285 DEBUG nova.compute.manager [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] No waiting events found dispatching network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 05:16:21 localhost nova_compute[237281]: 2025-12-06 10:16:21.965 237285 WARNING nova.compute.manager [req-782e7b3a-5bee-435a-9143-058f7de300f0 req-8b9fd956-9e21-4d80-b78e-1053c9229a27 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default 
default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Received unexpected event network-vif-plugged-51683506-5f1b-40d6-b898-12a44611e923 for instance with vm_state active and task_state migrating.#033[00m Dec 6 05:16:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:22.992 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:16:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:22.993 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.008 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.009 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging [-] 
Could not send notification to notifications. Payload={'message_id': '5c1912d6-1eb1-413d-8da1-9d1b68829630', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:16:22.993792', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9d9d9506-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.203940015, 'message_signature': '40914870c78a088f0956ec6661c8308de648953c07dd4a878a03aa1481f98c49'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:16:22.993792', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9d9da97e-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.203940015, 'message_signature': 'b47efbb11ab9c4e5335ef7ef610dc59be4b5b4bcdb83305c5bb38228402ba4d3'}]}, 'timestamp': '2025-12-06 10:16:23.009758', '_unique_id': '25f305a2558e4f539fed2178c0751b28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.011 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.012 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.050 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.050 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': 'a3b407bc-06ad-44f3-bc90-20872df510b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:16:23.012914', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9da3ebfe-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.223044895, 'message_signature': '304df65b2680e297072869df8397741b0573b748869acc6178d18011f4ef2117'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:16:23.012914', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9da4027e-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.223044895, 'message_signature': '533bc8537b1da53816c303aad481353dd4e607b1f4a71c13960fd21c6297ae7d'}]}, 'timestamp': '2025-12-06 10:16:23.051354', '_unique_id': '71c56f3bd9c94cef925119f4420d8722'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.052 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.053 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.057 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd9eace78-9b3e-41cb-b732-018ad59bdf64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:16:23.054017', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '9da50b1a-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.264146513, 'message_signature': '549dbf94c556aee5dbd95fbed33b08ae2b9d9cba4a2ec7cdee86767d13377572'}]}, 'timestamp': '2025-12-06 10:16:23.058165', '_unique_id': '96963961474443dfb124ef6a0af3ae25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.059 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.060 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.060 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.061 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16afdb1e-b107-451d-b669-6de493e13eb4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:16:23.061020', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '9da58f04-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.264146513, 'message_signature': '42236dc13bfa41ddcfb97b709b55d3fdd68fbc02b450eef127b8c740a9012a25'}]}, 'timestamp': '2025-12-06 10:16:23.061615', '_unique_id': '571212c451f749cba777ecfd40b8d07b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.064 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.064 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.064 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.064 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 
12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3850de4-fe7f-4575-bf5f-9710b7bccb22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:16:23.064304', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9da60f10-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.223044895, 'message_signature': 'c2c35634141fb99852602302ba6bc820373f598904fd099c4b216fa9c232c12f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:16:23.064304', 'resource_metadata': {'display_name': 
'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9da625e0-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.223044895, 'message_signature': '4582bb57583387308e7c71d70b77220484345ac6dbefb07cd34712a3f4fc845c'}]}, 'timestamp': '2025-12-06 10:16:23.065362', '_unique_id': '697b2f7c4c744233b0fb0556630b3630'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in 
connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get 
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.066 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.067 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.068 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.068 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55c0486c-90d0-4cc8-9b69-5b489b3ee921', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:16:23.068005', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9da69ffc-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.223044895, 'message_signature': '797a2cc1dc30f8b3b2221c38595eabffb0847abb30603e0aa4ff1e8df7f04ec2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:16:23.068005', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9da6b154-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.223044895, 'message_signature': '6c8915e305a9fb43731f7a500f1260a0e015f274c50e3ccaf80d32a3d3d9fc98'}]}, 'timestamp': '2025-12-06 10:16:23.069049', '_unique_id': '6dc5553e895f4afcb758d33bb42f637d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in 
_send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in 
_ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.070 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.071 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.071 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '70fc2a13-33b8-41d7-abc2-ea1e0b201da2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:16:23.071572', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '9da72ae4-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.264146513, 'message_signature': '77d6fc9284a588d677912ef50331fef411ba680daa6f62a43277690fb15a3f49'}]}, 'timestamp': '2025-12-06 10:16:23.072179', '_unique_id': '02f20121f6e240bcbf610205f08c74ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.073 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.074 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.074 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.074 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.075 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '63eb7660-4e82-4733-8f80-ec7151b6d40e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:16:23.074879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9da7af14-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.223044895, 'message_signature': 'fc081e62bab829b2bf7e93bf66a498c254a7bda2af0422166e50bf0f286968b2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:16:23.074879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9da7c0da-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.223044895, 'message_signature': '92bd6dc910bcf4ead665db1a1c201e9df42da8942dd9ec5b1d93b9cc47d5b74c'}]}, 'timestamp': '2025-12-06 10:16:23.075901', '_unique_id': '8fccb5564d4a406cab165132f975508c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.076 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.078 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.097 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 17370000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '34bc8b9c-d6e9-4d87-b097-35d3da15c5c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17370000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:16:23.078473', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '9dab1e38-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.307199341, 'message_signature': '6977d30f5a2e14af1aec45c4653d340b1206cb887bb9efb0f7f6613a71eddc88'}]}, 'timestamp': '2025-12-06 10:16:23.098082', '_unique_id': '826747cf5e2e4a139a970fba322a8b1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.099 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:16:23.101 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1695eec-82d8-4c86-9a69-563364c9e193', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:16:23.101169', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '9dabaf92-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.264146513, 'message_signature': 
'f405182994336d0352b3dfaf93f2b1e9cb9c70a26bb1f0aeab4ce656a6022c26'}]}, 'timestamp': '2025-12-06 10:16:23.101675', '_unique_id': 'c6d5c603b0d04ea38c3e2c16df8595d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR 
oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.102 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.104 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.104 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '17dd18df-e650-4d66-90d2-a7bc17830c19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:16:23.104026', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9dac1e8c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.223044895, 'message_signature': '7210fe516841d0c9a39f7deb5a1346f693ee1d20a236122d882a01359a1f58f2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:16:23.104026', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9dac2fbc-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.223044895, 'message_signature': 'd9625d735eff5ee395f261979ff7d49aa262872b5fb10d594a9f08b0ab187a50'}]}, 'timestamp': '2025-12-06 10:16:23.104956', '_unique_id': '99980b5c1467444c833f0f0169c619d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time(
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.106 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.107 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5e5de86-37f9-41b6-bef9-35f2c0f376df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:16:23.107408', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '9daca258-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.264146513, 'message_signature': 'ec28c03ad997957761c6fd2e46f083dc53e7629d629277553ce332d0c0b0aaf9'}]}, 'timestamp': '2025-12-06 10:16:23.107916', '_unique_id': '56cac5ab5e8745ddbeea7f7254e0952a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.108 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.110 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d29dd90-3bb8-499e-a030-4cf206185c6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:16:23.110060', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '9dad0acc-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.264146513, 'message_signature': '998ee0bc3822c987fe4d43074d9deb41f2fbb887de9a71634714136452c8645e'}]}, 'timestamp': '2025-12-06 10:16:23.110567', '_unique_id': '1f56b75947a64323aab01a1f6d6ef14b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.111 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.113 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.113 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26e49270-092b-4742-aec4-8370cc0dce41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:16:23.113234', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9dad8b32-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.203940015, 'message_signature': '73666ad275455987232a30a7d388b346dcc039865f682f75208a1406fa77d0d7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:16:23.113234', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9dada3f6-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.203940015, 'message_signature': 'aee45ac41cfe214676a3a5be30f3c191cab28509a948a41726f03b6dc658421b'}]}, 'timestamp': '2025-12-06 10:16:23.114473', '_unique_id': 'daf0b6e393e94d2d88df4af8d1dff72a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging
return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.115 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging [-] 
Could not send notification to notifications. Payload={'message_id': '0f89c9b8-1ea7-4113-90c7-9a05cb99abb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:16:23.117303', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9dae2858-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.223044895, 'message_signature': '9948514aeb5ba775fa825a131d8e9940a736acdf1dead1a14bb80eddb55bb1e0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:16:23.117303', 'resource_metadata': {'display_name': 'test', 'name': 
'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9dae3c58-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.223044895, 'message_signature': '5bee15b7235750e93bd79e7fbc161ecbe00568fe3302c3b2e2cc7e0cd7d6569c'}]}, 'timestamp': '2025-12-06 10:16:23.118377', '_unique_id': 'ccfdfc91764e419493f81636ea676ef1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:16:23.119 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.120 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd4eab1e4-31f5-452f-a154-852e76d7edaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:16:23.120730', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '9daeac4c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.264146513, 'message_signature': '4c07d5f070bcbb91a1a96e6a74c2a04fda600443757c1dc87f820e82808c846d'}]}, 'timestamp': '2025-12-06 10:16:23.121251', '_unique_id': '256f252e016241b18a6ff903019192cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.122 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.123 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c7ff8ca-0a22-4df4-baa2-e8ed300bbc55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:16:23.123515', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '9daf17ae-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.264146513, 'message_signature': 'a557019f45f4983f598f4ba75efa4b84d7b477bc311c3f06c1e99bda4a620cc3'}]}, 'timestamp': '2025-12-06 10:16:23.124026', '_unique_id': 'f075b8a8fdf84908a49063d29402d345'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.126 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.126 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd2ffa101-3880-4d26-ade3-f919269d3e46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:16:23.126347', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '9daf864e-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.307199341, 'message_signature': 'f9d9b6a992ffdcaa77a9fc40cb4148945f8bb17757788c2b0e9886b4461e8017'}]}, 'timestamp': '2025-12-06 10:16:23.126875', '_unique_id': '920f21f8dc8e4b718edda2db42654d79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors 
Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging 
conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify 
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.129 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.129 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.130 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b12c5a8e-403e-46ad-947b-4ba403dfed7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:16:23.129553', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9db0038a-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.203940015, 'message_signature': 'd21b06f2b5138d18a0ca4b34947b793924852ac850af4bcabeffc778335a017d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:16:23.129553', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 
'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9db01640-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.203940015, 'message_signature': 'ae0326419a791741fd3fb9f42b33683bce42b3f08b34c14f257f30d1d17ff5bb'}]}, 'timestamp': '2025-12-06 10:16:23.130493', '_unique_id': '5223645e860f45f19342ebdab7e7615c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.131 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.132 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '85985169-553d-4295-bf3b-721d9bbfe468', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:16:23.132115', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '9db0641a-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.264146513, 'message_signature': 'ad31b58254e482ba8124d48a26442b6ac5834138fd9fbbb479ecde7916495347'}]}, 'timestamp': '2025-12-06 10:16:23.132413', '_unique_id': '07f0f170b5be4492a6dcd0f25846eb58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:16:23.133 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.133 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc9392db-b67d-43d7-9f06-a414acb132a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:16:23.133743', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': 
None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '9db0a452-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12570.264146513, 'message_signature': 'a354df6ec46a939c96ecc9232bcf3e0044d746f8f50d1cc8ca05f3fe08a5f6f1'}]}, 'timestamp': '2025-12-06 10:16:23.134057', '_unique_id': '55e17171ac1041fda1437ee800361122'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:16:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:16:23.134 12 ERROR oslo_messaging.notify.messaging Dec 6 05:16:23 localhost podman[197801]: time="2025-12-06T10:16:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:16:23 localhost podman[197801]: @ - - [06/Dec/2025:10:16:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149620 "" "Go-http-client/1.1" Dec 6 05:16:23 localhost podman[197801]: @ - - [06/Dec/2025:10:16:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17379 "" "Go-http-client/1.1" Dec 6 05:16:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:16:23 localhost nova_compute[237281]: 2025-12-06 10:16:23.531 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:23 localhost podman[246501]: 2025-12-06 10:16:23.578370108 +0000 UTC m=+0.104894388 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:16:23 localhost podman[246501]: 2025-12-06 
10:16:23.61054828 +0000 UTC m=+0.137072560 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:16:23 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:16:23 localhost nova_compute[237281]: 2025-12-06 10:16:23.900 237285 DEBUG nova.network.neutron [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Updated VIF entry in instance network info cache for port 51683506-5f1b-40d6-b898-12a44611e923. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m Dec 6 05:16:23 localhost nova_compute[237281]: 2025-12-06 10:16:23.901 237285 DEBUG nova.network.neutron [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Updating instance_info_cache with network_info: [{"id": "51683506-5f1b-40d6-b898-12a44611e923", "address": "fa:16:3e:4d:48:58", "network": {"id": "9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1133610127-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d4ffab9068e64ee89c49785c5f76ecd3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51683506-5f", "ovs_interfaceid": "51683506-5f1b-40d6-b898-12a44611e923", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": 
{}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:16:23 localhost nova_compute[237281]: 2025-12-06 10:16:23.944 237285 DEBUG oslo_concurrency.lockutils [req-10430f73-7b9e-4a26-a831-22b3c1d15b97 req-e7497e13-58a3-4af3-81b7-1aa4facb57f0 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Releasing lock "refresh_cache-83f60b2f-0c03-4557-a484-b92761cfecce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:16:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20861 DF PROTO=TCP SPT=42226 DPT=9102 SEQ=4149398504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD7E4C90000000001030307) Dec 6 05:16:24 localhost nova_compute[237281]: 2025-12-06 10:16:24.772 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20862 DF PROTO=TCP SPT=42226 DPT=9102 SEQ=4149398504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD7E8C80000000001030307) Dec 6 05:16:25 localhost neutron_sriov_agent[212548]: 2025-12-06 10:16:25.668 2 INFO neutron.agent.securitygroups_rpc [req-38c1369b-1a91-4db5-98c0-5f7f3cfc63b1 req-73b32b40-3434-46cc-8ee4-48d300fa8585 4dd0467bc60c4fcc8357de40a955e9e4 1b9a5cf38c7e4c52baa224d3c9813fb4 - - default default] Security group rule updated ['96590374-a8db-4074-8e65-8e9f381f5f98']#033[00m Dec 6 05:16:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51824 DF PROTO=TCP 
SPT=58764 DPT=9102 SEQ=4170017105 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD7EB870000000001030307) Dec 6 05:16:25 localhost ovn_controller[131684]: 2025-12-06T10:16:25Z|00106|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:16:25 localhost nova_compute[237281]: 2025-12-06 10:16:25.863 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20863 DF PROTO=TCP SPT=42226 DPT=9102 SEQ=4149398504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD7F0C80000000001030307) Dec 6 05:16:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:16:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:16:27 localhost ovn_controller[131684]: 2025-12-06T10:16:27Z|00107|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:16:27 localhost nova_compute[237281]: 2025-12-06 10:16:27.514 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:27 localhost podman[246527]: 2025-12-06 10:16:27.549675403 +0000 UTC m=+0.082663920 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:16:27 localhost podman[246527]: 2025-12-06 10:16:27.562432497 +0000 UTC m=+0.095420994 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:16:27 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:16:27 localhost podman[246528]: 2025-12-06 10:16:27.612177452 +0000 UTC m=+0.145800299 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:16:27 localhost podman[246528]: 2025-12-06 10:16:27.624142401 +0000 UTC m=+0.157765218 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 05:16:27 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.010 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Acquiring lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.011 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.011 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Lock "83f60b2f-0c03-4557-a484-b92761cfecce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.037 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.038 237285 DEBUG oslo_concurrency.lockutils [None 
req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.038 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.038 237285 DEBUG nova.compute.resource_tracker [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.136 237285 DEBUG oslo_concurrency.processutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.180 237285 DEBUG oslo_concurrency.processutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 
2f32feb968b74693a394964324f981bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.181 237285 DEBUG oslo_concurrency.processutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.240 237285 DEBUG oslo_concurrency.processutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.242 237285 DEBUG oslo_concurrency.processutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:16:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59684 DF PROTO=TCP SPT=58538 DPT=9102 SEQ=160817535 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD7F5870000000001030307) Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.314 237285 DEBUG oslo_concurrency.processutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.315 237285 DEBUG oslo_concurrency.processutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.388 237285 DEBUG oslo_concurrency.processutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share 
--output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.533 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.623 237285 WARNING nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.627 237285 DEBUG nova.compute.resource_tracker [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12322MB free_disk=387.26652908325195GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": 
"0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.627 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.628 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.703 237285 DEBUG nova.compute.resource_tracker [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Migration for instance 83f60b2f-0c03-4557-a484-b92761cfecce refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.759 237285 DEBUG nova.compute.resource_tracker [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.799 237285 DEBUG nova.compute.resource_tracker [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.800 237285 DEBUG nova.compute.resource_tracker [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Migration d37c64e5-a5b0-4f07-89cf-68524547caa6 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.801 237285 DEBUG nova.compute.resource_tracker [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.801 237285 DEBUG nova.compute.resource_tracker [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.890 237285 DEBUG nova.compute.provider_tree [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.914 237285 DEBUG nova.scheduler.client.report [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.944 237285 DEBUG nova.compute.resource_tracker [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.945 237285 DEBUG oslo_concurrency.lockutils [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:28 localhost nova_compute[237281]: 2025-12-06 10:16:28.953 237285 INFO nova.compute.manager [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Migrating instance to np0005548801.ooo.test finished successfully.#033[00m Dec 6 05:16:29 localhost nova_compute[237281]: 2025-12-06 10:16:29.152 237285 INFO nova.scheduler.client.report [None req-78745910-8f77-45c3-a489-1201218a2706 04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] Deleted allocation for migration d37c64e5-a5b0-4f07-89cf-68524547caa6#033[00m Dec 6 05:16:29 localhost nova_compute[237281]: 2025-12-06 10:16:29.154 237285 DEBUG nova.virt.libvirt.driver [None req-78745910-8f77-45c3-a489-1201218a2706 
04d1266ba62f49c5afc17d0bb11cd5cd 2f32feb968b74693a394964324f981bf - - default default] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m Dec 6 05:16:29 localhost nova_compute[237281]: 2025-12-06 10:16:29.775 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20864 DF PROTO=TCP SPT=42226 DPT=9102 SEQ=4149398504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD800870000000001030307) Dec 6 05:16:31 localhost nova_compute[237281]: 2025-12-06 10:16:31.475 237285 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 6 05:16:31 localhost nova_compute[237281]: 2025-12-06 10:16:31.476 237285 INFO nova.compute.manager [-] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] VM Stopped (Lifecycle Event)#033[00m Dec 6 05:16:31 localhost nova_compute[237281]: 2025-12-06 10:16:31.514 237285 DEBUG nova.compute.manager [None req-7ff7a0eb-df78-4a2f-b4bf-7b1517de8315 - - - - - -] [instance: 83f60b2f-0c03-4557-a484-b92761cfecce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 05:16:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:16:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:16:33 localhost nova_compute[237281]: 2025-12-06 10:16:33.536 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:33 localhost podman[246581]: 2025-12-06 10:16:33.550956898 +0000 UTC m=+0.084999914 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:16:33 localhost podman[246581]: 2025-12-06 10:16:33.559234793 +0000 UTC m=+0.093277859 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Dec 6 05:16:33 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:16:33 localhost podman[246582]: 2025-12-06 10:16:33.65573476 +0000 UTC m=+0.187364401 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 05:16:33 localhost podman[246582]: 2025-12-06 10:16:33.691827434 +0000 UTC m=+0.223457105 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 05:16:33 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:16:34 localhost nova_compute[237281]: 2025-12-06 10:16:34.779 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:36 localhost neutron_sriov_agent[212548]: 2025-12-06 10:16:36.684 2 INFO neutron.agent.securitygroups_rpc [req-42e872d7-1052-4d20-b5b6-7b55c86d578a req-84aaf189-d56a-4630-a5bd-007f75806863 4dd0467bc60c4fcc8357de40a955e9e4 1b9a5cf38c7e4c52baa224d3c9813fb4 - - default default] Security group rule updated ['d9fa9f91-aa0c-43b6-a7c4-fb37a7515c30']#033[00m Dec 6 05:16:37 localhost ovn_controller[131684]: 2025-12-06T10:16:37Z|00108|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:16:37 localhost nova_compute[237281]: 2025-12-06 10:16:37.755 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:37 localhost nova_compute[237281]: 2025-12-06 10:16:37.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:16:38 localhost nova_compute[237281]: 2025-12-06 10:16:38.578 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20865 DF PROTO=TCP SPT=42226 DPT=9102 SEQ=4149398504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD821880000000001030307) Dec 6 05:16:39 localhost nova_compute[237281]: 2025-12-06 10:16:39.782 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:39 localhost nova_compute[237281]: 2025-12-06 10:16:39.907 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:16:40 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:40.503 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:16:40 localhost nova_compute[237281]: 2025-12-06 10:16:40.503 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:40 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:40.505 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:16:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:16:42 localhost podman[246620]: 2025-12-06 10:16:42.547708421 +0000 UTC m=+0.084260151 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41) Dec 6 05:16:42 localhost podman[246620]: 2025-12-06 10:16:42.565317545 +0000 UTC m=+0.101869255 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, 
container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, managed_by=edpm_ansible, version=9.6, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, 
io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 6 05:16:42 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:16:43 localhost nova_compute[237281]: 2025-12-06 10:16:43.581 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:44 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:16:44.098 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:23Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=51683506-5f1b-40d6-b898-12a44611e923, ip_allocation=immediate, mac_address=fa:16:3e:4d:48:58, name=tempest-parent-745716054, network_id=9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, port_security_enabled=True, project_id=d4ffab9068e64ee89c49785c5f76ecd3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=16, security_groups=['5d02d56e-c6b2-47de-b5f5-53b4d437a7c2'], standard_attr_id=367, status=DOWN, tags=[], tenant_id=d4ffab9068e64ee89c49785c5f76ecd3, trunk_details=sub_ports=[], trunk_id=dc442437-9f4f-446d-b004-588e5c07d3e0, updated_at=2025-12-06T10:16:43Z on network 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9#033[00m Dec 6 05:16:44 localhost dnsmasq[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/addn_hosts - 2 addresses Dec 6 05:16:44 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/host 
Dec 6 05:16:44 localhost podman[246656]: 2025-12-06 10:16:44.324555777 +0000 UTC m=+0.062991474 container kill 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Dec 6 05:16:44 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/opts Dec 6 05:16:44 localhost systemd[1]: tmp-crun.vO0NId.mount: Deactivated successfully. Dec 6 05:16:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:16:44 localhost podman[246670]: 2025-12-06 10:16:44.442426814 +0000 UTC m=+0.082906039 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 05:16:44 localhost podman[246670]: 2025-12-06 10:16:44.452239426 +0000 UTC m=+0.092718661 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:16:44 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:16:44 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:16:44.685 219384 INFO neutron.agent.dhcp.agent [None req-fa9c9848-a9f4-4c52-b582-614235e5eb0e - - - - - -] DHCP configuration for ports {'51683506-5f1b-40d6-b898-12a44611e923'} is completed#033[00m Dec 6 05:16:44 localhost nova_compute[237281]: 2025-12-06 10:16:44.811 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:44 localhost nova_compute[237281]: 2025-12-06 10:16:44.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:16:44 localhost nova_compute[237281]: 2025-12-06 10:16:44.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:16:45 localhost nova_compute[237281]: 2025-12-06 10:16:45.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:16:45 localhost nova_compute[237281]: 2025-12-06 10:16:45.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:16:45 localhost snmpd[56894]: empty variable list in _query Dec 6 05:16:45 localhost snmpd[56894]: empty variable list in _query Dec 6 05:16:45 localhost snmpd[56894]: empty variable list in _query Dec 6 05:16:46 localhost openstack_network_exporter[199751]: ERROR 10:16:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:16:46 localhost openstack_network_exporter[199751]: ERROR 10:16:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:16:46 localhost openstack_network_exporter[199751]: ERROR 10:16:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:16:46 localhost openstack_network_exporter[199751]: ERROR 10:16:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:16:46 localhost openstack_network_exporter[199751]: Dec 6 05:16:46 localhost openstack_network_exporter[199751]: ERROR 10:16:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:16:46 localhost openstack_network_exporter[199751]: Dec 6 05:16:46 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:46.507 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:16:46 localhost nova_compute[237281]: 2025-12-06 10:16:46.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:16:46 localhost nova_compute[237281]: 2025-12-06 10:16:46.884 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:16:48 localhost nova_compute[237281]: 2025-12-06 10:16:48.638 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:48 localhost nova_compute[237281]: 2025-12-06 10:16:48.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:16:48 localhost nova_compute[237281]: 2025-12-06 10:16:48.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 6 05:16:49 localhost nova_compute[237281]: 2025-12-06 10:16:49.089 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 6 05:16:49 localhost nova_compute[237281]: 2025-12-06 10:16:49.855 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:50 localhost nova_compute[237281]: 2025-12-06 10:16:50.090 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:16:50 localhost nova_compute[237281]: 2025-12-06 10:16:50.091 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:16:50 localhost nova_compute[237281]: 2025-12-06 10:16:50.091 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:16:50 localhost nova_compute[237281]: 2025-12-06 10:16:50.659 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:16:50 localhost nova_compute[237281]: 2025-12-06 10:16:50.659 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:16:50 localhost nova_compute[237281]: 2025-12-06 10:16:50.660 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:16:50 localhost nova_compute[237281]: 2025-12-06 10:16:50.660 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m 
Dec 6 05:16:51 localhost neutron_sriov_agent[212548]: 2025-12-06 10:16:51.414 2 INFO neutron.agent.securitygroups_rpc [req-193ee4b9-5f0f-4204-a0f9-b91f84dacbef req-4faa9f0b-b3e3-4676-b400-d3d654ce80de 4dd0467bc60c4fcc8357de40a955e9e4 1b9a5cf38c7e4c52baa224d3c9813fb4 - - default default] Security group rule updated ['fe6c94e1-cad0-4c51-9225-626815d99501']
Dec 6 05:16:51 localhost neutron_sriov_agent[212548]: 2025-12-06 10:16:51.873 2 INFO neutron.agent.securitygroups_rpc [None req-072b8c35-238f-45ab-9934-ffad83657174 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Security group member updated ['5d02d56e-c6b2-47de-b5f5-53b4d437a7c2']
Dec 6 05:16:52 localhost podman[246714]: 2025-12-06 10:16:52.183039396 +0000 UTC m=+0.057713491 container kill dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ea5020da-f370-47b0-b6c4-a1f36329a7ad, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 6 05:16:52 localhost dnsmasq[245739]: read /var/lib/neutron/dhcp/ea5020da-f370-47b0-b6c4-a1f36329a7ad/addn_hosts - 0 addresses
Dec 6 05:16:52 localhost dnsmasq-dhcp[245739]: read /var/lib/neutron/dhcp/ea5020da-f370-47b0-b6c4-a1f36329a7ad/host
Dec 6 05:16:52 localhost dnsmasq-dhcp[245739]: read /var/lib/neutron/dhcp/ea5020da-f370-47b0-b6c4-a1f36329a7ad/opts
Dec 6 05:16:53 localhost podman[197801]: time="2025-12-06T10:16:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 6 05:16:53 localhost podman[197801]: @ - - [06/Dec/2025:10:16:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149620 "" "Go-http-client/1.1"
Dec 6 05:16:53 localhost podman[197801]: @ - - [06/Dec/2025:10:16:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17389 "" "Go-http-client/1.1"
Dec 6 05:16:53 localhost nova_compute[237281]: 2025-12-06 10:16:53.641 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:16:53 localhost neutron_sriov_agent[212548]: 2025-12-06 10:16:53.658 2 INFO neutron.agent.securitygroups_rpc [req-bb0bc4a0-cc16-4fab-8226-cfe87f9e2d9a req-40d79712-684b-49d4-8979-2dfaed762b5e 4dd0467bc60c4fcc8357de40a955e9e4 1b9a5cf38c7e4c52baa224d3c9813fb4 - - default default] Security group rule updated ['fe6c94e1-cad0-4c51-9225-626815d99501']
Dec 6 05:16:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61217 DF PROTO=TCP SPT=53090 DPT=9102 SEQ=3782954089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD859FA0000000001030307)
Dec 6 05:16:54 localhost dnsmasq[245739]: exiting on receipt of SIGTERM
Dec 6 05:16:54 localhost podman[246751]: 2025-12-06 10:16:54.005963944 +0000 UTC m=+0.063307013 container kill dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ea5020da-f370-47b0-b6c4-a1f36329a7ad, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 6 05:16:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.
Dec 6 05:16:54 localhost systemd[1]: libpod-dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f.scope: Deactivated successfully.
Dec 6 05:16:54 localhost podman[246765]: 2025-12-06 10:16:54.083309761 +0000 UTC m=+0.061586021 container died dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ea5020da-f370-47b0-b6c4-a1f36329a7ad, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 6 05:16:54 localhost systemd[1]: tmp-crun.EPddUz.mount: Deactivated successfully.
Dec 6 05:16:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f-userdata-shm.mount: Deactivated successfully.
Dec 6 05:16:54 localhost podman[246765]: 2025-12-06 10:16:54.119386564 +0000 UTC m=+0.097662784 container cleanup dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ea5020da-f370-47b0-b6c4-a1f36329a7ad, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 6 05:16:54 localhost systemd[1]: libpod-conmon-dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f.scope: Deactivated successfully.
Dec 6 05:16:54 localhost podman[246767]: 2025-12-06 10:16:54.157718977 +0000 UTC m=+0.130501988 container remove dc08bc4438e7db9fd6566b92a5002ac1539930c5948b7769b4b2b820fc8aff2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ea5020da-f370-47b0-b6c4-a1f36329a7ad, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:16:54 localhost nova_compute[237281]: 2025-12-06 10:16:54.168 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:16:54 localhost kernel: device tapae282abb-34 left promiscuous mode
Dec 6 05:16:54 localhost ovn_controller[131684]: 2025-12-06T10:16:54Z|00109|binding|INFO|Releasing lport ae282abb-349f-41f0-995b-0dbe9188b00f from this chassis (sb_readonly=0)
Dec 6 05:16:54 localhost ovn_controller[131684]: 2025-12-06T10:16:54Z|00110|binding|INFO|Setting lport ae282abb-349f-41f0-995b-0dbe9188b00f down in Southbound
Dec 6 05:16:54 localhost nova_compute[237281]: 2025-12-06 10:16:54.183 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:16:54 localhost nova_compute[237281]: 2025-12-06 10:16:54.186 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:16:54 localhost podman[246773]: 2025-12-06 10:16:54.22591847 +0000 UTC m=+0.189331782 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 6 05:16:54 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:54.242 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-ea5020da-f370-47b0-b6c4-a1f36329a7ad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea5020da-f370-47b0-b6c4-a1f36329a7ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4ffab9068e64ee89c49785c5f76ecd3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39fd89f7-b191-45fe-9a9a-73b9abccf180, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ae282abb-349f-41f0-995b-0dbe9188b00f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 6 05:16:54 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:54.245 137259 INFO neutron.agent.ovn.metadata.agent [-] Port ae282abb-349f-41f0-995b-0dbe9188b00f in datapath ea5020da-f370-47b0-b6c4-a1f36329a7ad unbound from our chassis
Dec 6 05:16:54 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:54.249 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ea5020da-f370-47b0-b6c4-a1f36329a7ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 6 05:16:54 localhost ovn_metadata_agent[137254]: 2025-12-06 10:16:54.251 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[5d3cef48-f0bf-4681-ac67-154e6b9cc235]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 6 05:16:54 localhost podman[246773]: 2025-12-06 10:16:54.263237731 +0000 UTC m=+0.226651083 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:16:54 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully.
Dec 6 05:16:54 localhost nova_compute[237281]: 2025-12-06 10:16:54.860 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:16:55 localhost systemd[1]: var-lib-containers-storage-overlay-3d12c313be28d53f2adece6783f26a7003d69099d210059d75e16e9fa0a9f1bc-merged.mount: Deactivated successfully.
Dec 6 05:16:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61218 DF PROTO=TCP SPT=53090 DPT=9102 SEQ=3782954089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD85E070000000001030307)
Dec 6 05:16:55 localhost systemd[1]: run-netns-qdhcp\x2dea5020da\x2df370\x2d47b0\x2db6c4\x2da1f36329a7ad.mount: Deactivated successfully.
Dec 6 05:16:55 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:16:55.159 219384 INFO neutron.agent.dhcp.agent [None req-0ea922ba-2edb-409d-8a9f-04b5dc700e5e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 6 05:16:55 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:16:55.160 219384 INFO neutron.agent.dhcp.agent [None req-0ea922ba-2edb-409d-8a9f-04b5dc700e5e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 6 05:16:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20866 DF PROTO=TCP SPT=42226 DPT=9102 SEQ=4149398504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD861870000000001030307)
Dec 6 05:16:55 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:16:55.968 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.490 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.523 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.523 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.524 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.525 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.671 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.672 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.672 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.673 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 6 05:16:56 localhost ovn_controller[131684]: 2025-12-06T10:16:56Z|00111|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0)
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.766 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.870 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.945 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 6 05:16:56 localhost nova_compute[237281]: 2025-12-06 10:16:56.946 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 6 05:16:57 localhost nova_compute[237281]: 2025-12-06 10:16:57.001 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 6 05:16:57 localhost nova_compute[237281]: 2025-12-06 10:16:57.003 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 6 05:16:57 localhost nova_compute[237281]: 2025-12-06 10:16:57.046 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 6 05:16:57 localhost nova_compute[237281]: 2025-12-06 10:16:57.047 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 6 05:16:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61219 DF PROTO=TCP SPT=53090 DPT=9102 SEQ=3782954089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD866070000000001030307)
Dec 6 05:16:57 localhost nova_compute[237281]: 2025-12-06 10:16:57.126 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 6 05:16:57 localhost nova_compute[237281]: 2025-12-06 10:16:57.300 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 6 05:16:57 localhost nova_compute[237281]: 2025-12-06 10:16:57.302 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12324MB free_disk=387.2665824890137GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 6 05:16:57 localhost nova_compute[237281]: 2025-12-06 10:16:57.302 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 6 05:16:57 localhost nova_compute[237281]: 2025-12-06 10:16:57.302 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 6 05:16:57 localhost nova_compute[237281]: 2025-12-06 10:16:57.977 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 6 05:16:57 localhost nova_compute[237281]: 2025-12-06 10:16:57.978 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 6 05:16:57 localhost nova_compute[237281]: 2025-12-06 10:16:57.978 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 6 05:16:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51825 DF PROTO=TCP SPT=58764 DPT=9102 SEQ=4170017105 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD869870000000001030307)
Dec 6 05:16:58 localhost neutron_sriov_agent[212548]: 2025-12-06 10:16:58.002 2 INFO neutron.agent.securitygroups_rpc [None req-33a36224-09d9-4d85-83cc-b7cd51163845 21240df215a544878bd6d4c5ec47594a d4ffab9068e64ee89c49785c5f76ecd3 - - default default] Security group member updated ['5d02d56e-c6b2-47de-b5f5-53b4d437a7c2']
Dec 6 05:16:58 localhost dnsmasq[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/addn_hosts - 1 addresses
Dec 6 05:16:58 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/host
Dec 6 05:16:58 localhost podman[246849]: 2025-12-06 10:16:58.392811341 +0000 UTC m=+0.060477496 container kill 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:16:58 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/opts
Dec 6 05:16:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.
Dec 6 05:16:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.
Dec 6 05:16:58 localhost nova_compute[237281]: 2025-12-06 10:16:58.438 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing inventories for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 6 05:16:58 localhost podman[246863]: 2025-12-06 10:16:58.516137056 +0000 UTC m=+0.086437167 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 6 05:16:58 localhost podman[246863]: 2025-12-06 10:16:58.52629806 +0000 UTC m=+0.096598191 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 6 05:16:58 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully.
Dec 6 05:16:58 localhost systemd[1]: tmp-crun.ezGk74.mount: Deactivated successfully.
Dec 6 05:16:58 localhost podman[246864]: 2025-12-06 10:16:58.631884487 +0000 UTC m=+0.196858784 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 6 05:16:58 localhost nova_compute[237281]: 2025-12-06 10:16:58.644 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:16:58 localhost podman[246864]: 2025-12-06 10:16:58.649528841 +0000 UTC m=+0.214503148 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 6 05:16:58 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully.
Dec 6 05:16:58 localhost nova_compute[237281]: 2025-12-06 10:16:58.719 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating ProviderTree inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 6 05:16:58 localhost nova_compute[237281]: 2025-12-06 10:16:58.720 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 6 05:16:58 localhost nova_compute[237281]: 2025-12-06 10:16:58.756 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing aggregate associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 6 05:16:58 localhost
nova_compute[237281]: 2025-12-06 10:16:58.806 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing trait associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_ABM,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 6 05:16:58 localhost neutron_sriov_agent[212548]: 2025-12-06 10:16:58.877 2 INFO neutron.agent.securitygroups_rpc [req-16c4e2e4-e165-432c-ab4d-c628d0422ef5 req-cd9f01b7-6f09-4e32-a61a-5502195ce6c2 4dd0467bc60c4fcc8357de40a955e9e4 
1b9a5cf38c7e4c52baa224d3c9813fb4 - - default default] Security group rule updated ['fe6c94e1-cad0-4c51-9225-626815d99501']#033[00m Dec 6 05:16:58 localhost nova_compute[237281]: 2025-12-06 10:16:58.881 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:16:58 localhost nova_compute[237281]: 2025-12-06 10:16:58.900 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:16:58 localhost nova_compute[237281]: 2025-12-06 10:16:58.903 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:16:58 localhost nova_compute[237281]: 2025-12-06 10:16:58.904 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:16:58 localhost nova_compute[237281]: 
2025-12-06 10:16:58.904 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:16:58 localhost nova_compute[237281]: 2025-12-06 10:16:58.905 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 6 05:16:59 localhost nova_compute[237281]: 2025-12-06 10:16:59.268 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:16:59 localhost nova_compute[237281]: 2025-12-06 10:16:59.863 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:00 localhost nova_compute[237281]: 2025-12-06 10:17:00.041 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:00 localhost dnsmasq[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/addn_hosts - 0 addresses Dec 6 05:17:00 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/host Dec 6 05:17:00 localhost podman[246927]: 2025-12-06 10:17:00.0582283 +0000 UTC m=+0.063703456 container kill 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:17:00 localhost dnsmasq-dhcp[245197]: read /var/lib/neutron/dhcp/9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9/opts Dec 6 05:17:00 localhost ovn_controller[131684]: 2025-12-06T10:17:00Z|00112|binding|INFO|Releasing lport e48b7399-3b56-4775-9776-caf16d42ea9c from this chassis (sb_readonly=0) Dec 6 05:17:00 localhost kernel: device tape48b7399-3b left promiscuous mode Dec 6 05:17:00 localhost ovn_controller[131684]: 2025-12-06T10:17:00Z|00113|binding|INFO|Setting lport e48b7399-3b56-4775-9776-caf16d42ea9c down in Southbound Dec 6 05:17:00 localhost nova_compute[237281]: 2025-12-06 10:17:00.597 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:00 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:00.615 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4ffab9068e64ee89c49785c5f76ecd3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 
'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28349eaf-1dbf-4bc7-ae61-616696dcb1a3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e48b7399-3b56-4775-9776-caf16d42ea9c) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:17:00 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:00.617 137259 INFO neutron.agent.ovn.metadata.agent [-] Port e48b7399-3b56-4775-9776-caf16d42ea9c in datapath 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9 unbound from our chassis#033[00m Dec 6 05:17:00 localhost nova_compute[237281]: 2025-12-06 10:17:00.621 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:00 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:00.622 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:17:00 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:00.623 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d047a5-ef1d-4745-b3d2-c478a8e9f120]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:17:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61220 DF PROTO=TCP SPT=53090 DPT=9102 SEQ=3782954089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD875C70000000001030307) Dec 6 05:17:03 localhost nova_compute[237281]: 2025-12-06 10:17:03.647 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:17:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:17:04 localhost podman[246950]: 2025-12-06 10:17:04.558906868 +0000 UTC m=+0.085426256 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm) Dec 6 05:17:04 localhost podman[246950]: 2025-12-06 10:17:04.575193681 +0000 UTC m=+0.101713109 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true) Dec 6 05:17:04 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:17:04 localhost podman[246949]: 2025-12-06 10:17:04.67078995 +0000 UTC m=+0.194892723 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 05:17:04 localhost podman[246949]: 2025-12-06 10:17:04.706503732 +0000 UTC m=+0.230606495 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:17:04 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:17:04 localhost nova_compute[237281]: 2025-12-06 10:17:04.894 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:17:06 localhost nova_compute[237281]: 2025-12-06 10:17:06.351 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:17:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:06.700 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 6 05:17:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:06.701 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 6 05:17:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:06.701 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 6 05:17:08 localhost ovn_controller[131684]: 2025-12-06T10:17:08Z|00114|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0)
Dec 6 05:17:08 localhost nova_compute[237281]: 2025-12-06 10:17:08.445 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:17:08 localhost nova_compute[237281]: 2025-12-06 10:17:08.649 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:17:08 localhost nova_compute[237281]: 2025-12-06 10:17:08.807 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:17:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61221 DF PROTO=TCP SPT=53090 DPT=9102 SEQ=3782954089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD895870000000001030307)
Dec 6 05:17:09 localhost podman[247001]: 2025-12-06 10:17:09.315693438 +0000 UTC m=+0.059480816 container kill 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 6 05:17:09 localhost dnsmasq[245197]: exiting on receipt of SIGTERM
Dec 6 05:17:09 localhost systemd[1]: libpod-0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0.scope: Deactivated successfully.
Dec 6 05:17:09 localhost podman[247015]: 2025-12-06 10:17:09.389470884 +0000 UTC m=+0.058917448 container died 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 6 05:17:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0-userdata-shm.mount: Deactivated successfully.
Dec 6 05:17:09 localhost systemd[1]: var-lib-containers-storage-overlay-05a6bbf6a87d11a772fb4b237ab8126c3b1907e544497438af13e31f78927959-merged.mount: Deactivated successfully.
Dec 6 05:17:09 localhost podman[247015]: 2025-12-06 10:17:09.42596141 +0000 UTC m=+0.095407934 container cleanup 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:17:09 localhost systemd[1]: libpod-conmon-0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0.scope: Deactivated successfully.
Dec 6 05:17:09 localhost sshd[247041]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 05:17:09 localhost podman[247017]: 2025-12-06 10:17:09.487586271 +0000 UTC m=+0.144752937 container remove 0d712da0e69d756bd77e1b9b1583e499089671b34081fd07b9947a0dfd0919a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9d3dc0c9-c5ba-4357-8485-3e7e8def3fd9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 6 05:17:09 localhost nova_compute[237281]: 2025-12-06 10:17:09.803 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:17:09 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:09.858 219384 INFO neutron.agent.dhcp.agent [None req-15706aef-ab37-46b8-aaed-6d046af4d36d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Dec 6 05:17:09 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:09.859 219384 INFO neutron.agent.dhcp.agent [None req-15706aef-ab37-46b8-aaed-6d046af4d36d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Dec 6 05:17:09 localhost nova_compute[237281]: 2025-12-06 10:17:09.895 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:17:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:10.156 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Dec 6 05:17:10 localhost systemd[1]: run-netns-qdhcp\x2d9d3dc0c9\x2dc5ba\x2d4357\x2d8485\x2d3e7e8def3fd9.mount: Deactivated successfully.
Dec 6 05:17:10 localhost neutron_sriov_agent[212548]: 2025-12-06 10:17:10.882 2 INFO neutron.agent.securitygroups_rpc [req-a59d7aaf-23f2-4b87-80a3-7ee07a980fe4 req-a2088af3-2ad7-432c-bd69-884c457b049f 1528bfcf45fa48d1b3774bad40c11e11 3eaea51aeb6e45d8b2b6b0d628efdd66 - - default default] Security group rule updated ['0dc1cd05-44db-4d4e-9f28-c104c6dfb4c0']#033[00m
Dec 6 05:17:12 localhost podman[247061]: 2025-12-06 10:17:12.769929963 +0000 UTC m=+0.072606890 container kill e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e292c11-f020-42d8-9e52-a14ca36d70ab, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 6 05:17:12 localhost dnsmasq[245457]: read /var/lib/neutron/dhcp/9e292c11-f020-42d8-9e52-a14ca36d70ab/addn_hosts - 0 addresses
Dec 6 05:17:12 localhost dnsmasq-dhcp[245457]: read /var/lib/neutron/dhcp/9e292c11-f020-42d8-9e52-a14ca36d70ab/host
Dec 6 05:17:12 localhost dnsmasq-dhcp[245457]: read /var/lib/neutron/dhcp/9e292c11-f020-42d8-9e52-a14ca36d70ab/opts
Dec 6 05:17:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.
Dec 6 05:17:12 localhost podman[247076]: 2025-12-06 10:17:12.887408357 +0000 UTC m=+0.084079445 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Dec 6 05:17:12 localhost podman[247076]: 2025-12-06 10:17:12.904442704 +0000 UTC m=+0.101113772 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 6 05:17:12 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully.
Dec 6 05:17:13 localhost kernel: device tapc4f64f5d-e8 left promiscuous mode Dec 6 05:17:13 localhost nova_compute[237281]: 2025-12-06 10:17:13.146 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:13 localhost ovn_controller[131684]: 2025-12-06T10:17:13Z|00115|binding|INFO|Releasing lport c4f64f5d-e8fe-4f9a-8027-e9ea7228536e from this chassis (sb_readonly=0) Dec 6 05:17:13 localhost ovn_controller[131684]: 2025-12-06T10:17:13Z|00116|binding|INFO|Setting lport c4f64f5d-e8fe-4f9a-8027-e9ea7228536e down in Southbound Dec 6 05:17:13 localhost ovn_controller[131684]: 2025-12-06T10:17:13Z|00117|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:17:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:13.160 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-9e292c11-f020-42d8-9e52-a14ca36d70ab', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e292c11-f020-42d8-9e52-a14ca36d70ab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f32feb968b74693a394964324f981bf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=21e98fa4-639b-44b6-8829-27eb9ee109d4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c4f64f5d-e8fe-4f9a-8027-e9ea7228536e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:17:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:13.163 137259 INFO neutron.agent.ovn.metadata.agent [-] Port c4f64f5d-e8fe-4f9a-8027-e9ea7228536e in datapath 9e292c11-f020-42d8-9e52-a14ca36d70ab unbound from our chassis#033[00m Dec 6 05:17:13 localhost nova_compute[237281]: 2025-12-06 10:17:13.163 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:13.166 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9e292c11-f020-42d8-9e52-a14ca36d70ab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:17:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:13.167 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[5740e0d4-aec3-4656-a260-a88762436897]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:17:13 localhost nova_compute[237281]: 2025-12-06 10:17:13.181 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:13 localhost nova_compute[237281]: 2025-12-06 10:17:13.188 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:13 localhost nova_compute[237281]: 2025-12-06 10:17:13.651 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 
05:17:14 localhost nova_compute[237281]: 2025-12-06 10:17:14.898 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:17:15 localhost systemd[1]: tmp-crun.3lnQ98.mount: Deactivated successfully. Dec 6 05:17:15 localhost podman[247103]: 2025-12-06 10:17:15.55519122 +0000 UTC m=+0.087884483 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:17:15 localhost podman[247103]: 2025-12-06 10:17:15.56559433 +0000 UTC m=+0.098287563 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:17:15 localhost systemd[1]: 
979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:17:15 localhost sshd[247128]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:17:16 localhost openstack_network_exporter[199751]: ERROR 10:17:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:17:16 localhost openstack_network_exporter[199751]: ERROR 10:17:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:17:16 localhost openstack_network_exporter[199751]: ERROR 10:17:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:17:16 localhost openstack_network_exporter[199751]: ERROR 10:17:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:17:16 localhost openstack_network_exporter[199751]: Dec 6 05:17:16 localhost openstack_network_exporter[199751]: ERROR 10:17:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:17:16 localhost openstack_network_exporter[199751]: Dec 6 05:17:17 localhost nova_compute[237281]: 2025-12-06 10:17:17.571 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:18 localhost ovn_controller[131684]: 2025-12-06T10:17:18Z|00118|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:17:18 localhost nova_compute[237281]: 2025-12-06 10:17:18.282 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:18 localhost neutron_sriov_agent[212548]: 2025-12-06 10:17:18.468 2 INFO neutron.agent.securitygroups_rpc [req-ddfc61bd-f01b-47ff-9dd7-3afa59379f24 req-2e98e37b-5d24-451b-a2bf-63d76508720c 1528bfcf45fa48d1b3774bad40c11e11 
3eaea51aeb6e45d8b2b6b0d628efdd66 - - default default] Security group rule updated ['0dc1cd05-44db-4d4e-9f28-c104c6dfb4c0']#033[00m Dec 6 05:17:18 localhost nova_compute[237281]: 2025-12-06 10:17:18.682 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:18 localhost systemd[1]: tmp-crun.ga5CTW.mount: Deactivated successfully. Dec 6 05:17:18 localhost dnsmasq[245457]: exiting on receipt of SIGTERM Dec 6 05:17:18 localhost podman[247147]: 2025-12-06 10:17:18.936418483 +0000 UTC m=+0.074411667 container kill e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e292c11-f020-42d8-9e52-a14ca36d70ab, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 05:17:18 localhost systemd[1]: libpod-e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e.scope: Deactivated successfully. 
Dec 6 05:17:19 localhost podman[247160]: 2025-12-06 10:17:19.012757628 +0000 UTC m=+0.062692675 container died e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e292c11-f020-42d8-9e52-a14ca36d70ab, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Dec 6 05:17:19 localhost podman[247160]: 2025-12-06 10:17:19.045195799 +0000 UTC m=+0.095130826 container cleanup e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e292c11-f020-42d8-9e52-a14ca36d70ab, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 05:17:19 localhost systemd[1]: libpod-conmon-e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e.scope: Deactivated successfully. 
Dec 6 05:17:19 localhost podman[247162]: 2025-12-06 10:17:19.099751812 +0000 UTC m=+0.139280988 container remove e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e292c11-f020-42d8-9e52-a14ca36d70ab, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 6 05:17:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:19.131 219384 INFO neutron.agent.dhcp.agent [None req-0ffb7785-aaf7-4efc-8865-a165e8cb0cc7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:17:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:19.323 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:17:19 localhost nova_compute[237281]: 2025-12-06 10:17:19.902 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:19 localhost systemd[1]: var-lib-containers-storage-overlay-43cf96b82e564114a6015e338ea9afa82422eaed58aefc2b2331a1d87142e145-merged.mount: Deactivated successfully. Dec 6 05:17:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e5da51ae508db62b6581cbe9cc58ff185953f911795c4218768d216a2d80893e-userdata-shm.mount: Deactivated successfully. Dec 6 05:17:19 localhost systemd[1]: run-netns-qdhcp\x2d9e292c11\x2df020\x2d42d8\x2d9e52\x2da14ca36d70ab.mount: Deactivated successfully. 
Dec 6 05:17:20 localhost ovn_controller[131684]: 2025-12-06T10:17:20Z|00119|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:17:20 localhost nova_compute[237281]: 2025-12-06 10:17:20.611 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:23 localhost podman[197801]: time="2025-12-06T10:17:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:17:23 localhost podman[197801]: @ - - [06/Dec/2025:10:17:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:17:23 localhost podman[197801]: @ - - [06/Dec/2025:10:17:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15962 "" "Go-http-client/1.1" Dec 6 05:17:23 localhost nova_compute[237281]: 2025-12-06 10:17:23.685 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2204 DF PROTO=TCP SPT=56118 DPT=9102 SEQ=229835423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD8CF290000000001030307) Dec 6 05:17:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:17:24 localhost podman[247190]: 2025-12-06 10:17:24.573401357 +0000 UTC m=+0.102085091 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 6 05:17:24 localhost podman[247190]: 2025-12-06 10:17:24.652367253 +0000 UTC m=+0.181050977 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true) Dec 6 05:17:24 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:17:24 localhost ovn_controller[131684]: 2025-12-06T10:17:24Z|00120|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:17:24 localhost nova_compute[237281]: 2025-12-06 10:17:24.905 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:24 localhost nova_compute[237281]: 2025-12-06 10:17:24.937 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2205 DF PROTO=TCP SPT=56118 DPT=9102 SEQ=229835423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD8D3470000000001030307) Dec 6 05:17:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61222 DF PROTO=TCP SPT=53090 DPT=9102 SEQ=3782954089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD8D5870000000001030307) Dec 6 05:17:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2206 DF PROTO=TCP SPT=56118 DPT=9102 SEQ=229835423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD8DB480000000001030307) Dec 6 05:17:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20867 DF PROTO=TCP SPT=42226 DPT=9102 SEQ=4149398504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD8DF880000000001030307) Dec 6 05:17:28 localhost nova_compute[237281]: 2025-12-06 
10:17:28.688 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:17:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:17:29 localhost podman[247215]: 2025-12-06 10:17:29.557225561 +0000 UTC m=+0.088226803 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:17:29 localhost podman[247215]: 2025-12-06 10:17:29.59835098 +0000 UTC m=+0.129352242 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3) Dec 6 05:17:29 localhost podman[247214]: 2025-12-06 10:17:29.610834375 +0000 UTC m=+0.141643681 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:17:29 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:17:29 localhost podman[247214]: 2025-12-06 10:17:29.615676904 +0000 UTC m=+0.146486200 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:17:29 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:17:29 localhost nova_compute[237281]: 2025-12-06 10:17:29.926 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2207 DF PROTO=TCP SPT=56118 DPT=9102 SEQ=229835423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD8EB070000000001030307) Dec 6 05:17:32 localhost ovn_controller[131684]: 2025-12-06T10:17:32Z|00121|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:17:32 localhost nova_compute[237281]: 2025-12-06 10:17:32.595 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:32 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:32.610 219384 INFO neutron.agent.linux.ip_lib [None req-ef629b44-3a10-49a2-a877-b7a0bc1c099d - - - - - -] Device tapc148b4ef-b6 cannot be used as it has no MAC address#033[00m Dec 6 05:17:32 localhost nova_compute[237281]: 2025-12-06 10:17:32.639 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:32 localhost kernel: device tapc148b4ef-b6 entered promiscuous mode Dec 6 05:17:32 localhost NetworkManager[5965]: [1765016252.6488] manager: (tapc148b4ef-b6): new Generic device (/org/freedesktop/NetworkManager/Devices/24) Dec 6 05:17:32 localhost ovn_controller[131684]: 2025-12-06T10:17:32Z|00122|binding|INFO|Claiming lport c148b4ef-b64b-4c05-9ac4-6ee60efbc89b for this chassis. 
Dec 6 05:17:32 localhost ovn_controller[131684]: 2025-12-06T10:17:32Z|00123|binding|INFO|c148b4ef-b64b-4c05-9ac4-6ee60efbc89b: Claiming unknown Dec 6 05:17:32 localhost nova_compute[237281]: 2025-12-06 10:17:32.652 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:32 localhost systemd-udevd[247265]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:17:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:32.663 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-c182d686-67a7-43f3-b039-29f492fe4232', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c182d686-67a7-43f3-b039-29f492fe4232', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5364bf2edeaf41ae9d4248e56b5ec33d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9911cb6-21a7-4062-a48f-50af80ca1cac, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c148b4ef-b64b-4c05-9ac4-6ee60efbc89b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:17:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:32.666 137259 INFO 
neutron.agent.ovn.metadata.agent [-] Port c148b4ef-b64b-4c05-9ac4-6ee60efbc89b in datapath c182d686-67a7-43f3-b039-29f492fe4232 bound to our chassis#033[00m Dec 6 05:17:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:32.668 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port c0baa25f-9191-4009-a79e-809d1fae655f IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:17:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:32.669 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c182d686-67a7-43f3-b039-29f492fe4232, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:17:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:32.670 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[1f3406ea-abed-49e7-9e8c-76e2f9582df3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:17:32 localhost journal[186952]: ethtool ioctl error on tapc148b4ef-b6: No such device Dec 6 05:17:32 localhost ovn_controller[131684]: 2025-12-06T10:17:32Z|00124|binding|INFO|Setting lport c148b4ef-b64b-4c05-9ac4-6ee60efbc89b ovn-installed in OVS Dec 6 05:17:32 localhost ovn_controller[131684]: 2025-12-06T10:17:32Z|00125|binding|INFO|Setting lport c148b4ef-b64b-4c05-9ac4-6ee60efbc89b up in Southbound Dec 6 05:17:32 localhost nova_compute[237281]: 2025-12-06 10:17:32.682 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:32 localhost journal[186952]: ethtool ioctl error on tapc148b4ef-b6: No such device Dec 6 05:17:32 localhost journal[186952]: ethtool ioctl error on tapc148b4ef-b6: No such device Dec 6 05:17:32 localhost journal[186952]: ethtool ioctl error on tapc148b4ef-b6: 
No such device Dec 6 05:17:32 localhost journal[186952]: ethtool ioctl error on tapc148b4ef-b6: No such device Dec 6 05:17:32 localhost journal[186952]: ethtool ioctl error on tapc148b4ef-b6: No such device Dec 6 05:17:32 localhost journal[186952]: ethtool ioctl error on tapc148b4ef-b6: No such device Dec 6 05:17:32 localhost journal[186952]: ethtool ioctl error on tapc148b4ef-b6: No such device Dec 6 05:17:32 localhost nova_compute[237281]: 2025-12-06 10:17:32.734 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:32 localhost nova_compute[237281]: 2025-12-06 10:17:32.767 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:33 localhost nova_compute[237281]: 2025-12-06 10:17:33.691 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:33 localhost podman[247336]: Dec 6 05:17:33 localhost podman[247336]: 2025-12-06 10:17:33.751334122 +0000 UTC m=+0.100574364 container create 27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c182d686-67a7-43f3-b039-29f492fe4232, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Dec 6 05:17:33 localhost systemd[1]: Started libpod-conmon-27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294.scope. 
Dec 6 05:17:33 localhost podman[247336]: 2025-12-06 10:17:33.703084773 +0000 UTC m=+0.052325015 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:17:33 localhost systemd[1]: Started libcrun container. Dec 6 05:17:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71b1b7f0c34752d12631a3103e7222b2bfc56f6a3e52326353332e818d38381a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:17:33 localhost podman[247336]: 2025-12-06 10:17:33.833581389 +0000 UTC m=+0.182821641 container init 27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c182d686-67a7-43f3-b039-29f492fe4232, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:17:33 localhost podman[247336]: 2025-12-06 10:17:33.843757033 +0000 UTC m=+0.192997285 container start 27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c182d686-67a7-43f3-b039-29f492fe4232, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:17:33 localhost dnsmasq[247354]: started, version 2.85 cachesize 150 Dec 6 05:17:33 localhost dnsmasq[247354]: DNS service limited to local subnets Dec 6 05:17:33 localhost 
dnsmasq[247354]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:17:33 localhost dnsmasq[247354]: warning: no upstream servers configured Dec 6 05:17:33 localhost dnsmasq-dhcp[247354]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:17:33 localhost dnsmasq[247354]: read /var/lib/neutron/dhcp/c182d686-67a7-43f3-b039-29f492fe4232/addn_hosts - 0 addresses Dec 6 05:17:33 localhost dnsmasq-dhcp[247354]: read /var/lib/neutron/dhcp/c182d686-67a7-43f3-b039-29f492fe4232/host Dec 6 05:17:33 localhost dnsmasq-dhcp[247354]: read /var/lib/neutron/dhcp/c182d686-67a7-43f3-b039-29f492fe4232/opts Dec 6 05:17:34 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:34.019 219384 INFO neutron.agent.dhcp.agent [None req-ca288eed-74d4-409a-b80f-f2482a1b91de - - - - - -] DHCP configuration for ports {'c82475d2-49d9-4b4d-88f7-3926929ba067'} is completed#033[00m Dec 6 05:17:34 localhost nova_compute[237281]: 2025-12-06 10:17:34.059 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:34 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:34.969 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:34Z, description=, device_id=2fd8d5f9-4a71-414a-906a-498e584e34d7, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f72646c6-b234-459e-aefb-6f39a13ca5d4, ip_allocation=immediate, mac_address=fa:16:3e:36:3b:a3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:29Z, description=, dns_domain=, 
id=c182d686-67a7-43f3-b039-29f492fe4232, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsNegativeTestJSON-1135061544-network, port_security_enabled=True, project_id=5364bf2edeaf41ae9d4248e56b5ec33d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37843, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=694, status=ACTIVE, subnets=['4f030c9c-9813-4fdb-be06-5f90a8794658'], tags=[], tenant_id=5364bf2edeaf41ae9d4248e56b5ec33d, updated_at=2025-12-06T10:17:30Z, vlan_transparent=None, network_id=c182d686-67a7-43f3-b039-29f492fe4232, port_security_enabled=False, project_id=5364bf2edeaf41ae9d4248e56b5ec33d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=728, status=DOWN, tags=[], tenant_id=5364bf2edeaf41ae9d4248e56b5ec33d, updated_at=2025-12-06T10:17:34Z on network c182d686-67a7-43f3-b039-29f492fe4232#033[00m Dec 6 05:17:34 localhost nova_compute[237281]: 2025-12-06 10:17:34.970 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:35 localhost podman[247370]: 2025-12-06 10:17:35.247308943 +0000 UTC m=+0.054789831 container kill 27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c182d686-67a7-43f3-b039-29f492fe4232, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:17:35 localhost dnsmasq[247354]: read 
/var/lib/neutron/dhcp/c182d686-67a7-43f3-b039-29f492fe4232/addn_hosts - 1 addresses Dec 6 05:17:35 localhost dnsmasq-dhcp[247354]: read /var/lib/neutron/dhcp/c182d686-67a7-43f3-b039-29f492fe4232/host Dec 6 05:17:35 localhost dnsmasq-dhcp[247354]: read /var/lib/neutron/dhcp/c182d686-67a7-43f3-b039-29f492fe4232/opts Dec 6 05:17:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:17:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:17:35 localhost podman[247382]: 2025-12-06 10:17:35.35969011 +0000 UTC m=+0.083019512 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent) Dec 6 05:17:35 localhost podman[247382]: 2025-12-06 10:17:35.367536642 +0000 UTC m=+0.090866074 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Dec 6 05:17:35 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:17:35 localhost podman[247384]: 2025-12-06 10:17:35.433105565 +0000 UTC m=+0.153278659 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
config_id=edpm, org.label-schema.vendor=CentOS) Dec 6 05:17:35 localhost podman[247384]: 2025-12-06 10:17:35.451348248 +0000 UTC m=+0.171521312 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:17:35 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:17:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:35.490 219384 INFO neutron.agent.linux.ip_lib [None req-f5eb0e04-1b0c-4a31-b8c9-0330ea034f6f - - - - - -] Device tap9a858631-01 cannot be used as it has no MAC address#033[00m Dec 6 05:17:35 localhost nova_compute[237281]: 2025-12-06 10:17:35.520 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:35 localhost kernel: device tap9a858631-01 entered promiscuous mode Dec 6 05:17:35 localhost NetworkManager[5965]: [1765016255.5286] manager: (tap9a858631-01): new Generic device (/org/freedesktop/NetworkManager/Devices/25) Dec 6 05:17:35 localhost ovn_controller[131684]: 2025-12-06T10:17:35Z|00126|binding|INFO|Claiming lport 9a858631-0135-4c4b-afc8-daa9607df776 for this chassis. 
Dec 6 05:17:35 localhost ovn_controller[131684]: 2025-12-06T10:17:35Z|00127|binding|INFO|9a858631-0135-4c4b-afc8-daa9607df776: Claiming unknown Dec 6 05:17:35 localhost nova_compute[237281]: 2025-12-06 10:17:35.531 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:35.549 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2aeed871fc4143d987afbd49f963ad0b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03e4fdb9-16ae-470e-934f-3dd544200d9a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9a858631-0135-4c4b-afc8-daa9607df776) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:17:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:35.553 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 9a858631-0135-4c4b-afc8-daa9607df776 in datapath a4529faa-7a12-4a4a-8c72-70fbd8ccd1af 
bound to our chassis#033[00m Dec 6 05:17:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:35.556 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port cf28afd5-e009-4c5e-a1b8-7f927f9ac140 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:17:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:35.556 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:17:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:35.557 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[90c95a35-5680-4648-a08d-f81a0017af83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:17:35 localhost journal[186952]: ethtool ioctl error on tap9a858631-01: No such device Dec 6 05:17:35 localhost journal[186952]: ethtool ioctl error on tap9a858631-01: No such device Dec 6 05:17:35 localhost ovn_controller[131684]: 2025-12-06T10:17:35Z|00128|binding|INFO|Setting lport 9a858631-0135-4c4b-afc8-daa9607df776 ovn-installed in OVS Dec 6 05:17:35 localhost ovn_controller[131684]: 2025-12-06T10:17:35Z|00129|binding|INFO|Setting lport 9a858631-0135-4c4b-afc8-daa9607df776 up in Southbound Dec 6 05:17:35 localhost nova_compute[237281]: 2025-12-06 10:17:35.567 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:35 localhost journal[186952]: ethtool ioctl error on tap9a858631-01: No such device Dec 6 05:17:35 localhost journal[186952]: ethtool ioctl error on tap9a858631-01: No such device Dec 6 05:17:35 localhost journal[186952]: ethtool ioctl error on tap9a858631-01: No such device Dec 6 05:17:35 
localhost journal[186952]: ethtool ioctl error on tap9a858631-01: No such device Dec 6 05:17:35 localhost journal[186952]: ethtool ioctl error on tap9a858631-01: No such device Dec 6 05:17:35 localhost journal[186952]: ethtool ioctl error on tap9a858631-01: No such device Dec 6 05:17:35 localhost nova_compute[237281]: 2025-12-06 10:17:35.616 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:35.618 219384 INFO neutron.agent.dhcp.agent [None req-1cd31073-080c-4fdf-8daf-960c0786e553 - - - - - -] DHCP configuration for ports {'f72646c6-b234-459e-aefb-6f39a13ca5d4'} is completed#033[00m Dec 6 05:17:35 localhost nova_compute[237281]: 2025-12-06 10:17:35.651 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:36 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:36.091 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:34Z, description=, device_id=2fd8d5f9-4a71-414a-906a-498e584e34d7, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f72646c6-b234-459e-aefb-6f39a13ca5d4, ip_allocation=immediate, mac_address=fa:16:3e:36:3b:a3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:29Z, description=, dns_domain=, id=c182d686-67a7-43f3-b039-29f492fe4232, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsNegativeTestJSON-1135061544-network, port_security_enabled=True, project_id=5364bf2edeaf41ae9d4248e56b5ec33d, 
provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37843, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=694, status=ACTIVE, subnets=['4f030c9c-9813-4fdb-be06-5f90a8794658'], tags=[], tenant_id=5364bf2edeaf41ae9d4248e56b5ec33d, updated_at=2025-12-06T10:17:30Z, vlan_transparent=None, network_id=c182d686-67a7-43f3-b039-29f492fe4232, port_security_enabled=False, project_id=5364bf2edeaf41ae9d4248e56b5ec33d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=728, status=DOWN, tags=[], tenant_id=5364bf2edeaf41ae9d4248e56b5ec33d, updated_at=2025-12-06T10:17:34Z on network c182d686-67a7-43f3-b039-29f492fe4232#033[00m Dec 6 05:17:36 localhost dnsmasq[247354]: read /var/lib/neutron/dhcp/c182d686-67a7-43f3-b039-29f492fe4232/addn_hosts - 1 addresses Dec 6 05:17:36 localhost podman[247496]: 2025-12-06 10:17:36.300773803 +0000 UTC m=+0.055785062 container kill 27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c182d686-67a7-43f3-b039-29f492fe4232, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:17:36 localhost dnsmasq-dhcp[247354]: read /var/lib/neutron/dhcp/c182d686-67a7-43f3-b039-29f492fe4232/host Dec 6 05:17:36 localhost dnsmasq-dhcp[247354]: read /var/lib/neutron/dhcp/c182d686-67a7-43f3-b039-29f492fe4232/opts Dec 6 05:17:36 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:36.537 219384 INFO neutron.agent.dhcp.agent [None req-d350651b-48ea-457f-a82e-f6457e0bb9a8 - - - - - -] DHCP configuration for 
ports {'f72646c6-b234-459e-aefb-6f39a13ca5d4'} is completed#033[00m Dec 6 05:17:36 localhost podman[247546]: Dec 6 05:17:36 localhost podman[247546]: 2025-12-06 10:17:36.809190878 +0000 UTC m=+0.095238119 container create 2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 6 05:17:36 localhost systemd[1]: Started libpod-conmon-2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226.scope. Dec 6 05:17:36 localhost podman[247546]: 2025-12-06 10:17:36.764692736 +0000 UTC m=+0.050740047 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:17:36 localhost systemd[1]: Started libcrun container. 
Dec 6 05:17:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49ca37e8ee1fddec698c72233c2fea1c25ae9ef5d9e8ec35857481ac9208be9c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:17:36 localhost podman[247546]: 2025-12-06 10:17:36.900037401 +0000 UTC m=+0.186084662 container init 2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:17:36 localhost podman[247546]: 2025-12-06 10:17:36.911992399 +0000 UTC m=+0.198039660 container start 2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 6 05:17:36 localhost dnsmasq[247565]: started, version 2.85 cachesize 150 Dec 6 05:17:36 localhost dnsmasq[247565]: DNS service limited to local subnets Dec 6 05:17:36 localhost dnsmasq[247565]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:17:36 localhost dnsmasq[247565]: warning: no upstream servers configured Dec 
6 05:17:36 localhost dnsmasq-dhcp[247565]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:17:36 localhost dnsmasq[247565]: read /var/lib/neutron/dhcp/a4529faa-7a12-4a4a-8c72-70fbd8ccd1af/addn_hosts - 0 addresses Dec 6 05:17:36 localhost dnsmasq-dhcp[247565]: read /var/lib/neutron/dhcp/a4529faa-7a12-4a4a-8c72-70fbd8ccd1af/host Dec 6 05:17:36 localhost dnsmasq-dhcp[247565]: read /var/lib/neutron/dhcp/a4529faa-7a12-4a4a-8c72-70fbd8ccd1af/opts Dec 6 05:17:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:37.108 219384 INFO neutron.agent.dhcp.agent [None req-e0dcaf4e-f645-4df1-a18d-c6c6d712abfb - - - - - -] DHCP configuration for ports {'4c4f69e6-76bd-4b09-b9b0-e48085a9099c'} is completed#033[00m Dec 6 05:17:38 localhost nova_compute[237281]: 2025-12-06 10:17:38.058 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:38 localhost nova_compute[237281]: 2025-12-06 10:17:38.730 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2208 DF PROTO=TCP SPT=56118 DPT=9102 SEQ=229835423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD90B880000000001030307) Dec 6 05:17:39 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:39.453 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:39Z, description=, device_id=ab1046d2-1b5f-4205-b504-d9fead83a2df, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], 
id=7ba153d2-7d28-471d-8dac-64dd22c5adb4, ip_allocation=immediate, mac_address=fa:16:3e:25:51:4b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:32Z, description=, dns_domain=, id=a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-877282728-network, port_security_enabled=True, project_id=2aeed871fc4143d987afbd49f963ad0b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33168, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=717, status=ACTIVE, subnets=['0262676d-4973-4687-acbf-a798ccd902ae'], tags=[], tenant_id=2aeed871fc4143d987afbd49f963ad0b, updated_at=2025-12-06T10:17:33Z, vlan_transparent=None, network_id=a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, port_security_enabled=False, project_id=2aeed871fc4143d987afbd49f963ad0b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=743, status=DOWN, tags=[], tenant_id=2aeed871fc4143d987afbd49f963ad0b, updated_at=2025-12-06T10:17:39Z on network a4529faa-7a12-4a4a-8c72-70fbd8ccd1af#033[00m Dec 6 05:17:39 localhost podman[247583]: 2025-12-06 10:17:39.725859259 +0000 UTC m=+0.077173823 container kill 2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:17:39 localhost systemd[1]: tmp-crun.jcXcQ1.mount: 
Deactivated successfully. Dec 6 05:17:39 localhost dnsmasq[247565]: read /var/lib/neutron/dhcp/a4529faa-7a12-4a4a-8c72-70fbd8ccd1af/addn_hosts - 1 addresses Dec 6 05:17:39 localhost dnsmasq-dhcp[247565]: read /var/lib/neutron/dhcp/a4529faa-7a12-4a4a-8c72-70fbd8ccd1af/host Dec 6 05:17:39 localhost dnsmasq-dhcp[247565]: read /var/lib/neutron/dhcp/a4529faa-7a12-4a4a-8c72-70fbd8ccd1af/opts Dec 6 05:17:40 localhost nova_compute[237281]: 2025-12-06 10:17:40.014 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:40.014 219384 INFO neutron.agent.dhcp.agent [None req-be736d82-1e94-48f1-8bf3-902368f02fea - - - - - -] DHCP configuration for ports {'7ba153d2-7d28-471d-8dac-64dd22c5adb4'} is completed#033[00m Dec 6 05:17:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:40.722 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:39Z, description=, device_id=ab1046d2-1b5f-4205-b504-d9fead83a2df, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7ba153d2-7d28-471d-8dac-64dd22c5adb4, ip_allocation=immediate, mac_address=fa:16:3e:25:51:4b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:32Z, description=, dns_domain=, id=a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-877282728-network, port_security_enabled=True, project_id=2aeed871fc4143d987afbd49f963ad0b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33168, qos_policy_id=None, 
revision_number=2, router:external=False, shared=False, standard_attr_id=717, status=ACTIVE, subnets=['0262676d-4973-4687-acbf-a798ccd902ae'], tags=[], tenant_id=2aeed871fc4143d987afbd49f963ad0b, updated_at=2025-12-06T10:17:33Z, vlan_transparent=None, network_id=a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, port_security_enabled=False, project_id=2aeed871fc4143d987afbd49f963ad0b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=743, status=DOWN, tags=[], tenant_id=2aeed871fc4143d987afbd49f963ad0b, updated_at=2025-12-06T10:17:39Z on network a4529faa-7a12-4a4a-8c72-70fbd8ccd1af#033[00m Dec 6 05:17:40 localhost ovn_controller[131684]: 2025-12-06T10:17:40Z|00130|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:17:40 localhost nova_compute[237281]: 2025-12-06 10:17:40.837 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:40 localhost podman[247622]: 2025-12-06 10:17:40.969002361 +0000 UTC m=+0.062655365 container kill 2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 05:17:40 localhost dnsmasq[247565]: read /var/lib/neutron/dhcp/a4529faa-7a12-4a4a-8c72-70fbd8ccd1af/addn_hosts - 1 addresses Dec 6 05:17:40 localhost dnsmasq-dhcp[247565]: read /var/lib/neutron/dhcp/a4529faa-7a12-4a4a-8c72-70fbd8ccd1af/host Dec 6 05:17:40 localhost 
dnsmasq-dhcp[247565]: read /var/lib/neutron/dhcp/a4529faa-7a12-4a4a-8c72-70fbd8ccd1af/opts Dec 6 05:17:41 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:41.274 219384 INFO neutron.agent.dhcp.agent [None req-8dac8490-cf36-401f-910e-fbe43a4823a9 - - - - - -] DHCP configuration for ports {'7ba153d2-7d28-471d-8dac-64dd22c5adb4'} is completed#033[00m Dec 6 05:17:42 localhost nova_compute[237281]: 2025-12-06 10:17:42.288 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:17:42 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:42.367 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:17:42 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:42.368 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:17:42 localhost nova_compute[237281]: 2025-12-06 10:17:42.393 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:17:43 localhost podman[247642]: 2025-12-06 10:17:43.542100201 +0000 UTC m=+0.075839650 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, architecture=x86_64) Dec 6 05:17:43 localhost podman[247642]: 2025-12-06 10:17:43.561363916 +0000 UTC m=+0.095103415 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.) Dec 6 05:17:43 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:17:43 localhost nova_compute[237281]: 2025-12-06 10:17:43.770 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:45 localhost nova_compute[237281]: 2025-12-06 10:17:45.021 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:45 localhost nova_compute[237281]: 2025-12-06 10:17:45.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:17:46 localhost openstack_network_exporter[199751]: ERROR 10:17:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:17:46 localhost openstack_network_exporter[199751]: ERROR 10:17:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:17:46 localhost openstack_network_exporter[199751]: ERROR 10:17:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:17:46 localhost openstack_network_exporter[199751]: ERROR 10:17:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:17:46 localhost openstack_network_exporter[199751]: Dec 6 05:17:46 localhost openstack_network_exporter[199751]: ERROR 10:17:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:17:46 localhost openstack_network_exporter[199751]: Dec 6 05:17:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:17:46 localhost podman[247662]: 2025-12-06 10:17:46.559168739 +0000 UTC m=+0.089153452 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 05:17:46 localhost podman[247662]: 2025-12-06 10:17:46.594174729 +0000 UTC m=+0.124159442 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:17:46 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:17:46 localhost nova_compute[237281]: 2025-12-06 10:17:46.882 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:17:46 localhost nova_compute[237281]: 2025-12-06 10:17:46.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:17:46 localhost nova_compute[237281]: 2025-12-06 10:17:46.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:17:46 localhost nova_compute[237281]: 2025-12-06 10:17:46.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:17:47 localhost dnsmasq[247354]: read /var/lib/neutron/dhcp/c182d686-67a7-43f3-b039-29f492fe4232/addn_hosts - 0 addresses Dec 6 05:17:47 localhost dnsmasq-dhcp[247354]: read /var/lib/neutron/dhcp/c182d686-67a7-43f3-b039-29f492fe4232/host Dec 6 05:17:47 localhost dnsmasq-dhcp[247354]: read /var/lib/neutron/dhcp/c182d686-67a7-43f3-b039-29f492fe4232/opts Dec 6 05:17:47 localhost podman[247702]: 2025-12-06 10:17:47.667015317 +0000 UTC m=+0.059539539 container kill 27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c182d686-67a7-43f3-b039-29f492fe4232, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 6 05:17:48 localhost nova_compute[237281]: 2025-12-06 10:17:48.017 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:48 localhost kernel: device tapc148b4ef-b6 left promiscuous mode Dec 6 05:17:48 localhost ovn_controller[131684]: 2025-12-06T10:17:48Z|00131|binding|INFO|Releasing lport c148b4ef-b64b-4c05-9ac4-6ee60efbc89b from this chassis (sb_readonly=0) Dec 6 05:17:48 localhost ovn_controller[131684]: 2025-12-06T10:17:48Z|00132|binding|INFO|Setting lport c148b4ef-b64b-4c05-9ac4-6ee60efbc89b down in Southbound Dec 6 05:17:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:48.029 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 
to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-c182d686-67a7-43f3-b039-29f492fe4232', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c182d686-67a7-43f3-b039-29f492fe4232', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5364bf2edeaf41ae9d4248e56b5ec33d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9911cb6-21a7-4062-a48f-50af80ca1cac, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c148b4ef-b64b-4c05-9ac4-6ee60efbc89b) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:17:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:48.032 137259 INFO neutron.agent.ovn.metadata.agent [-] Port c148b4ef-b64b-4c05-9ac4-6ee60efbc89b in datapath c182d686-67a7-43f3-b039-29f492fe4232 unbound from our chassis#033[00m Dec 6 05:17:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:48.034 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c182d686-67a7-43f3-b039-29f492fe4232, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:17:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:48.036 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[a65087c7-be45-49c3-a7f2-f381f4eac2f9]: (4, 
False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:17:48 localhost nova_compute[237281]: 2025-12-06 10:17:48.046 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:48 localhost nova_compute[237281]: 2025-12-06 10:17:48.803 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:48 localhost dnsmasq[247565]: read /var/lib/neutron/dhcp/a4529faa-7a12-4a4a-8c72-70fbd8ccd1af/addn_hosts - 0 addresses Dec 6 05:17:48 localhost dnsmasq-dhcp[247565]: read /var/lib/neutron/dhcp/a4529faa-7a12-4a4a-8c72-70fbd8ccd1af/host Dec 6 05:17:48 localhost dnsmasq-dhcp[247565]: read /var/lib/neutron/dhcp/a4529faa-7a12-4a4a-8c72-70fbd8ccd1af/opts Dec 6 05:17:48 localhost podman[247743]: 2025-12-06 10:17:48.821817143 +0000 UTC m=+0.065514103 container kill 2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:17:48 localhost nova_compute[237281]: 2025-12-06 10:17:48.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:17:49 localhost nova_compute[237281]: 2025-12-06 10:17:49.036 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:49 localhost kernel: device tap9a858631-01 left promiscuous mode Dec 6 05:17:49 localhost ovn_controller[131684]: 2025-12-06T10:17:49Z|00133|binding|INFO|Releasing lport 9a858631-0135-4c4b-afc8-daa9607df776 from this chassis (sb_readonly=0) Dec 6 05:17:49 localhost ovn_controller[131684]: 2025-12-06T10:17:49Z|00134|binding|INFO|Setting lport 9a858631-0135-4c4b-afc8-daa9607df776 down in Southbound Dec 6 05:17:49 localhost nova_compute[237281]: 2025-12-06 10:17:49.059 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:49 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:49.237 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2aeed871fc4143d987afbd49f963ad0b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03e4fdb9-16ae-470e-934f-3dd544200d9a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=9a858631-0135-4c4b-afc8-daa9607df776) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:17:49 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:49.238 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 9a858631-0135-4c4b-afc8-daa9607df776 in datapath a4529faa-7a12-4a4a-8c72-70fbd8ccd1af unbound from our chassis#033[00m Dec 6 05:17:49 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:49.239 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:17:49 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:49.240 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c68d9591-3803-4668-88cd-253f1c49d5f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:17:50 localhost nova_compute[237281]: 2025-12-06 10:17:50.062 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:50 localhost nova_compute[237281]: 2025-12-06 10:17:50.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:17:50 localhost nova_compute[237281]: 2025-12-06 10:17:50.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:17:50 localhost nova_compute[237281]: 2025-12-06 10:17:50.887 237285 DEBUG nova.compute.manager [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:17:51 localhost ovn_controller[131684]: 2025-12-06T10:17:51Z|00135|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:17:51 localhost nova_compute[237281]: 2025-12-06 10:17:51.047 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:51 localhost nova_compute[237281]: 2025-12-06 10:17:51.343 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:17:51 localhost nova_compute[237281]: 2025-12-06 10:17:51.343 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:17:51 localhost nova_compute[237281]: 2025-12-06 10:17:51.344 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:17:51 localhost nova_compute[237281]: 2025-12-06 10:17:51.344 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:17:51 localhost podman[247781]: 2025-12-06 10:17:51.69040228 +0000 UTC m=+0.052309535 
container kill 27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c182d686-67a7-43f3-b039-29f492fe4232, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:17:51 localhost dnsmasq[247354]: exiting on receipt of SIGTERM Dec 6 05:17:51 localhost systemd[1]: libpod-27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294.scope: Deactivated successfully. Dec 6 05:17:51 localhost podman[247793]: 2025-12-06 10:17:51.750492303 +0000 UTC m=+0.047942890 container died 27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c182d686-67a7-43f3-b039-29f492fe4232, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125) Dec 6 05:17:51 localhost systemd[1]: tmp-crun.z9uags.mount: Deactivated successfully. 
Dec 6 05:17:51 localhost podman[247793]: 2025-12-06 10:17:51.858499846 +0000 UTC m=+0.155950373 container cleanup 27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c182d686-67a7-43f3-b039-29f492fe4232, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:17:51 localhost systemd[1]: libpod-conmon-27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294.scope: Deactivated successfully. Dec 6 05:17:51 localhost podman[247800]: 2025-12-06 10:17:51.887234052 +0000 UTC m=+0.166630411 container remove 27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c182d686-67a7-43f3-b039-29f492fe4232, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:17:52 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:52.089 219384 INFO neutron.agent.dhcp.agent [None req-8f2b27fe-8622-4ba6-80ef-93e7a2e5d98c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:17:52 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:52.090 219384 INFO neutron.agent.dhcp.agent [None req-8f2b27fe-8622-4ba6-80ef-93e7a2e5d98c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:17:52 localhost 
ovn_controller[131684]: 2025-12-06T10:17:52Z|00136|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:17:52 localhost nova_compute[237281]: 2025-12-06 10:17:52.220 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:52 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:52.276 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:17:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:17:52.370 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:17:52 localhost systemd[1]: var-lib-containers-storage-overlay-71b1b7f0c34752d12631a3103e7222b2bfc56f6a3e52326353332e818d38381a-merged.mount: Deactivated successfully. Dec 6 05:17:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-27587b26767ff8261f267e2a7204b1ae26d2867eafe432203bcd43091ab3c294-userdata-shm.mount: Deactivated successfully. Dec 6 05:17:52 localhost systemd[1]: run-netns-qdhcp\x2dc182d686\x2d67a7\x2d43f3\x2db039\x2d29f492fe4232.mount: Deactivated successfully. 
Dec 6 05:17:53 localhost podman[197801]: time="2025-12-06T10:17:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:17:53 localhost nova_compute[237281]: 2025-12-06 10:17:53.321 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:53 localhost podman[197801]: @ - - [06/Dec/2025:10:17:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 145974 "" "Go-http-client/1.1" Dec 6 05:17:53 localhost podman[197801]: @ - - [06/Dec/2025:10:17:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16436 "" "Go-http-client/1.1" Dec 6 05:17:53 localhost systemd[1]: tmp-crun.vUTnzo.mount: Deactivated successfully. Dec 6 05:17:53 localhost dnsmasq[247565]: exiting on receipt of SIGTERM Dec 6 05:17:53 localhost podman[247838]: 2025-12-06 10:17:53.401098345 +0000 UTC m=+0.068858925 container kill 2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:17:53 localhost systemd[1]: libpod-2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226.scope: Deactivated successfully. 
Dec 6 05:17:53 localhost podman[247852]: 2025-12-06 10:17:53.468138183 +0000 UTC m=+0.052786449 container died 2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:17:53 localhost podman[247852]: 2025-12-06 10:17:53.498671535 +0000 UTC m=+0.083319791 container cleanup 2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:17:53 localhost systemd[1]: libpod-conmon-2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226.scope: Deactivated successfully. 
Dec 6 05:17:53 localhost podman[247854]: 2025-12-06 10:17:53.553934331 +0000 UTC m=+0.130860369 container remove 2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a4529faa-7a12-4a4a-8c72-70fbd8ccd1af, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:17:53 localhost systemd[1]: var-lib-containers-storage-overlay-49ca37e8ee1fddec698c72233c2fea1c25ae9ef5d9e8ec35857481ac9208be9c-merged.mount: Deactivated successfully. Dec 6 05:17:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2435bf15b084a2b41a2d1df4c674f08c987705ebb656431c1a6707827e086226-userdata-shm.mount: Deactivated successfully. Dec 6 05:17:53 localhost nova_compute[237281]: 2025-12-06 10:17:53.805 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:53 localhost systemd[1]: run-netns-qdhcp\x2da4529faa\x2d7a12\x2d4a4a\x2d8c72\x2d70fbd8ccd1af.mount: Deactivated successfully. 
Dec 6 05:17:53 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:53.889 219384 INFO neutron.agent.dhcp.agent [None req-ffe9ed7e-89d0-45a4-8f03-1858fc9814b6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:17:53 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:53.938 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:17:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5748 DF PROTO=TCP SPT=47828 DPT=9102 SEQ=121858893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD9445A0000000001030307) Dec 6 05:17:54 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:17:54.554 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:17:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5749 DF PROTO=TCP SPT=47828 DPT=9102 SEQ=121858893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD948470000000001030307) Dec 6 05:17:55 localhost nova_compute[237281]: 2025-12-06 10:17:55.066 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:17:55 localhost podman[247883]: 2025-12-06 10:17:55.549652619 +0000 UTC m=+0.082883207 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:17:55 localhost podman[247883]: 2025-12-06 10:17:55.585956369 +0000 UTC m=+0.119186937 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:17:55 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:17:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2209 DF PROTO=TCP SPT=56118 DPT=9102 SEQ=229835423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD94B870000000001030307) Dec 6 05:17:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5750 DF PROTO=TCP SPT=47828 DPT=9102 SEQ=121858893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD950480000000001030307) Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.532 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.557 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.557 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.558 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.559 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.592 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.592 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - 
- -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.593 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.593 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.663 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.738 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 
2025-12-06 10:17:57.740 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.821 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.822 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:17:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61223 DF PROTO=TCP SPT=53090 DPT=9102 SEQ=3782954089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD953870000000001030307) Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.901 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.902 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:17:57 localhost nova_compute[237281]: 2025-12-06 10:17:57.998 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:17:58 localhost nova_compute[237281]: 2025-12-06 10:17:58.018 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:17:58 localhost nova_compute[237281]: 2025-12-06 10:17:58.245 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:17:58 localhost nova_compute[237281]: 2025-12-06 10:17:58.247 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12309MB free_disk=387.2666816711426GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:17:58 localhost nova_compute[237281]: 2025-12-06 10:17:58.247 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:17:58 localhost nova_compute[237281]: 2025-12-06 10:17:58.248 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:17:58 localhost nova_compute[237281]: 2025-12-06 10:17:58.375 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:17:58 localhost nova_compute[237281]: 2025-12-06 10:17:58.375 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:17:58 localhost nova_compute[237281]: 2025-12-06 10:17:58.376 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:17:58 localhost nova_compute[237281]: 2025-12-06 10:17:58.473 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:17:58 localhost nova_compute[237281]: 2025-12-06 10:17:58.496 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:17:58 localhost 
nova_compute[237281]: 2025-12-06 10:17:58.499 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:17:58 localhost nova_compute[237281]: 2025-12-06 10:17:58.499 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:17:58 localhost nova_compute[237281]: 2025-12-06 10:17:58.808 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:00 localhost nova_compute[237281]: 2025-12-06 10:18:00.072 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:18:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:18:00 localhost systemd[1]: tmp-crun.dCYIp9.mount: Deactivated successfully. 
Dec 6 05:18:00 localhost podman[247921]: 2025-12-06 10:18:00.536408093 +0000 UTC m=+0.070875387 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:18:00 localhost podman[247922]: 2025-12-06 10:18:00.589263274 +0000 UTC m=+0.116543796 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:18:00 localhost podman[247921]: 2025-12-06 10:18:00.618413034 +0000 UTC m=+0.152880258 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': 
'/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:18:00 localhost podman[247922]: 2025-12-06 10:18:00.626141481 +0000 UTC m=+0.153421983 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:18:00 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:18:00 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:18:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5751 DF PROTO=TCP SPT=47828 DPT=9102 SEQ=121858893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD960080000000001030307) Dec 6 05:18:03 localhost nova_compute[237281]: 2025-12-06 10:18:03.494 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:18:03 localhost nova_compute[237281]: 2025-12-06 10:18:03.811 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:05 localhost nova_compute[237281]: 2025-12-06 10:18:05.074 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 05:18:05 localhost podman[247962]: 2025-12-06 10:18:05.549455239 +0000 UTC m=+0.085438898 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:18:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:18:05 localhost podman[247962]: 2025-12-06 10:18:05.590271047 +0000 UTC m=+0.126254646 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS) Dec 6 05:18:05 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:18:05 localhost podman[247981]: 2025-12-06 10:18:05.659981448 +0000 UTC m=+0.087431228 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 05:18:05 localhost podman[247981]: 2025-12-06 10:18:05.671028259 +0000 UTC m=+0.098478069 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm) Dec 6 05:18:05 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:18:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:06.701 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:18:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:06.702 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:18:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:06.702 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:18:08 localhost ovn_controller[131684]: 2025-12-06T10:18:08Z|00137|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:18:08 localhost nova_compute[237281]: 2025-12-06 10:18:08.156 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:08 localhost nova_compute[237281]: 2025-12-06 10:18:08.848 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5752 DF PROTO=TCP SPT=47828 DPT=9102 SEQ=121858893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD97F870000000001030307) Dec 6 05:18:10 localhost nova_compute[237281]: 2025-12-06 10:18:10.113 
237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:12 localhost nova_compute[237281]: 2025-12-06 10:18:12.176 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:12 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:12.336 219384 INFO neutron.agent.linux.ip_lib [None req-230f053f-8bf9-4952-a3bd-140f14d7c4e8 - - - - - -] Device tap81614944-a2 cannot be used as it has no MAC address#033[00m Dec 6 05:18:12 localhost nova_compute[237281]: 2025-12-06 10:18:12.395 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:12 localhost kernel: device tap81614944-a2 entered promiscuous mode Dec 6 05:18:12 localhost NetworkManager[5965]: [1765016292.4035] manager: (tap81614944-a2): new Generic device (/org/freedesktop/NetworkManager/Devices/26) Dec 6 05:18:12 localhost ovn_controller[131684]: 2025-12-06T10:18:12Z|00138|binding|INFO|Claiming lport 81614944-a224-4d80-af18-fc1aee179a23 for this chassis. Dec 6 05:18:12 localhost systemd-udevd[248011]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:18:12 localhost ovn_controller[131684]: 2025-12-06T10:18:12Z|00139|binding|INFO|81614944-a224-4d80-af18-fc1aee179a23: Claiming unknown Dec 6 05:18:12 localhost nova_compute[237281]: 2025-12-06 10:18:12.409 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:12.431 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-66935023-2bb2-4d87-b381-e62adccf722d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66935023-2bb2-4d87-b381-e62adccf722d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '169fe506cf89438283e67a349e3ea2c0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eefdd33d-9d3c-46e0-8cb0-b395b47b3dd4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=81614944-a224-4d80-af18-fc1aee179a23) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:18:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:12.433 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 81614944-a224-4d80-af18-fc1aee179a23 in datapath 66935023-2bb2-4d87-b381-e62adccf722d 
bound to our chassis#033[00m Dec 6 05:18:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:12.435 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port dc60c469-2d35-488e-a9e4-e07c6a9f5087 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:18:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:12.436 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66935023-2bb2-4d87-b381-e62adccf722d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:18:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:12.437 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[2badd10c-63ef-470f-b7a7-1f3be81826d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:18:12 localhost ovn_controller[131684]: 2025-12-06T10:18:12Z|00140|binding|INFO|Setting lport 81614944-a224-4d80-af18-fc1aee179a23 ovn-installed in OVS Dec 6 05:18:12 localhost nova_compute[237281]: 2025-12-06 10:18:12.447 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:12 localhost ovn_controller[131684]: 2025-12-06T10:18:12Z|00141|binding|INFO|Setting lport 81614944-a224-4d80-af18-fc1aee179a23 up in Southbound Dec 6 05:18:12 localhost nova_compute[237281]: 2025-12-06 10:18:12.494 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:12 localhost nova_compute[237281]: 2025-12-06 10:18:12.522 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:13 localhost podman[248066]: Dec 6 05:18:13 localhost 
podman[248066]: 2025-12-06 10:18:13.584605266 +0000 UTC m=+0.097043624 container create e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-66935023-2bb2-4d87-b381-e62adccf722d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:18:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:18:13 localhost systemd[1]: Started libpod-conmon-e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877.scope. Dec 6 05:18:13 localhost systemd[1]: Started libcrun container. Dec 6 05:18:13 localhost podman[248066]: 2025-12-06 10:18:13.536711429 +0000 UTC m=+0.049149797 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:18:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e20314b4ebfe7046d99066413bfcee3a1c528013f7e5822d8913edf496b8f646/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:18:13 localhost podman[248066]: 2025-12-06 10:18:13.651215531 +0000 UTC m=+0.163653870 container init e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-66935023-2bb2-4d87-b381-e62adccf722d, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:18:13 localhost podman[248066]: 2025-12-06 10:18:13.663541321 +0000 UTC m=+0.175979669 container start e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-66935023-2bb2-4d87-b381-e62adccf722d, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:18:13 localhost dnsmasq[248094]: started, version 2.85 cachesize 150 Dec 6 05:18:13 localhost dnsmasq[248094]: DNS service limited to local subnets Dec 6 05:18:13 localhost dnsmasq[248094]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:18:13 localhost dnsmasq[248094]: warning: no upstream servers configured Dec 6 05:18:13 localhost dnsmasq-dhcp[248094]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:18:13 localhost dnsmasq[248094]: read /var/lib/neutron/dhcp/66935023-2bb2-4d87-b381-e62adccf722d/addn_hosts - 0 addresses Dec 6 05:18:13 localhost dnsmasq-dhcp[248094]: read /var/lib/neutron/dhcp/66935023-2bb2-4d87-b381-e62adccf722d/host Dec 6 05:18:13 localhost dnsmasq-dhcp[248094]: read /var/lib/neutron/dhcp/66935023-2bb2-4d87-b381-e62adccf722d/opts Dec 6 05:18:13 localhost podman[248079]: 2025-12-06 10:18:13.711061047 +0000 UTC m=+0.091919336 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, 
health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, 
container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350) Dec 6 05:18:13 localhost podman[248079]: 2025-12-06 10:18:13.721354395 +0000 UTC m=+0.102212654 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.tags=minimal rhel9) Dec 6 05:18:13 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:18:13 localhost nova_compute[237281]: 2025-12-06 10:18:13.887 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:14 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:14.093 219384 INFO neutron.agent.dhcp.agent [None req-42853930-d75f-4850-8c02-4a301fef49dd - - - - - -] DHCP configuration for ports {'3555388d-d8e1-4a56-82d8-13d079ac92f7'} is completed#033[00m Dec 6 05:18:14 localhost systemd[1]: tmp-crun.ybHtLA.mount: Deactivated successfully. 
Dec 6 05:18:15 localhost nova_compute[237281]: 2025-12-06 10:18:15.142 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:16 localhost openstack_network_exporter[199751]: ERROR 10:18:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:18:16 localhost openstack_network_exporter[199751]: ERROR 10:18:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:18:16 localhost openstack_network_exporter[199751]: ERROR 10:18:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:18:16 localhost openstack_network_exporter[199751]: ERROR 10:18:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:18:16 localhost openstack_network_exporter[199751]: Dec 6 05:18:16 localhost openstack_network_exporter[199751]: ERROR 10:18:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:18:16 localhost openstack_network_exporter[199751]: Dec 6 05:18:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:18:17 localhost podman[248102]: 2025-12-06 10:18:17.54182133 +0000 UTC m=+0.077512303 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:18:17 localhost podman[248102]: 2025-12-06 10:18:17.553265113 +0000 UTC m=+0.088956056 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:18:17 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:18:18 localhost nova_compute[237281]: 2025-12-06 10:18:18.657 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:18 localhost nova_compute[237281]: 2025-12-06 10:18:18.889 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:20 localhost nova_compute[237281]: 2025-12-06 10:18:20.146 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:22 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:22.891 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:21Z, description=, device_id=6fcdfa2f-3f44-46c1-9780-f9fe1dca6b53, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=18b1e093-9423-406a-822d-2753b562f6ca, ip_allocation=immediate, mac_address=fa:16:3e:f4:f2:fe, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:07Z, description=, dns_domain=, id=66935023-2bb2-4d87-b381-e62adccf722d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1813119206-network, port_security_enabled=True, project_id=169fe506cf89438283e67a349e3ea2c0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39769, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=813, status=ACTIVE, subnets=['c837de71-b07a-41e0-a234-dc1e8d6911f8'], tags=[], tenant_id=169fe506cf89438283e67a349e3ea2c0, updated_at=2025-12-06T10:18:09Z, 
vlan_transparent=None, network_id=66935023-2bb2-4d87-b381-e62adccf722d, port_security_enabled=False, project_id=169fe506cf89438283e67a349e3ea2c0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=830, status=DOWN, tags=[], tenant_id=169fe506cf89438283e67a349e3ea2c0, updated_at=2025-12-06T10:18:22Z on network 66935023-2bb2-4d87-b381-e62adccf722d#033[00m Dec 6 05:18:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:22.992 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:18:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:22.993 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.033 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.034 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c21985e7-daee-4da2-bf12-b15cf539ee0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:18:22.993921', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e527f45c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.204038052, 'message_signature': '87e81ca2db8c4835c45c78a0d6c8e9734ada482160acff287bb8644a705e3893'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': 
'47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:18:22.993921', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5280898-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.204038052, 'message_signature': 'c3601cab072eed9ea4cb3790a9ec81d4b2c72872ce00e3f86e6af9a4371d1726'}]}, 'timestamp': '2025-12-06 10:18:23.034830', '_unique_id': 'ca74a0e69745499d97212cf45b98db68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 
05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.036 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.037 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.054 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 18030000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '50faabf4-c711-48e4-a1b4-253d76c6e3e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18030000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:18:23.037816', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e52b184e-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.264322912, 'message_signature': 'f8ff4297644d8bfa3a480afff799b47b2d036223288e068fad34ffd6ef4836d9'}]}, 'timestamp': '2025-12-06 10:18:23.054965', '_unique_id': 'b24c5518b45448299eb93217ed1a12b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.056 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.057 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:18:23.060 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb14860b-5f55-4a83-a541-a70f21c1c65d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:18:23.057654', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'e52c0e5c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.267769128, 'message_signature': 
'545e3009d4684bb93ce77a327be217d6e7a19c3340247a23f99bea7a5669f4f7'}]}, 'timestamp': '2025-12-06 10:18:23.061201', '_unique_id': 'ca98351e56064ef1a907fd3d5a463012'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR 
oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.063 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.063 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dae692e6-593b-446f-a0cb-7e5075a24f94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:18:23.063617', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'e52c817a-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.267769128, 'message_signature': '1fa9e42ff320334946d4cfe39b5583cacb052638da3ca9d8174b437517328c3c'}]}, 'timestamp': '2025-12-06 10:18:23.064140', '_unique_id': '7dcc608acfc44584bb92a1b7db39d6ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging return 
fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.065 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.066 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.066 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.066 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'be427e3b-538c-4ce7-a8a4-4c4734640d70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:18:23.066412', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e52cee1c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.204038052, 'message_signature': '3c7feb52c3664be499d9c3287d11e17ef254f5bfee9a4845f14b2432e7b424c8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:18:23.066412', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e52d0014-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.204038052, 'message_signature': 'f44ca00076890daf8c34596b51e2b8a7ad4e38cc6be8e2efcb0369b037f6d214'}]}, 'timestamp': '2025-12-06 10:18:23.067359', '_unique_id': '5f5ffc050b0245dcbada7d83ad8ff4e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.068 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.069 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.069 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.069 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to 
notifications. Payload={'message_id': 'b8b44fe3-0eec-4af1-af20-d57f25325366', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:18:23.069824', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'e52d7350-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.267769128, 'message_signature': '2e15a8c777cc0200150a8f6881c5b74fe1699571396b914d4c88446c705281d4'}]}, 'timestamp': '2025-12-06 10:18:23.070321', '_unique_id': 'd09624117df343558bf99614fa9b4f43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.071 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:18:23.072 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.072 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15f4c3e4-7b02-4152-a54a-951dd7ce6d51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:18:23.072465', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': 
None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'e52dda16-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.267769128, 'message_signature': '9d00d006a218f1fa19f672e836202e4d78083d586830ed908c750741db0ff500'}]}, 'timestamp': '2025-12-06 10:18:23.073003', '_unique_id': '903f34ccc71b45fa88b3d2d06d7333b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.073 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.075 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.089 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.090 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '05066829-31d1-4ebc-b7c6-18af5bb93ed4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:18:23.075152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5308d6a-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.285268518, 'message_signature': '4f6fad7572198472e2c74b5080bc80d6d50cef1a4ca4317aeea0aa3bea5c25b7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:18:23.075152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 
'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e530a0d4-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.285268518, 'message_signature': '148969579ef24bfbc717c82ad3f9e10396a35d7a0b4ba7c5648ca0cb1ad39b2f'}]}, 'timestamp': '2025-12-06 10:18:23.091130', '_unique_id': '8d8e6fb1938c44ecbfd9871144874d0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.092 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.093 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.093 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.094 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': 'e2462139-d126-42f5-b2f5-e2101e0e8788', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:18:23.093783', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5311be0-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.285268518, 'message_signature': '3e57a702b553e69c3c4e7c6dcc854c90addf90aeda9acc177887dd26a2eb1cc4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:18:23.093783', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5312bf8-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.285268518, 'message_signature': '94e02b8aaec70d22a2aa760b72f88800ec707e49a8e32ffcec3e265ac8580cd2'}]}, 'timestamp': '2025-12-06 10:18:23.094786', '_unique_id': '01bd4dc916104ed6b98c9aa701eb89dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.096 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.097 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ec9791ba-4149-45a0-b04c-116d43be93c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:18:23.097106', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e5319f02-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.264322912, 'message_signature': 'ba5b6cb46658cd4cab1df3633d0f2afb672c86ccf379471682e9223d923a7844'}]}, 'timestamp': '2025-12-06 10:18:23.097636', '_unique_id': 'd5398dc0e113435a8fe44eeef1fe53a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors 
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging 
conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.098 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:18:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b28821a4-637d-4389-8001-18baa759fb89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:18:23.100129', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 
'vda'}, 'message_id': 'e53213b0-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.204038052, 'message_signature': '00bd2230a2216d2001fce2eb5d4a9ddd2e2fb1a4ae306ef73d445343f42b8cc5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:18:23.100129', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5322800-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.204038052, 'message_signature': 'c4e4b05dc3a877681d31407f07474a94d3c1194c6861d63127aad3d72e25144a'}]}, 'timestamp': '2025-12-06 10:18:23.101140', '_unique_id': 'ca40e3df8e7343ccb667ea7553daa555'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.102 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify 
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.103 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '506e6698-cc8f-4d24-a23e-7729cf975fec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:18:23.103557', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'e53297d6-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.267769128, 'message_signature': '77f63e8a2e9d19e99d834d598c65710fc9ea72d67224d85cb2c34b12dda3b920'}]}, 'timestamp': '2025-12-06 10:18:23.104052', '_unique_id': 'e77bdeda607943c5af405ec9327f3365'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.106 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'af412ae4-5f52-4817-9cd7-8264d7128c7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:18:23.106478', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'e533098c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.267769128, 'message_signature': '3d6b58c35efb6a514b0e7791d01e84c4250c749558667d9523b3573d886816e5'}]}, 'timestamp': '2025-12-06 10:18:23.106983', '_unique_id': '37bcdddea89e4e0497ca179959f13a90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:18:23.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee86523e-b72d-4db4-a3ec-9560742f9014', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:18:23.109306', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'e5337854-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.267769128, 'message_signature': '109b1ff17b79b7bee84582d112d42a13ce9bd80e549f5d18a9a029a7f0791199'}]}, 'timestamp': '2025-12-06 10:18:23.109769', '_unique_id': '62a89c7190c94f789f9226c8510cb8c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 
05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.112 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.112 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8d4680dc-ff6f-4145-8219-1e547f317858', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:18:23.112036', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e533e262-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.204038052, 'message_signature': '341ad79bb34f28584cea19dbb6d6f6f67fdd1b9ead0372a8cffe69bc158e804f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:18:23.112036', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e533f270-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.204038052, 'message_signature': '08cc2cee7aaf239ea0a8fb06dfcc2f5c9d36725d8c1a2717c1a1e8512c55be36'}]}, 'timestamp': '2025-12-06 10:18:23.112936', '_unique_id': '9aba5c863ad24f14b9ed21b10b5ba8be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.113 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.115 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8b866934-f931-4ec4-b87d-6ea517b3de97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:18:23.115093', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'e5345a12-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.267769128, 'message_signature': 'e3c300793ec176f618f169872f11d9125c969ca73e77cd161b3692eced8aea2c'}]}, 'timestamp': '2025-12-06 10:18:23.115550', '_unique_id': 'c283696a58764ee5ad0a737411b289b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:18:23.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.118 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.118 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9ab3fff3-edf1-4101-8a54-340312253796', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:18:23.118017', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e534cc0e-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.204038052, 'message_signature': '37568c3bb575b1b76070142f450cd62a8a8cfe69258653a565064504b52d2552'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:18:23.118017', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e534dc44-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.204038052, 'message_signature': '12c3a18d73844ecdce57ebad3cf51ac24af5323874e1fbd8ac5e9e722b7db450'}]}, 'timestamp': '2025-12-06 10:18:23.118907', '_unique_id': 'e6b670a42bfd4cb7af0f6024fda925ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.119 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.121 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.121 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a056f526-a1c5-464f-8a7c-e47bd05f3fd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:18:23.121219', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'e5354990-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.267769128, 'message_signature': '984690fff199c08a2fc0f4c01a228c1638e368ee5b47c1cce1777b6eff6d7422'}]}, 'timestamp': '2025-12-06 10:18:23.121703', '_unique_id': '979b1b7767c84291bfa28c5d997870b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.122 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.123 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e948e26-fd7f-4a23-b105-b4b0db58a1b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:18:23.123419', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'e5359c92-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.267769128, 'message_signature': '22ae13a72f97a9dfb86ba0a8491e03e4aee5f41f0d07c8c7911db25303959ad2'}]}, 'timestamp': '2025-12-06 10:18:23.123731', '_unique_id': 'd3f424333f8042cd8a0139dad771c538'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.124 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.125 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.125 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4950ec27-e6d1-411e-b440-2f8106eeeb62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:18:23.125201', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e535e1fc-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.204038052, 'message_signature': '155f0ec0ec964ba10ad3ab2b6946a8d6e513e7226963a99016c9ed3d8c6c214d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:18:23.125201', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e535ecec-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.204038052, 'message_signature': 'ca02e4a3405458c6f5b81293751635c4eb9e72b421423ff2e26492405d1d5e70'}]}, 'timestamp': '2025-12-06 10:18:23.125767', '_unique_id': '723f3d9034f04dcc88d9656c656cbbd9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.126 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.127 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.127 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.127 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': 'fbb6d6c5-2002-44d4-92ee-2f11eea73f50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:18:23.127233', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e536315c-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.285268518, 'message_signature': '0b1130de80391c775a83788b22300212165424a9d5313447cd634909e6633e99'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:18:23.127233', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5363c06-d28c-11f0-8fed-fa163edf398d', 'monotonic_time': 12690.285268518, 'message_signature': '709332b6abe9cc1efbddbe26769e27e7c0501064a0f81a2c8a9a45dc4c51fd63'}]}, 'timestamp': '2025-12-06 10:18:23.127789', '_unique_id': 'e7cba9caca86440bac1cf070e93b3cc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:18:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:18:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:18:23.128 12 ERROR oslo_messaging.notify.messaging Dec 6 05:18:23 localhost systemd[1]: tmp-crun.15NzvY.mount: Deactivated successfully. 
Dec 6 05:18:23 localhost dnsmasq[248094]: read /var/lib/neutron/dhcp/66935023-2bb2-4d87-b381-e62adccf722d/addn_hosts - 1 addresses Dec 6 05:18:23 localhost dnsmasq-dhcp[248094]: read /var/lib/neutron/dhcp/66935023-2bb2-4d87-b381-e62adccf722d/host Dec 6 05:18:23 localhost podman[248142]: 2025-12-06 10:18:23.190305709 +0000 UTC m=+0.074298123 container kill e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-66935023-2bb2-4d87-b381-e62adccf722d, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:18:23 localhost dnsmasq-dhcp[248094]: read /var/lib/neutron/dhcp/66935023-2bb2-4d87-b381-e62adccf722d/opts Dec 6 05:18:23 localhost podman[197801]: time="2025-12-06T10:18:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:18:23 localhost podman[197801]: @ - - [06/Dec/2025:10:18:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 145974 "" "Go-http-client/1.1" Dec 6 05:18:23 localhost podman[197801]: @ - - [06/Dec/2025:10:18:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16438 "" "Go-http-client/1.1" Dec 6 05:18:23 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:23.437 219384 INFO neutron.agent.dhcp.agent [None req-b848874a-3dc7-4ebc-b9c6-ec3c9b721f15 - - - - - -] DHCP configuration for ports {'18b1e093-9423-406a-822d-2753b562f6ca'} is completed#033[00m Dec 6 05:18:23 localhost nova_compute[237281]: 2025-12-06 10:18:23.919 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:23 localhost nova_compute[237281]: 2025-12-06 10:18:23.997 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62298 DF PROTO=TCP SPT=55600 DPT=9102 SEQ=958516846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD9B9890000000001030307) Dec 6 05:18:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62299 DF PROTO=TCP SPT=55600 DPT=9102 SEQ=958516846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD9BD870000000001030307) Dec 6 05:18:25 localhost nova_compute[237281]: 2025-12-06 10:18:25.180 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5753 DF PROTO=TCP SPT=47828 DPT=9102 SEQ=121858893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD9BF880000000001030307) Dec 6 05:18:26 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:26.287 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:21Z, description=, device_id=6fcdfa2f-3f44-46c1-9780-f9fe1dca6b53, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, 
extra_dhcp_opts=[], fixed_ips=[], id=18b1e093-9423-406a-822d-2753b562f6ca, ip_allocation=immediate, mac_address=fa:16:3e:f4:f2:fe, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:07Z, description=, dns_domain=, id=66935023-2bb2-4d87-b381-e62adccf722d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1813119206-network, port_security_enabled=True, project_id=169fe506cf89438283e67a349e3ea2c0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39769, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=813, status=ACTIVE, subnets=['c837de71-b07a-41e0-a234-dc1e8d6911f8'], tags=[], tenant_id=169fe506cf89438283e67a349e3ea2c0, updated_at=2025-12-06T10:18:09Z, vlan_transparent=None, network_id=66935023-2bb2-4d87-b381-e62adccf722d, port_security_enabled=False, project_id=169fe506cf89438283e67a349e3ea2c0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=830, status=DOWN, tags=[], tenant_id=169fe506cf89438283e67a349e3ea2c0, updated_at=2025-12-06T10:18:22Z on network 66935023-2bb2-4d87-b381-e62adccf722d#033[00m Dec 6 05:18:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:18:26 localhost podman[248190]: 2025-12-06 10:18:26.575615308 +0000 UTC m=+0.056457594 container kill e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-66935023-2bb2-4d87-b381-e62adccf722d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:18:26 localhost dnsmasq[248094]: read /var/lib/neutron/dhcp/66935023-2bb2-4d87-b381-e62adccf722d/addn_hosts - 1 addresses Dec 6 05:18:26 localhost systemd[1]: tmp-crun.9pRUms.mount: Deactivated successfully. Dec 6 05:18:26 localhost dnsmasq-dhcp[248094]: read /var/lib/neutron/dhcp/66935023-2bb2-4d87-b381-e62adccf722d/host Dec 6 05:18:26 localhost dnsmasq-dhcp[248094]: read /var/lib/neutron/dhcp/66935023-2bb2-4d87-b381-e62adccf722d/opts Dec 6 05:18:26 localhost podman[248175]: 2025-12-06 10:18:26.562217424 +0000 UTC m=+0.089164942 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 
'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:18:26 localhost podman[248175]: 2025-12-06 10:18:26.646477754 +0000 UTC m=+0.173425262 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller) Dec 6 05:18:26 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:18:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62300 DF PROTO=TCP SPT=55600 DPT=9102 SEQ=958516846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD9C5880000000001030307) Dec 6 05:18:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:27.183 219384 INFO neutron.agent.dhcp.agent [None req-e8d4e4fb-d7fc-4188-9465-fc8c8de1c50a - - - - - -] DHCP configuration for ports {'18b1e093-9423-406a-822d-2753b562f6ca'} is completed#033[00m Dec 6 05:18:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2210 DF PROTO=TCP SPT=56118 DPT=9102 SEQ=229835423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD9C9880000000001030307) Dec 6 05:18:28 localhost nova_compute[237281]: 2025-12-06 10:18:28.924 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:30 localhost nova_compute[237281]: 2025-12-06 10:18:30.232 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:30 localhost neutron_sriov_agent[212548]: 2025-12-06 10:18:30.914 2 INFO neutron.agent.securitygroups_rpc [None req-4aef20a1-e167-473b-b96a-1d6d17931325 de34ca65371d4e6a903edd70cb8e9c20 bebfb6087b6b4b7aa357cc1b142247ff - - default default] Security group rule updated ['78ae70f9-ce5b-485f-8f5c-b606f7261a23']#033[00m Dec 6 05:18:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62301 DF PROTO=TCP SPT=55600 DPT=9102 SEQ=958516846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD9D5470000000001030307) Dec 6 05:18:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:18:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:18:31 localhost podman[248227]: 2025-12-06 10:18:31.550984809 +0000 UTC m=+0.082831367 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:18:31 localhost podman[248227]: 2025-12-06 10:18:31.562914106 +0000 UTC m=+0.094760664 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2) Dec 6 05:18:31 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:18:31 localhost podman[248226]: 2025-12-06 10:18:31.658630219 +0000 UTC m=+0.194052307 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:18:31 localhost podman[248226]: 2025-12-06 10:18:31.664937083 +0000 UTC m=+0.200359151 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 
'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:18:31 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:18:32 localhost neutron_sriov_agent[212548]: 2025-12-06 10:18:32.060 2 INFO neutron.agent.securitygroups_rpc [None req-ad01dfa2-7546-4f58-8b05-3445a453a5c9 de34ca65371d4e6a903edd70cb8e9c20 bebfb6087b6b4b7aa357cc1b142247ff - - default default] Security group rule updated ['78ae70f9-ce5b-485f-8f5c-b606f7261a23']#033[00m Dec 6 05:18:33 localhost nova_compute[237281]: 2025-12-06 10:18:33.952 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:35 localhost nova_compute[237281]: 2025-12-06 10:18:35.269 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:35 localhost nova_compute[237281]: 2025-12-06 10:18:35.944 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 05:18:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:18:36 localhost dnsmasq[248094]: read /var/lib/neutron/dhcp/66935023-2bb2-4d87-b381-e62adccf722d/addn_hosts - 0 addresses Dec 6 05:18:36 localhost podman[248284]: 2025-12-06 10:18:36.495393617 +0000 UTC m=+0.055366040 container kill e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-66935023-2bb2-4d87-b381-e62adccf722d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:18:36 localhost dnsmasq-dhcp[248094]: read /var/lib/neutron/dhcp/66935023-2bb2-4d87-b381-e62adccf722d/host Dec 6 05:18:36 localhost dnsmasq-dhcp[248094]: read /var/lib/neutron/dhcp/66935023-2bb2-4d87-b381-e62adccf722d/opts Dec 6 05:18:36 localhost podman[248294]: 2025-12-06 10:18:36.552744886 +0000 UTC m=+0.080598728 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Dec 6 05:18:36 localhost podman[248294]: 2025-12-06 10:18:36.56324409 +0000 UTC m=+0.091097972 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent) Dec 6 05:18:36 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:18:36 localhost podman[248296]: 2025-12-06 10:18:36.618023529 +0000 UTC m=+0.142591330 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 05:18:36 localhost podman[248296]: 2025-12-06 10:18:36.654309769 +0000 UTC m=+0.178877640 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, 
tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:18:36 localhost ovn_controller[131684]: 2025-12-06T10:18:36Z|00142|binding|INFO|Releasing lport 81614944-a224-4d80-af18-fc1aee179a23 from this chassis (sb_readonly=0) Dec 6 05:18:36 localhost ovn_controller[131684]: 2025-12-06T10:18:36Z|00143|binding|INFO|Setting lport 81614944-a224-4d80-af18-fc1aee179a23 down in Southbound Dec 6 05:18:36 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:18:36 localhost nova_compute[237281]: 2025-12-06 10:18:36.704 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:36 localhost kernel: device tap81614944-a2 left promiscuous mode Dec 6 05:18:36 localhost nova_compute[237281]: 2025-12-06 10:18:36.712 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:36.715 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-66935023-2bb2-4d87-b381-e62adccf722d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66935023-2bb2-4d87-b381-e62adccf722d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '169fe506cf89438283e67a349e3ea2c0', 'neutron:revision_number': '3', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eefdd33d-9d3c-46e0-8cb0-b395b47b3dd4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=81614944-a224-4d80-af18-fc1aee179a23) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:18:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:36.718 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 81614944-a224-4d80-af18-fc1aee179a23 in datapath 66935023-2bb2-4d87-b381-e62adccf722d unbound from our chassis#033[00m Dec 6 05:18:36 localhost nova_compute[237281]: 2025-12-06 10:18:36.719 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:36.722 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66935023-2bb2-4d87-b381-e62adccf722d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:18:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:36.724 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[6af48cf9-f3d6-48dc-ba45-86855019d238]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:18:38 localhost nova_compute[237281]: 2025-12-06 10:18:38.955 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 
LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62302 DF PROTO=TCP SPT=55600 DPT=9102 SEQ=958516846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DD9F5880000000001030307) Dec 6 05:18:40 localhost nova_compute[237281]: 2025-12-06 10:18:40.311 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:40 localhost ovn_controller[131684]: 2025-12-06T10:18:40Z|00144|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:18:40 localhost nova_compute[237281]: 2025-12-06 10:18:40.388 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:40 localhost nova_compute[237281]: 2025-12-06 10:18:40.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:18:41 localhost dnsmasq[248094]: exiting on receipt of SIGTERM Dec 6 05:18:41 localhost podman[248360]: 2025-12-06 10:18:41.271640045 +0000 UTC m=+0.065465040 container kill e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-66935023-2bb2-4d87-b381-e62adccf722d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:18:41 localhost systemd[1]: libpod-e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877.scope: Deactivated successfully. 
Dec 6 05:18:41 localhost podman[248376]: 2025-12-06 10:18:41.355660597 +0000 UTC m=+0.059632480 container died e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-66935023-2bb2-4d87-b381-e62adccf722d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:18:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877-userdata-shm.mount: Deactivated successfully. Dec 6 05:18:41 localhost systemd[1]: var-lib-containers-storage-overlay-e20314b4ebfe7046d99066413bfcee3a1c528013f7e5822d8913edf496b8f646-merged.mount: Deactivated successfully. Dec 6 05:18:41 localhost podman[248376]: 2025-12-06 10:18:41.406774515 +0000 UTC m=+0.110746358 container remove e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-66935023-2bb2-4d87-b381-e62adccf722d, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:18:41 localhost systemd[1]: libpod-conmon-e0a08d114d737fa8737ef5591312eb601eb613fc7ee6b24d148f54ffa46cf877.scope: Deactivated successfully. 
Dec 6 05:18:41 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:41.751 219384 INFO neutron.agent.dhcp.agent [None req-52c557ef-a858-40eb-b166-e8c9ac2e84a5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:18:41 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:41.793 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:18:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:42.073 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:18:42 localhost nova_compute[237281]: 2025-12-06 10:18:42.156 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:42 localhost systemd[1]: run-netns-qdhcp\x2d66935023\x2d2bb2\x2d4d87\x2db381\x2de62adccf722d.mount: Deactivated successfully. Dec 6 05:18:43 localhost nova_compute[237281]: 2025-12-06 10:18:43.343 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:43 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:43.344 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:18:43 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:43.346 137259 DEBUG 
neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:18:44 localhost nova_compute[237281]: 2025-12-06 10:18:44.001 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:18:44 localhost systemd[1]: tmp-crun.e9dIWa.mount: Deactivated successfully. Dec 6 05:18:44 localhost podman[248401]: 2025-12-06 10:18:44.575714707 +0000 UTC m=+0.100536483 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9) Dec 6 05:18:44 localhost podman[248401]: 2025-12-06 10:18:44.619426906 +0000 UTC m=+0.144248692 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 6 05:18:44 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:18:45 localhost nova_compute[237281]: 2025-12-06 10:18:45.360 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:45 localhost nova_compute[237281]: 2025-12-06 10:18:45.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:18:46 localhost openstack_network_exporter[199751]: ERROR 10:18:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:18:46 localhost openstack_network_exporter[199751]: ERROR 10:18:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:18:46 localhost openstack_network_exporter[199751]: Dec 6 05:18:46 localhost openstack_network_exporter[199751]: ERROR 10:18:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:18:46 localhost openstack_network_exporter[199751]: ERROR 10:18:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:18:46 localhost openstack_network_exporter[199751]: ERROR 10:18:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:18:46 localhost openstack_network_exporter[199751]: Dec 6 05:18:46 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:46.349 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:18:46 localhost nova_compute[237281]: 2025-12-06 
10:18:46.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:18:46 localhost nova_compute[237281]: 2025-12-06 10:18:46.884 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:18:47 localhost neutron_sriov_agent[212548]: 2025-12-06 10:18:47.778 2 INFO neutron.agent.securitygroups_rpc [req-b41e5c6b-8709-4751-83e1-51c9de21808a req-5ff02935-832e-4c90-a730-b0e32344974a de34ca65371d4e6a903edd70cb8e9c20 bebfb6087b6b4b7aa357cc1b142247ff - - default default] Security group member updated ['78ae70f9-ce5b-485f-8f5c-b606f7261a23']#033[00m Dec 6 05:18:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:18:48 localhost systemd[1]: tmp-crun.4rD8Y9.mount: Deactivated successfully. 
Dec 6 05:18:48 localhost podman[248421]: 2025-12-06 10:18:48.567690131 +0000 UTC m=+0.096457947 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:18:48 localhost podman[248421]: 2025-12-06 10:18:48.604267439 +0000 UTC m=+0.133035215 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:18:48 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:18:48 localhost nova_compute[237281]: 2025-12-06 10:18:48.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:18:48 localhost nova_compute[237281]: 2025-12-06 10:18:48.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:18:49 localhost nova_compute[237281]: 2025-12-06 10:18:49.004 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:49 localhost nova_compute[237281]: 2025-12-06 10:18:49.888 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:18:50 localhost nova_compute[237281]: 2025-12-06 10:18:50.362 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:51 localhost nova_compute[237281]: 2025-12-06 10:18:51.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:18:51 localhost nova_compute[237281]: 2025-12-06 10:18:51.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m 
Dec 6 05:18:51 localhost nova_compute[237281]: 2025-12-06 10:18:51.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.037 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Acquiring lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.038 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.060 237285 DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Starting instance... 
_do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.214 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.215 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.219 237285 DEBUG nova.virt.hardware [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Require both a host and instance NUMA topology to fit instance on host. 
numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.220 237285 INFO nova.compute.claims [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Claim successful on node np0005548798.ooo.test#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.400 237285 DEBUG nova.compute.provider_tree [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.428 237285 DEBUG nova.scheduler.client.report [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.456 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.457 237285 DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.516 237285 DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.517 237285 DEBUG nova.network.neutron [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.541 237285 INFO nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.570 237285 DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.686 237285 DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.688 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.689 237285 INFO nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Creating image(s)#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.690 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Acquiring lock "/var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/disk.info" by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.690 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "/var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.691 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "/var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.714 237285 DEBUG oslo_concurrency.processutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.788 237285 DEBUG oslo_concurrency.processutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - 
- default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.789 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Acquiring lock "7ed36996b83444bfa83969c1e5caf9794500f5d3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.790 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "7ed36996b83444bfa83969c1e5caf9794500f5d3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.804 237285 DEBUG oslo_concurrency.processutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.880 237285 DEBUG oslo_concurrency.processutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 
407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.881 237285 DEBUG oslo_concurrency.processutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3,backing_fmt=raw /var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.916 237285 DEBUG oslo_concurrency.processutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3,backing_fmt=raw /var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.917 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "7ed36996b83444bfa83969c1e5caf9794500f5d3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.126s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.917 237285 DEBUG oslo_concurrency.processutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.970 237285 DEBUG oslo_concurrency.processutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.971 237285 DEBUG nova.virt.disk.api [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Checking if we can resize image /var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/disk. 
size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m Dec 6 05:18:52 localhost nova_compute[237281]: 2025-12-06 10:18:52.972 237285 DEBUG oslo_concurrency.processutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:18:53 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:53.004 219384 INFO neutron.agent.linux.ip_lib [None req-d6566716-fd63-4e11-891a-471bbb5c29a4 - - - - - -] Device tap2d0b99fc-df cannot be used as it has no MAC address#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.022 237285 DEBUG oslo_concurrency.processutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.024 237285 DEBUG nova.virt.disk.api [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Cannot resize image /var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/disk to a smaller size. 
can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.024 237285 DEBUG nova.objects.instance [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lazy-loading 'migration_context' on Instance uuid a762ea39-4184-4fba-8fd6-e4390fdf75fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.026 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:53 localhost kernel: device tap2d0b99fc-df entered promiscuous mode Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.033 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.033 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.034 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.034 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 
a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:18:53 localhost NetworkManager[5965]: [1765016333.0354] manager: (tap2d0b99fc-df): new Generic device (/org/freedesktop/NetworkManager/Devices/27) Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.035 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:53 localhost ovn_controller[131684]: 2025-12-06T10:18:53Z|00145|binding|INFO|Claiming lport 2d0b99fc-df7e-4cc1-959f-29a393d1a20b for this chassis. Dec 6 05:18:53 localhost ovn_controller[131684]: 2025-12-06T10:18:53Z|00146|binding|INFO|2d0b99fc-df7e-4cc1-959f-29a393d1a20b: Claiming unknown Dec 6 05:18:53 localhost systemd-udevd[248467]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:18:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:53.044 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-9b525551-ca72-4804-b27f-0c9808ee3709', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b525551-ca72-4804-b27f-0c9808ee3709', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cfaa880c4e9b463d924febc9999ed70c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], 
tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ced779d2-888c-4e42-9339-20891d4d231d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2d0b99fc-df7e-4cc1-959f-29a393d1a20b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:18:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:53.045 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 2d0b99fc-df7e-4cc1-959f-29a393d1a20b in datapath 9b525551-ca72-4804-b27f-0c9808ee3709 bound to our chassis#033[00m Dec 6 05:18:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:53.046 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port ceaf4336-8bcf-4349-b0ca-52b4a694f5ca IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:18:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:53.046 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b525551-ca72-4804-b27f-0c9808ee3709, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:18:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:18:53.047 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[271d2ab8-1d58-471b-b626-a2adbcc2addd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.047 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:53 localhost ovn_controller[131684]: 2025-12-06T10:18:53Z|00147|binding|INFO|Setting lport 2d0b99fc-df7e-4cc1-959f-29a393d1a20b ovn-installed in OVS Dec 6 05:18:53 localhost ovn_controller[131684]: 2025-12-06T10:18:53Z|00148|binding|INFO|Setting 
lport 2d0b99fc-df7e-4cc1-959f-29a393d1a20b up in Southbound Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.052 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.052 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Ensure instance console log exists: /var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.053 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.053 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.053 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 
407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.054 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:53 localhost journal[186952]: ethtool ioctl error on tap2d0b99fc-df: No such device Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.069 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:53 localhost journal[186952]: ethtool ioctl error on tap2d0b99fc-df: No such device Dec 6 05:18:53 localhost journal[186952]: ethtool ioctl error on tap2d0b99fc-df: No such device Dec 6 05:18:53 localhost journal[186952]: ethtool ioctl error on tap2d0b99fc-df: No such device Dec 6 05:18:53 localhost journal[186952]: ethtool ioctl error on tap2d0b99fc-df: No such device Dec 6 05:18:53 localhost journal[186952]: ethtool ioctl error on tap2d0b99fc-df: No such device Dec 6 05:18:53 localhost journal[186952]: ethtool ioctl error on tap2d0b99fc-df: No such device Dec 6 05:18:53 localhost journal[186952]: ethtool ioctl error on tap2d0b99fc-df: No such device Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.112 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.138 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:53 localhost podman[197801]: time="2025-12-06T10:18:53Z" level=info msg="List containers: received 
`last` parameter - overwriting `limit`" Dec 6 05:18:53 localhost podman[197801]: @ - - [06/Dec/2025:10:18:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:18:53 localhost podman[197801]: @ - - [06/Dec/2025:10:18:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15965 "" "Go-http-client/1.1" Dec 6 05:18:53 localhost nova_compute[237281]: 2025-12-06 10:18:53.968 237285 DEBUG nova.policy [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '407472656c6040578cb2bdc3b4288953', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9def7e7271f8404db90dc0d9d3faf8c3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m Dec 6 05:18:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26308 DF PROTO=TCP SPT=41586 DPT=9102 SEQ=2662004881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDA2EBA0000000001030307) Dec 6 05:18:54 localhost nova_compute[237281]: 2025-12-06 10:18:54.049 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:54 localhost podman[248538]: Dec 6 05:18:54 localhost podman[248538]: 2025-12-06 10:18:54.306745304 +0000 UTC m=+0.087959575 container create 
4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b525551-ca72-4804-b27f-0c9808ee3709, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:18:54 localhost systemd[1]: Started libpod-conmon-4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a.scope. Dec 6 05:18:54 localhost podman[248538]: 2025-12-06 10:18:54.263003894 +0000 UTC m=+0.044218205 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:18:54 localhost systemd[1]: Started libcrun container. Dec 6 05:18:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a4b9ec70a121a668f5379c3620a940a44c88c9e2d75299dc6ce16bedf222d64/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:18:54 localhost podman[248538]: 2025-12-06 10:18:54.386034289 +0000 UTC m=+0.167248590 container init 4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b525551-ca72-4804-b27f-0c9808ee3709, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3) Dec 6 05:18:54 localhost podman[248538]: 2025-12-06 10:18:54.400482775 +0000 UTC m=+0.181697056 container start 
4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b525551-ca72-4804-b27f-0c9808ee3709, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:18:54 localhost dnsmasq[248557]: started, version 2.85 cachesize 150 Dec 6 05:18:54 localhost dnsmasq[248557]: DNS service limited to local subnets Dec 6 05:18:54 localhost dnsmasq[248557]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:18:54 localhost dnsmasq[248557]: warning: no upstream servers configured Dec 6 05:18:54 localhost dnsmasq-dhcp[248557]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:18:54 localhost dnsmasq[248557]: read /var/lib/neutron/dhcp/9b525551-ca72-4804-b27f-0c9808ee3709/addn_hosts - 0 addresses Dec 6 05:18:54 localhost dnsmasq-dhcp[248557]: read /var/lib/neutron/dhcp/9b525551-ca72-4804-b27f-0c9808ee3709/host Dec 6 05:18:54 localhost dnsmasq-dhcp[248557]: read /var/lib/neutron/dhcp/9b525551-ca72-4804-b27f-0c9808ee3709/opts Dec 6 05:18:54 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:54.874 219384 INFO neutron.agent.dhcp.agent [None req-2cd0a44c-6f54-4ae3-8309-737f5c24822f - - - - - -] DHCP configuration for ports {'f401fc8b-d74a-4be0-b253-323a245d6df5'} is completed#033[00m Dec 6 05:18:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26309 DF PROTO=TCP SPT=41586 DPT=9102 SEQ=2662004881 ACK=0 WINDOW=32640 RES=0x00 SYN 
URGP=0 OPT (020405500402080A1DDA32C80000000001030307) Dec 6 05:18:55 localhost nova_compute[237281]: 2025-12-06 10:18:55.397 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:55 localhost nova_compute[237281]: 2025-12-06 10:18:55.508 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62303 DF PROTO=TCP SPT=55600 DPT=9102 SEQ=958516846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDA35870000000001030307) Dec 6 05:18:55 localhost neutron_sriov_agent[212548]: 2025-12-06 10:18:55.963 2 INFO neutron.agent.securitygroups_rpc [req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 req-8affb40d-78fd-4c54-834e-5124437efaf7 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Security group member updated ['a3402b43-c129-4c99-9939-dcb84895acdb']#033[00m Dec 6 05:18:56 localhost nova_compute[237281]: 2025-12-06 10:18:56.830 237285 DEBUG nova.network.neutron [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Successfully created port: 01e51dda-8924-4663-86dd-9d9e200f064e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m Dec 6 05:18:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26310 DF PROTO=TCP SPT=41586 DPT=9102 SEQ=2662004881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDA3AC70000000001030307) Dec 6 05:18:57 localhost 
neutron_dhcp_agent[219380]: 2025-12-06 10:18:57.297 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:56Z, description=, device_id=6887ed8b-0deb-4633-8817-06371e2d833c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=35035436-4557-4de1-b5f4-bfe7e71862be, ip_allocation=immediate, mac_address=fa:16:3e:e5:78:19, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:50Z, description=, dns_domain=, id=9b525551-ca72-4804-b27f-0c9808ee3709, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-1364532810-network, port_security_enabled=True, project_id=cfaa880c4e9b463d924febc9999ed70c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=26566, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=898, status=ACTIVE, subnets=['f7a52947-c77e-4aef-90af-87931200bfce'], tags=[], tenant_id=cfaa880c4e9b463d924febc9999ed70c, updated_at=2025-12-06T10:18:51Z, vlan_transparent=None, network_id=9b525551-ca72-4804-b27f-0c9808ee3709, port_security_enabled=False, project_id=cfaa880c4e9b463d924febc9999ed70c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=912, status=DOWN, tags=[], tenant_id=cfaa880c4e9b463d924febc9999ed70c, updated_at=2025-12-06T10:18:57Z on network 9b525551-ca72-4804-b27f-0c9808ee3709#033[00m Dec 6 05:18:57 localhost nova_compute[237281]: 2025-12-06 10:18:57.356 237285 DEBUG oslo_concurrency.lockutils [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - 
default default] Acquiring lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:18:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:18:57 localhost dnsmasq[248557]: read /var/lib/neutron/dhcp/9b525551-ca72-4804-b27f-0c9808ee3709/addn_hosts - 1 addresses Dec 6 05:18:57 localhost dnsmasq-dhcp[248557]: read /var/lib/neutron/dhcp/9b525551-ca72-4804-b27f-0c9808ee3709/host Dec 6 05:18:57 localhost dnsmasq-dhcp[248557]: read /var/lib/neutron/dhcp/9b525551-ca72-4804-b27f-0c9808ee3709/opts Dec 6 05:18:57 localhost podman[248574]: 2025-12-06 10:18:57.530327911 +0000 UTC m=+0.065000006 container kill 4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b525551-ca72-4804-b27f-0c9808ee3709, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:18:57 localhost systemd[1]: tmp-crun.OfpauR.mount: Deactivated successfully. Dec 6 05:18:57 localhost systemd[1]: tmp-crun.SSguqk.mount: Deactivated successfully. 
Dec 6 05:18:57 localhost podman[248575]: 2025-12-06 10:18:57.622931278 +0000 UTC m=+0.148900365 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125) Dec 6 05:18:57 localhost podman[248575]: 2025-12-06 10:18:57.702295847 +0000 UTC m=+0.228264924 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:18:57 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:18:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5754 DF PROTO=TCP SPT=47828 DPT=9102 SEQ=121858893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDA3D870000000001030307) Dec 6 05:18:57 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:57.861 219384 INFO neutron.agent.dhcp.agent [None req-afcee505-303f-4e6c-ab65-1a61206f3607 - - - - - -] DHCP configuration for ports {'35035436-4557-4de1-b5f4-bfe7e71862be'} is completed#033[00m Dec 6 05:18:59 localhost nova_compute[237281]: 2025-12-06 10:18:59.052 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:18:59 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:18:59.569 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:56Z, description=, device_id=6887ed8b-0deb-4633-8817-06371e2d833c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=35035436-4557-4de1-b5f4-bfe7e71862be, ip_allocation=immediate, mac_address=fa:16:3e:e5:78:19, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:50Z, description=, dns_domain=, id=9b525551-ca72-4804-b27f-0c9808ee3709, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-1364532810-network, port_security_enabled=True, project_id=cfaa880c4e9b463d924febc9999ed70c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=26566, qos_policy_id=None, revision_number=2, router:external=False, shared=False, 
standard_attr_id=898, status=ACTIVE, subnets=['f7a52947-c77e-4aef-90af-87931200bfce'], tags=[], tenant_id=cfaa880c4e9b463d924febc9999ed70c, updated_at=2025-12-06T10:18:51Z, vlan_transparent=None, network_id=9b525551-ca72-4804-b27f-0c9808ee3709, port_security_enabled=False, project_id=cfaa880c4e9b463d924febc9999ed70c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=912, status=DOWN, tags=[], tenant_id=cfaa880c4e9b463d924febc9999ed70c, updated_at=2025-12-06T10:18:57Z on network 9b525551-ca72-4804-b27f-0c9808ee3709#033[00m Dec 6 05:18:59 localhost dnsmasq[248557]: read /var/lib/neutron/dhcp/9b525551-ca72-4804-b27f-0c9808ee3709/addn_hosts - 1 addresses Dec 6 05:18:59 localhost podman[248640]: 2025-12-06 10:18:59.844597187 +0000 UTC m=+0.058439953 container kill 4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b525551-ca72-4804-b27f-0c9808ee3709, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 6 05:18:59 localhost dnsmasq-dhcp[248557]: read /var/lib/neutron/dhcp/9b525551-ca72-4804-b27f-0c9808ee3709/host Dec 6 05:18:59 localhost dnsmasq-dhcp[248557]: read /var/lib/neutron/dhcp/9b525551-ca72-4804-b27f-0c9808ee3709/opts Dec 6 05:19:00 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:00.415 219384 INFO neutron.agent.dhcp.agent [None req-1864ee9b-db80-4159-9ad6-ff719df4d61d - - - - - -] DHCP configuration for ports {'35035436-4557-4de1-b5f4-bfe7e71862be'} is completed#033[00m Dec 6 05:19:00 localhost nova_compute[237281]: 2025-12-06 10:19:00.429 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:00 localhost nova_compute[237281]: 2025-12-06 10:19:00.977 237285 DEBUG nova.network.neutron [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Successfully updated port: 01e51dda-8924-4663-86dd-9d9e200f064e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m Dec 6 05:19:00 localhost nova_compute[237281]: 2025-12-06 10:19:00.996 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Acquiring lock "refresh_cache-a762ea39-4184-4fba-8fd6-e4390fdf75fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:19:00 localhost nova_compute[237281]: 2025-12-06 10:19:00.996 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Acquired lock "refresh_cache-a762ea39-4184-4fba-8fd6-e4390fdf75fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:19:00 localhost nova_compute[237281]: 2025-12-06 10:19:00.997 237285 DEBUG nova.network.neutron [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Dec 6 05:19:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26311 DF PROTO=TCP SPT=41586 DPT=9102 
SEQ=2662004881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDA4A880000000001030307) Dec 6 05:19:01 localhost nova_compute[237281]: 2025-12-06 10:19:01.950 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:19:01 localhost nova_compute[237281]: 2025-12-06 10:19:01.982 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:19:01 localhost nova_compute[237281]: 2025-12-06 10:19:01.982 237285 DEBUG nova.compute.manager [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:19:01 localhost nova_compute[237281]: 2025-12-06 10:19:01.983 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:19:01 localhost nova_compute[237281]: 2025-12-06 10:19:01.983 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.021 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.022 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.022 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.023 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.118 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.155 237285 DEBUG nova.network.neutron [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.193 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.195 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.266 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.268 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.339 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.341 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.413 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:19:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:19:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:19:02 localhost systemd[1]: tmp-crun.YGkjhK.mount: Deactivated successfully. 
Dec 6 05:19:02 localhost podman[248673]: 2025-12-06 10:19:02.571102542 +0000 UTC m=+0.095833319 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:19:02 localhost podman[248672]: 2025-12-06 10:19:02.613719666 +0000 UTC m=+0.141119985 container health_status 
4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:19:02 localhost podman[248673]: 2025-12-06 10:19:02.633008881 +0000 UTC m=+0.157739668 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 05:19:02 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:19:02 localhost podman[248672]: 2025-12-06 10:19:02.652182943 +0000 UTC m=+0.179583272 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:19:02 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.671 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.672 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12299MB free_disk=387.26635360717773GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.673 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.673 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.774 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.774 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a762ea39-4184-4fba-8fd6-e4390fdf75fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.775 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.775 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1152MB phys_disk=399GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.874 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.897 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:19:02 localhost 
nova_compute[237281]: 2025-12-06 10:19:02.930 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:19:02 localhost nova_compute[237281]: 2025-12-06 10:19:02.931 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:03 localhost nova_compute[237281]: 2025-12-06 10:19:03.736 237285 DEBUG nova.compute.manager [req-355316ff-982e-4740-bc3d-e5f3bb9ed6a7 req-1a452d12-9d22-4dec-b872-d5af370aa6ec 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Received event network-changed-01e51dda-8924-4663-86dd-9d9e200f064e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:19:03 localhost nova_compute[237281]: 2025-12-06 10:19:03.736 237285 DEBUG nova.compute.manager [req-355316ff-982e-4740-bc3d-e5f3bb9ed6a7 req-1a452d12-9d22-4dec-b872-d5af370aa6ec 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Refreshing instance network info cache due to event network-changed-01e51dda-8924-4663-86dd-9d9e200f064e. 
external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m Dec 6 05:19:03 localhost nova_compute[237281]: 2025-12-06 10:19:03.737 237285 DEBUG oslo_concurrency.lockutils [req-355316ff-982e-4740-bc3d-e5f3bb9ed6a7 req-1a452d12-9d22-4dec-b872-d5af370aa6ec 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "refresh_cache-a762ea39-4184-4fba-8fd6-e4390fdf75fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:19:04 localhost nova_compute[237281]: 2025-12-06 10:19:04.113 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:05 localhost nova_compute[237281]: 2025-12-06 10:19:05.459 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:05 localhost nova_compute[237281]: 2025-12-06 10:19:05.989 237285 DEBUG nova.network.neutron [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Updating instance_info_cache with network_info: [{"id": "01e51dda-8924-4663-86dd-9d9e200f064e", "address": "fa:16:3e:c3:31:23", "network": {"id": "d2616f3b-ebf7-4ff7-989a-234db45c7a91", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1775521251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "9def7e7271f8404db90dc0d9d3faf8c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": 
true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01e51dda-89", "ovs_interfaceid": "01e51dda-8924-4663-86dd-9d9e200f064e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.013 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Releasing lock "refresh_cache-a762ea39-4184-4fba-8fd6-e4390fdf75fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.014 237285 DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Instance network_info: |[{"id": "01e51dda-8924-4663-86dd-9d9e200f064e", "address": "fa:16:3e:c3:31:23", "network": {"id": "d2616f3b-ebf7-4ff7-989a-234db45c7a91", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1775521251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "9def7e7271f8404db90dc0d9d3faf8c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": 
"ovn"}}, "devname": "tap01e51dda-89", "ovs_interfaceid": "01e51dda-8924-4663-86dd-9d9e200f064e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.014 237285 DEBUG oslo_concurrency.lockutils [req-355316ff-982e-4740-bc3d-e5f3bb9ed6a7 req-1a452d12-9d22-4dec-b872-d5af370aa6ec 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquired lock "refresh_cache-a762ea39-4184-4fba-8fd6-e4390fdf75fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.014 237285 DEBUG nova.network.neutron [req-355316ff-982e-4740-bc3d-e5f3bb9ed6a7 req-1a452d12-9d22-4dec-b872-d5af370aa6ec 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Refreshing network info cache for port 01e51dda-8924-4663-86dd-9d9e200f064e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.020 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Start _get_guest_xml network_info=[{"id": "01e51dda-8924-4663-86dd-9d9e200f064e", "address": "fa:16:3e:c3:31:23", "network": {"id": "d2616f3b-ebf7-4ff7-989a-234db45c7a91", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1775521251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "9def7e7271f8404db90dc0d9d3faf8c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01e51dda-89", "ovs_interfaceid": "01e51dda-8924-4663-86dd-9d9e200f064e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:13:57Z,direct_url=,disk_format='qcow2',id=8eeec8d4-c6be-4c95-9cb2-1a047e96c028,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='47835b89168945138751a4b216280589',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2025-12-06T10:13:59Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'image_id': '8eeec8d4-c6be-4c95-9cb2-1a047e96c028'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.026 237285 WARNING 
nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.029 237285 DEBUG nova.virt.libvirt.host [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Searching host: 'np0005548798.ooo.test' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.030 237285 DEBUG nova.virt.libvirt.host [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.032 237285 DEBUG nova.virt.libvirt.host [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Searching host: 'np0005548798.ooo.test' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.033 237285 DEBUG nova.virt.libvirt.host [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] CPU controller found on host. 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.034 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.034 237285 DEBUG nova.virt.hardware [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-06T10:13:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='de034496-40b7-4669-ab81-19110fbda990',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-06T10:13:57Z,direct_url=,disk_format='qcow2',id=8eeec8d4-c6be-4c95-9cb2-1a047e96c028,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='47835b89168945138751a4b216280589',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2025-12-06T10:13:59Z,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.035 237285 DEBUG nova.virt.hardware [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.035 237285 DEBUG nova.virt.hardware [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.036 237285 DEBUG nova.virt.hardware [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.037 237285 DEBUG nova.virt.hardware [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.037 237285 DEBUG nova.virt.hardware [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.037 237285 DEBUG nova.virt.hardware [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.038 237285 DEBUG nova.virt.hardware [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.038 237285 DEBUG nova.virt.hardware [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.039 237285 DEBUG nova.virt.hardware [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.039 237285 DEBUG nova.virt.hardware [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.046 237285 DEBUG nova.virt.libvirt.vif [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:18:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1705009981',display_name='tempest-DeleteServersTestJSON-server-1705009981',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005548798.ooo.test',hostname='tempest-deleteserverstestjson-server-1705009981',id=11,image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005548798.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9def7e7271f8404db90dc0d9d3faf8c3',ramdisk_id='',reservation_id='r-s4t0fz6i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1679703708',owner_user_name='tempest-DeleteServersTestJSON-1679703708-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:18:52Z,user_data=None,user_id='407472656c6040578cb2bdc3b4288953',uuid=a762ea39-4184-4fba-8fd6-e4390fdf75fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='bu
ilding') vif={"id": "01e51dda-8924-4663-86dd-9d9e200f064e", "address": "fa:16:3e:c3:31:23", "network": {"id": "d2616f3b-ebf7-4ff7-989a-234db45c7a91", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1775521251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "9def7e7271f8404db90dc0d9d3faf8c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01e51dda-89", "ovs_interfaceid": "01e51dda-8924-4663-86dd-9d9e200f064e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.047 237285 DEBUG nova.network.os_vif_util [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Converting VIF {"id": "01e51dda-8924-4663-86dd-9d9e200f064e", "address": "fa:16:3e:c3:31:23", "network": {"id": "d2616f3b-ebf7-4ff7-989a-234db45c7a91", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1775521251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": 
{"injected": false, "tenant_id": "9def7e7271f8404db90dc0d9d3faf8c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01e51dda-89", "ovs_interfaceid": "01e51dda-8924-4663-86dd-9d9e200f064e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.048 237285 DEBUG nova.network.os_vif_util [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:31:23,bridge_name='br-int',has_traffic_filtering=True,id=01e51dda-8924-4663-86dd-9d9e200f064e,network=Network(d2616f3b-ebf7-4ff7-989a-234db45c7a91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01e51dda-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.051 237285 DEBUG nova.objects.instance [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lazy-loading 'pci_devices' on Instance uuid a762ea39-4184-4fba-8fd6-e4390fdf75fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.068 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] End _get_guest_xml xml= Dec 
6 05:19:06 localhost nova_compute[237281]: a762ea39-4184-4fba-8fd6-e4390fdf75fd Dec 6 05:19:06 localhost nova_compute[237281]: instance-0000000b Dec 6 05:19:06 localhost nova_compute[237281]: 131072 Dec 6 05:19:06 localhost nova_compute[237281]: 1 Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: tempest-DeleteServersTestJSON-server-1705009981 Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06 Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: 128 Dec 6 05:19:06 localhost nova_compute[237281]: 1 Dec 6 05:19:06 localhost nova_compute[237281]: 0 Dec 6 05:19:06 localhost nova_compute[237281]: 0 Dec 6 05:19:06 localhost nova_compute[237281]: 1 Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: tempest-DeleteServersTestJSON-1679703708-project-member Dec 6 05:19:06 localhost nova_compute[237281]: tempest-DeleteServersTestJSON-1679703708 Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: RDO Dec 6 05:19:06 localhost nova_compute[237281]: OpenStack Compute Dec 6 05:19:06 localhost nova_compute[237281]: 27.5.2-0.20250829104910.6f8decf.el9 Dec 6 05:19:06 localhost nova_compute[237281]: a762ea39-4184-4fba-8fd6-e4390fdf75fd Dec 6 05:19:06 localhost nova_compute[237281]: 
a762ea39-4184-4fba-8fd6-e4390fdf75fd Dec 6 05:19:06 localhost nova_compute[237281]: Virtual Machine Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: hvm Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 
localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: /dev/urandom Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: Dec 6 05:19:06 localhost nova_compute[237281]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.070 237285 
DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Preparing to wait for external event network-vif-plugged-01e51dda-8924-4663-86dd-9d9e200f064e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.071 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Acquiring lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.072 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.072 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.073 237285 DEBUG nova.virt.libvirt.vif [None 
req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T10:18:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1705009981',display_name='tempest-DeleteServersTestJSON-server-1705009981',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005548798.ooo.test',hostname='tempest-deleteserverstestjson-server-1705009981',id=11,image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005548798.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9def7e7271f8404db90dc0d9d3faf8c3',ramdisk_id='',reservation_id='r-s4t0fz6i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1679703708',owner_user_name='tempest-DeleteServersTestJSON-1679703708-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:18:52Z,use
r_data=None,user_id='407472656c6040578cb2bdc3b4288953',uuid=a762ea39-4184-4fba-8fd6-e4390fdf75fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "01e51dda-8924-4663-86dd-9d9e200f064e", "address": "fa:16:3e:c3:31:23", "network": {"id": "d2616f3b-ebf7-4ff7-989a-234db45c7a91", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1775521251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "9def7e7271f8404db90dc0d9d3faf8c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01e51dda-89", "ovs_interfaceid": "01e51dda-8924-4663-86dd-9d9e200f064e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.073 237285 DEBUG nova.network.os_vif_util [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Converting VIF {"id": "01e51dda-8924-4663-86dd-9d9e200f064e", "address": "fa:16:3e:c3:31:23", "network": {"id": "d2616f3b-ebf7-4ff7-989a-234db45c7a91", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1775521251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "9def7e7271f8404db90dc0d9d3faf8c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01e51dda-89", "ovs_interfaceid": "01e51dda-8924-4663-86dd-9d9e200f064e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.075 237285 DEBUG nova.network.os_vif_util [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:31:23,bridge_name='br-int',has_traffic_filtering=True,id=01e51dda-8924-4663-86dd-9d9e200f064e,network=Network(d2616f3b-ebf7-4ff7-989a-234db45c7a91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01e51dda-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.075 237285 DEBUG os_vif [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:31:23,bridge_name='br-int',has_traffic_filtering=True,id=01e51dda-8924-4663-86dd-9d9e200f064e,network=Network(d2616f3b-ebf7-4ff7-989a-234db45c7a91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01e51dda-89') plug 
/usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.076 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.077 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.077 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.081 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.082 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01e51dda-89, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.082 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap01e51dda-89, col_values=(('external_ids', {'iface-id': '01e51dda-8924-4663-86dd-9d9e200f064e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:31:23', 'vm-uuid': 'a762ea39-4184-4fba-8fd6-e4390fdf75fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.084 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.088 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.092 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.093 237285 INFO os_vif [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:31:23,bridge_name='br-int',has_traffic_filtering=True,id=01e51dda-8924-4663-86dd-9d9e200f064e,network=Network(d2616f3b-ebf7-4ff7-989a-234db45c7a91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01e51dda-89')#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.157 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.157 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] No BDM found with device name sda, not building metadata. 
_build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.158 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] No VIF found with MAC fa:16:3e:c3:31:23, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m Dec 6 05:19:06 localhost nova_compute[237281]: 2025-12-06 10:19:06.159 237285 INFO nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Using config drive#033[00m Dec 6 05:19:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:06.703 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:19:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:06.703 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:06.704 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 05:19:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:19:07 localhost podman[248715]: 2025-12-06 10:19:07.541515352 +0000 UTC m=+0.077187272 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent) Dec 6 05:19:07 localhost podman[248716]: 2025-12-06 10:19:07.600018877 +0000 UTC m=+0.130632981 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible) Dec 6 05:19:07 localhost podman[248716]: 2025-12-06 10:19:07.615617118 +0000 UTC m=+0.146231262 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 6 05:19:07 localhost podman[248715]: 2025-12-06 10:19:07.628070153 +0000 UTC m=+0.163742073 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:19:07 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:19:07 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:19:08 localhost nova_compute[237281]: 2025-12-06 10:19:08.612 237285 INFO nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Creating config drive at /var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/disk.config#033[00m Dec 6 05:19:08 localhost nova_compute[237281]: 2025-12-06 10:19:08.618 237285 DEBUG oslo_concurrency.processutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq_qn7mrj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:19:08 localhost nova_compute[237281]: 
2025-12-06 10:19:08.746 237285 DEBUG oslo_concurrency.processutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq_qn7mrj" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:19:08 localhost kernel: device tap01e51dda-89 entered promiscuous mode Dec 6 05:19:08 localhost NetworkManager[5965]: [1765016348.8104] manager: (tap01e51dda-89): new Tun device (/org/freedesktop/NetworkManager/Devices/28) Dec 6 05:19:08 localhost systemd-udevd[248764]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:19:08 localhost nova_compute[237281]: 2025-12-06 10:19:08.816 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:08 localhost ovn_controller[131684]: 2025-12-06T10:19:08Z|00149|binding|INFO|Claiming lport 01e51dda-8924-4663-86dd-9d9e200f064e for this chassis. 
Dec 6 05:19:08 localhost ovn_controller[131684]: 2025-12-06T10:19:08Z|00150|binding|INFO|01e51dda-8924-4663-86dd-9d9e200f064e: Claiming fa:16:3e:c3:31:23 10.100.0.11 Dec 6 05:19:08 localhost nova_compute[237281]: 2025-12-06 10:19:08.824 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:08 localhost nova_compute[237281]: 2025-12-06 10:19:08.825 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:08 localhost ovn_controller[131684]: 2025-12-06T10:19:08Z|00151|binding|INFO|Setting lport 01e51dda-8924-4663-86dd-9d9e200f064e ovn-installed in OVS Dec 6 05:19:08 localhost ovn_controller[131684]: 2025-12-06T10:19:08Z|00152|binding|INFO|Setting lport 01e51dda-8924-4663-86dd-9d9e200f064e up in Southbound Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.826 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:31:23 10.100.0.11'], port_security=['fa:16:3e:c3:31:23 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a762ea39-4184-4fba-8fd6-e4390fdf75fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2616f3b-ebf7-4ff7-989a-234db45c7a91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9def7e7271f8404db90dc0d9d3faf8c3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a3402b43-c129-4c99-9939-dcb84895acdb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 
'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9cb5a99-db72-48ab-8670-f6428ddb0d8d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=01e51dda-8924-4663-86dd-9d9e200f064e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:19:08 localhost nova_compute[237281]: 2025-12-06 10:19:08.828 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.829 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 01e51dda-8924-4663-86dd-9d9e200f064e in datapath d2616f3b-ebf7-4ff7-989a-234db45c7a91 bound to our chassis#033[00m Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.833 137259 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d2616f3b-ebf7-4ff7-989a-234db45c7a91#033[00m Dec 6 05:19:08 localhost NetworkManager[5965]: [1765016348.8366] device (tap01e51dda-89): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Dec 6 05:19:08 localhost NetworkManager[5965]: [1765016348.8375] device (tap01e51dda-89): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Dec 6 05:19:08 localhost nova_compute[237281]: 2025-12-06 10:19:08.837 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.849 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[49d70890-6415-4375-9c78-b3d1a06ee8c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.850 137259 DEBUG 
neutron.agent.ovn.metadata.agent [-] Creating VETH tapd2616f3b-e1 in ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Dec 6 05:19:08 localhost systemd-machined[68273]: New machine qemu-4-instance-0000000b. Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.855 137360 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd2616f3b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.856 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[06d59684-45e2-447d-b10a-c823c8fd28b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.857 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[d8ea9230-236c-4a9a-bc4d-dc144d3df715]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:08 localhost systemd[1]: Started Virtual Machine qemu-4-instance-0000000b. 
Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.868 137391 DEBUG oslo.privsep.daemon [-] privsep: reply[b34b8a6e-7678-4027-a657-7edbf7472927]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.884 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[af4714f1-3978-41bf-892e-73c2892f0e92]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.912 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[45117720-dbb7-4ea6-8dab-315c90fc6548]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.918 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[085bb477-1f0d-46b6-8e83-11bd733dfbe7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:08 localhost systemd-udevd[248768]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:19:08 localhost NetworkManager[5965]: [1765016348.9202] manager: (tapd2616f3b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/29) Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.954 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[aa148d66-7894-4291-b354-d68d41d907cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.957 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[476bf0d5-66f6-4a64-84ec-950b708dbd6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:08 localhost NetworkManager[5965]: [1765016348.9795] device (tapd2616f3b-e0): carrier: link connected Dec 6 05:19:08 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapd2616f3b-e1: link becomes ready Dec 6 05:19:08 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapd2616f3b-e0: link becomes ready Dec 6 05:19:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:08.986 137371 DEBUG oslo.privsep.daemon [-] privsep: reply[752854c9-8796-4333-af6c-4e8061ac0f9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:09.004 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[e932cc97-e99b-419e-896c-dcce2f678557]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2616f3b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 
2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:96:77:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 
1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1273611, 'reachable_time': 35070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248800, 'error': None, 'target': 'ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:09.021 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[2fe411bd-b299-415c-9650-8938899847c6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe96:771b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1273611, 'tstamp': 1273611}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248801, 'error': None, 'target': 'ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:09.038 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[9a3cdbf9-1e50-4868-a7fb-64bc1f94f982]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2616f3b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:96:77:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 
'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1273611, 'reachable_time': 35070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 
0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248802, 'error': None, 'target': 'ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:09.066 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[9a66c1b9-cff0-44d5-8042-a7d607b5f91e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:09.118 137360 DEBUG oslo.privsep.daemon [-] privsep: 
reply[1fb9258d-1ce8-4187-9a93-9bddedab8a48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:09.121 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2616f3b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:09.122 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:09.123 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2616f3b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:19:09 localhost kernel: device tapd2616f3b-e0 entered promiscuous mode Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.158 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.162 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:09.164 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd2616f3b-e0, col_values=(('external_ids', {'iface-id': '8d976a5b-f122-4262-9134-6bc993b87f15'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:19:09 localhost 
ovn_controller[131684]: 2025-12-06T10:19:09Z|00153|binding|INFO|Releasing lport 8d976a5b-f122-4262-9134-6bc993b87f15 from this chassis (sb_readonly=0) Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.168 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.169 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:09.170 137259 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d2616f3b-ebf7-4ff7-989a-234db45c7a91.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d2616f3b-ebf7-4ff7-989a-234db45c7a91.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:09.174 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[ec12c180-51f5-4d3a-96e6-70b9307210fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:09.175 137259 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: global Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: log /dev/log local0 debug Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: log-tag haproxy-metadata-proxy-d2616f3b-ebf7-4ff7-989a-234db45c7a91 Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: user root Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: group root Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: maxconn 1024 Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: pidfile 
/var/lib/neutron/external/pids/d2616f3b-ebf7-4ff7-989a-234db45c7a91.pid.haproxy Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: daemon Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: defaults Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: log global Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: mode http Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: option httplog Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: option dontlognull Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: option http-server-close Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: option forwardfor Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: retries 3 Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: timeout http-request 30s Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: timeout connect 30s Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: timeout client 32s Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: timeout server 32s Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: timeout http-keep-alive 30s Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: listen listener Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: bind 169.254.169.254:80 Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: server metadata /var/lib/neutron/metadata_proxy Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: http-request add-header X-OVN-Network-ID d2616f3b-ebf7-4ff7-989a-234db45c7a91 Dec 6 05:19:09 localhost ovn_metadata_agent[137254]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.176 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:09 localhost 
ovn_metadata_agent[137254]: 2025-12-06 10:19:09.178 137259 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91', 'env', 'PROCESS_TAG=haproxy-d2616f3b-ebf7-4ff7-989a-234db45c7a91', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d2616f3b-ebf7-4ff7-989a-234db45c7a91.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.328 237285 DEBUG nova.virt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.328 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] VM Started (Lifecycle Event)#033[00m Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.351 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.356 237285 DEBUG nova.virt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.356 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] VM Paused (Lifecycle Event)#033[00m Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.385 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] 
[instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.390 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: deleting, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 6 05:19:09 localhost nova_compute[237281]: 2025-12-06 10:19:09.418 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m Dec 6 05:19:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26312 DF PROTO=TCP SPT=41586 DPT=9102 SEQ=2662004881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDA6B870000000001030307) Dec 6 05:19:09 localhost podman[248842]: Dec 6 05:19:09 localhost podman[248842]: 2025-12-06 10:19:09.62177873 +0000 UTC m=+0.096357434 container create f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Dec 6 05:19:09 localhost 
systemd[1]: Started libpod-conmon-f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598.scope. Dec 6 05:19:09 localhost systemd[1]: Started libcrun container. Dec 6 05:19:09 localhost podman[248842]: 2025-12-06 10:19:09.576930215 +0000 UTC m=+0.051508989 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 6 05:19:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbd9a8785eceae46b7e629a4e5861a8b94d8e8c5602dd6119e267459bb3aa9d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:19:09 localhost podman[248842]: 2025-12-06 10:19:09.686695223 +0000 UTC m=+0.161273927 container init f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Dec 6 05:19:09 localhost podman[248842]: 2025-12-06 10:19:09.695210375 +0000 UTC m=+0.169789079 container start f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Dec 6 05:19:09 localhost 
neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91[248856]: [NOTICE] (248860) : New worker (248862) forked Dec 6 05:19:09 localhost neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91[248856]: [NOTICE] (248860) : Loading success. Dec 6 05:19:10 localhost nova_compute[237281]: 2025-12-06 10:19:10.461 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:11 localhost nova_compute[237281]: 2025-12-06 10:19:11.085 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.003 237285 DEBUG nova.compute.manager [req-97d9346b-b1ec-485e-ae3f-438f30f33af7 req-2a9f6d24-cce7-4f15-8071-9055734b83de 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Received event network-vif-plugged-01e51dda-8924-4663-86dd-9d9e200f064e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.004 237285 DEBUG oslo_concurrency.lockutils [req-97d9346b-b1ec-485e-ae3f-438f30f33af7 req-2a9f6d24-cce7-4f15-8071-9055734b83de 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.004 237285 DEBUG oslo_concurrency.lockutils [req-97d9346b-b1ec-485e-ae3f-438f30f33af7 req-2a9f6d24-cce7-4f15-8071-9055734b83de 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.005 237285 DEBUG oslo_concurrency.lockutils [req-97d9346b-b1ec-485e-ae3f-438f30f33af7 req-2a9f6d24-cce7-4f15-8071-9055734b83de 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.005 237285 DEBUG nova.compute.manager [req-97d9346b-b1ec-485e-ae3f-438f30f33af7 req-2a9f6d24-cce7-4f15-8071-9055734b83de 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Processing event network-vif-plugged-01e51dda-8924-4663-86dd-9d9e200f064e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.007 237285 DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.012 237285 DEBUG nova.virt.driver [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.012 237285 INFO nova.compute.manager [None 
req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] VM Resumed (Lifecycle Event)#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.016 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.021 237285 INFO nova.virt.libvirt.driver [-] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Instance spawned successfully.#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.021 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.047 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.056 237285 DEBUG nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: deleting, current DB power_state: 0, VM 
power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.079 237285 INFO nova.compute.manager [None req-c20ccd38-76b4-4bc4-bcbd-344504c96e91 - - - - - -] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.108 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.108 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.109 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.110 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: 
a762ea39-4184-4fba-8fd6-e4390fdf75fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.110 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.111 237285 DEBUG nova.virt.libvirt.driver [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.181 237285 INFO nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Took 20.49 seconds to spawn the instance on the hypervisor.#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.182 237285 DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.194 237285 DEBUG nova.network.neutron [req-355316ff-982e-4740-bc3d-e5f3bb9ed6a7 
req-1a452d12-9d22-4dec-b872-d5af370aa6ec 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Updated VIF entry in instance network info cache for port 01e51dda-8924-4663-86dd-9d9e200f064e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.195 237285 DEBUG nova.network.neutron [req-355316ff-982e-4740-bc3d-e5f3bb9ed6a7 req-1a452d12-9d22-4dec-b872-d5af370aa6ec 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Updating instance_info_cache with network_info: [{"id": "01e51dda-8924-4663-86dd-9d9e200f064e", "address": "fa:16:3e:c3:31:23", "network": {"id": "d2616f3b-ebf7-4ff7-989a-234db45c7a91", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1775521251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "9def7e7271f8404db90dc0d9d3faf8c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01e51dda-89", "ovs_interfaceid": "01e51dda-8924-4663-86dd-9d9e200f064e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.247 
237285 DEBUG oslo_concurrency.lockutils [req-355316ff-982e-4740-bc3d-e5f3bb9ed6a7 req-1a452d12-9d22-4dec-b872-d5af370aa6ec 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Releasing lock "refresh_cache-a762ea39-4184-4fba-8fd6-e4390fdf75fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.259 237285 DEBUG nova.compute.utils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Conflict updating instance a762ea39-4184-4fba-8fd6-e4390fdf75fd. Expected: {'task_state': ['spawning']}. Actual: {'task_state': 'deleting'} notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.260 237285 DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Instance disappeared during build. 
_do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2483#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.260 237285 DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.261 237285 DEBUG nova.virt.libvirt.vif [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-06T10:18:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1705009981',display_name='tempest-DeleteServersTestJSON-server-1705009981',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005548798.ooo.test',hostname='tempest-deleteserverstestjson-server-1705009981',id=11,image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=2025-12-06T10:19:13Z,launched_on='np0005548798.ooo.test',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005548798.ooo.test',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9def7e7271f8404db90dc0d9d3faf8c3',ramdisk_id='',reservation_id='r-s4t0fz6i',r
esources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8eeec8d4-c6be-4c95-9cb2-1a047e96c028',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-1679703708',owner_user_name='tempest-DeleteServersTestJSON-1679703708-project-member'},tags=TagList,task_state=None,terminated_at=None,trusted_certs=None,updated_at=2025-12-06T10:18:57Z,user_data=None,user_id='407472656c6040578cb2bdc3b4288953',uuid=a762ea39-4184-4fba-8fd6-e4390fdf75fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01e51dda-8924-4663-86dd-9d9e200f064e", "address": "fa:16:3e:c3:31:23", "network": {"id": "d2616f3b-ebf7-4ff7-989a-234db45c7a91", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1775521251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "9def7e7271f8404db90dc0d9d3faf8c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01e51dda-89", "ovs_interfaceid": "01e51dda-8924-4663-86dd-9d9e200f064e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.262 237285 DEBUG nova.network.os_vif_util [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Converting VIF {"id": "01e51dda-8924-4663-86dd-9d9e200f064e", "address": "fa:16:3e:c3:31:23", "network": {"id": "d2616f3b-ebf7-4ff7-989a-234db45c7a91", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1775521251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "9def7e7271f8404db90dc0d9d3faf8c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01e51dda-89", "ovs_interfaceid": "01e51dda-8924-4663-86dd-9d9e200f064e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.263 237285 DEBUG nova.network.os_vif_util [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:c3:31:23,bridge_name='br-int',has_traffic_filtering=True,id=01e51dda-8924-4663-86dd-9d9e200f064e,network=Network(d2616f3b-ebf7-4ff7-989a-234db45c7a91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01e51dda-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.263 237285 DEBUG os_vif [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:31:23,bridge_name='br-int',has_traffic_filtering=True,id=01e51dda-8924-4663-86dd-9d9e200f064e,network=Network(d2616f3b-ebf7-4ff7-989a-234db45c7a91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01e51dda-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.265 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.266 237285 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01e51dda-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:19:13 localhost ovn_controller[131684]: 2025-12-06T10:19:13Z|00154|binding|INFO|Releasing lport 01e51dda-8924-4663-86dd-9d9e200f064e from this chassis (sb_readonly=0) Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.269 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:13 localhost ovn_controller[131684]: 
2025-12-06T10:19:13Z|00155|binding|INFO|Setting lport 01e51dda-8924-4663-86dd-9d9e200f064e down in Southbound Dec 6 05:19:13 localhost kernel: device tap01e51dda-89 left promiscuous mode Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.270 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 6 05:19:13 localhost NetworkManager[5965]: [1765016353.2709] device (tap01e51dda-89): state change: disconnected -> unmanaged (reason 'connection-assumed', sys-iface-state: 'external') Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.283 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.291 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:31:23 10.100.0.11'], port_security=['fa:16:3e:c3:31:23 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a762ea39-4184-4fba-8fd6-e4390fdf75fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2616f3b-ebf7-4ff7-989a-234db45c7a91', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9def7e7271f8404db90dc0d9d3faf8c3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a3402b43-c129-4c99-9939-dcb84895acdb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], 
additional_encap=[], encap=[], mirror_rules=[], datapath=b9cb5a99-db72-48ab-8670-f6428ddb0d8d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=01e51dda-8924-4663-86dd-9d9e200f064e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.294 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 01e51dda-8924-4663-86dd-9d9e200f064e in datapath d2616f3b-ebf7-4ff7-989a-234db45c7a91 unbound from our chassis#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.297 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d2616f3b-ebf7-4ff7-989a-234db45c7a91, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.298 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.299 237285 INFO os_vif [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:31:23,bridge_name='br-int',has_traffic_filtering=True,id=01e51dda-8924-4663-86dd-9d9e200f064e,network=Network(d2616f3b-ebf7-4ff7-989a-234db45c7a91),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01e51dda-89')#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.300 237285 DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] 
Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.300 237285 DEBUG nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.301 237285 DEBUG nova.network.neutron [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.299 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[57acb3ba-cef2-4f37-88fb-8b5eae1b6e1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.306 137259 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91 namespace which is not needed anymore#033[00m Dec 6 05:19:13 localhost neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91[248856]: [NOTICE] (248860) : haproxy version is 2.8.14-c23fe91 Dec 6 05:19:13 localhost neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91[248856]: [NOTICE] (248860) : path to executable is /usr/sbin/haproxy Dec 6 05:19:13 localhost neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91[248856]: [WARNING] (248860) : Exiting Master process... 
Dec 6 05:19:13 localhost neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91[248856]: [ALERT] (248860) : Current worker (248862) exited with code 143 (Terminated) Dec 6 05:19:13 localhost neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91[248856]: [WARNING] (248860) : All workers exited. Exiting... (0) Dec 6 05:19:13 localhost systemd[1]: libpod-f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598.scope: Deactivated successfully. Dec 6 05:19:13 localhost podman[248893]: 2025-12-06 10:19:13.505640578 +0000 UTC m=+0.077516293 container died f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 05:19:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:19:13 localhost podman[248893]: 2025-12-06 10:19:13.552764262 +0000 UTC m=+0.124639957 container cleanup f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:19:13 localhost podman[248906]: 2025-12-06 10:19:13.629294913 +0000 UTC m=+0.123051278 container cleanup f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Dec 6 05:19:13 localhost systemd[1]: libpod-conmon-f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598.scope: Deactivated successfully. 
Dec 6 05:19:13 localhost podman[248922]: 2025-12-06 10:19:13.683422252 +0000 UTC m=+0.105699611 container remove f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.690 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[d2c57339-ccfb-41d1-9b3a-802436a21d96]: (4, ('Sat Dec 6 10:19:13 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91 (f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598)\nf105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598\nSat Dec 6 10:19:13 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91 (f105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598)\nf105d78e2321f927ee446ca01a91e5426c4445624ea172d3fb6a13121bf7d598\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.693 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[a23c5e12-02fa-47ff-9551-76f46c5a3b6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.694 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2616f3b-e0, bridge=None, if_exists=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.699 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:13 localhost kernel: device tapd2616f3b-e0 left promiscuous mode Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.708 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:13 localhost nova_compute[237281]: 2025-12-06 10:19:13.709 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.713 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[f504a07b-570a-4d4a-961a-eca9d85d41a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.729 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[cff2c71a-8bdd-452c-981e-6da1302cf0a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.731 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[3d0c8305-2f84-40ff-b8fe-9109f0c88e5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.751 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[351576ac-d58e-4930-886c-ce123cdbd551]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], 
['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 
'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1273604, 'reachable_time': 37491, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 
'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248941, 'error': None, 'target': 'ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.756 137391 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d2616f3b-ebf7-4ff7-989a-234db45c7a91 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Dec 6 05:19:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:13.756 137391 DEBUG oslo.privsep.daemon [-] privsep: reply[29b501d0-7e22-4f9f-a7b8-f4b29bc72db4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:14 localhost systemd[1]: var-lib-containers-storage-overlay-cbd9a8785eceae46b7e629a4e5861a8b94d8e8c5602dd6119e267459bb3aa9d4-merged.mount: Deactivated successfully. Dec 6 05:19:14 localhost systemd[1]: run-netns-ovnmeta\x2dd2616f3b\x2debf7\x2d4ff7\x2d989a\x2d234db45c7a91.mount: Deactivated successfully. Dec 6 05:19:15 localhost neutron_sriov_agent[212548]: 2025-12-06 10:19:15.409 2 INFO neutron.agent.securitygroups_rpc [req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 req-9692fc3b-4dbe-42b7-802f-b8ed0713d6e4 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Security group member updated ['a3402b43-c129-4c99-9939-dcb84895acdb']#033[00m Dec 6 05:19:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.497 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.503 237285 DEBUG nova.network.neutron [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.523 237285 INFO nova.compute.manager [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Took 2.22 seconds to deallocate network for instance.#033[00m Dec 6 05:19:15 localhost systemd[1]: tmp-crun.o6GmxN.mount: Deactivated successfully. Dec 6 05:19:15 localhost podman[248942]: 2025-12-06 10:19:15.579732885 +0000 UTC m=+0.108863250 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, config_id=edpm) Dec 6 05:19:15 localhost podman[248942]: 2025-12-06 10:19:15.594959195 +0000 UTC m=+0.124089590 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7) Dec 6 05:19:15 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.651 237285 INFO nova.scheduler.client.report [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Deleted allocations for instance a762ea39-4184-4fba-8fd6-e4390fdf75fd#033[00m Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.652 237285 DEBUG oslo_concurrency.lockutils [None req-f9194a36-d8cf-4cc8-bedf-f78c99686b74 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 23.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.652 237285 DEBUG oslo_concurrency.lockutils [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 18.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.653 237285 DEBUG oslo_concurrency.lockutils [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Acquiring lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.653 237285 DEBUG oslo_concurrency.lockutils [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - 
default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.654 237285 DEBUG oslo_concurrency.lockutils [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.655 237285 INFO nova.compute.manager [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Terminating instance#033[00m Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.656 237285 DEBUG oslo_concurrency.lockutils [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Acquiring lock "refresh_cache-a762ea39-4184-4fba-8fd6-e4390fdf75fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.657 237285 DEBUG oslo_concurrency.lockutils [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Acquired lock "refresh_cache-a762ea39-4184-4fba-8fd6-e4390fdf75fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.657 237285 DEBUG nova.network.neutron [None req-aafd0eeb-b327-4831-b8c8-457f77948303 
407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Dec 6 05:19:15 localhost nova_compute[237281]: 2025-12-06 10:19:15.931 237285 DEBUG nova.network.neutron [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Dec 6 05:19:16 localhost openstack_network_exporter[199751]: ERROR 10:19:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:19:16 localhost openstack_network_exporter[199751]: ERROR 10:19:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:19:16 localhost openstack_network_exporter[199751]: ERROR 10:19:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:19:16 localhost openstack_network_exporter[199751]: ERROR 10:19:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:19:16 localhost openstack_network_exporter[199751]: Dec 6 05:19:16 localhost openstack_network_exporter[199751]: ERROR 10:19:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:19:16 localhost openstack_network_exporter[199751]: Dec 6 05:19:16 localhost sshd[248963]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:19:16 localhost dnsmasq[248557]: read /var/lib/neutron/dhcp/9b525551-ca72-4804-b27f-0c9808ee3709/addn_hosts - 0 addresses Dec 6 05:19:16 localhost dnsmasq-dhcp[248557]: read /var/lib/neutron/dhcp/9b525551-ca72-4804-b27f-0c9808ee3709/host Dec 6 05:19:16 
localhost podman[248982]: 2025-12-06 10:19:16.537139321 +0000 UTC m=+0.066099410 container kill 4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b525551-ca72-4804-b27f-0c9808ee3709, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:19:16 localhost dnsmasq-dhcp[248557]: read /var/lib/neutron/dhcp/9b525551-ca72-4804-b27f-0c9808ee3709/opts Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.681 237285 DEBUG nova.compute.manager [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Received event network-vif-plugged-01e51dda-8924-4663-86dd-9d9e200f064e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.682 237285 DEBUG oslo_concurrency.lockutils [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.682 237285 DEBUG oslo_concurrency.lockutils [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 
34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.683 237285 DEBUG oslo_concurrency.lockutils [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.683 237285 DEBUG nova.compute.manager [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] No waiting events found dispatching network-vif-plugged-01e51dda-8924-4663-86dd-9d9e200f064e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.683 237285 WARNING nova.compute.manager [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Received unexpected event network-vif-plugged-01e51dda-8924-4663-86dd-9d9e200f064e for instance with vm_state active and task_state None.#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.684 237285 DEBUG nova.compute.manager [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 
34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Received event network-vif-unplugged-01e51dda-8924-4663-86dd-9d9e200f064e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.684 237285 DEBUG oslo_concurrency.lockutils [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.684 237285 DEBUG oslo_concurrency.lockutils [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.685 237285 DEBUG oslo_concurrency.lockutils [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.685 237285 DEBUG nova.compute.manager [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 
1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] No waiting events found dispatching network-vif-unplugged-01e51dda-8924-4663-86dd-9d9e200f064e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.685 237285 WARNING nova.compute.manager [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Received unexpected event network-vif-unplugged-01e51dda-8924-4663-86dd-9d9e200f064e for instance with vm_state active and task_state None.#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.686 237285 DEBUG nova.compute.manager [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Received event network-vif-plugged-01e51dda-8924-4663-86dd-9d9e200f064e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.686 237285 DEBUG oslo_concurrency.lockutils [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Acquiring lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.686 237285 DEBUG oslo_concurrency.lockutils [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 
1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.687 237285 DEBUG oslo_concurrency.lockutils [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.687 237285 DEBUG nova.compute.manager [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] No waiting events found dispatching network-vif-plugged-01e51dda-8924-4663-86dd-9d9e200f064e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.687 237285 WARNING nova.compute.manager [req-b9f30ccd-b65c-40fa-b1fc-6a371adc97c7 req-cea59672-5e46-46cf-a06e-2ba44a48f544 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Received unexpected event network-vif-plugged-01e51dda-8924-4663-86dd-9d9e200f064e for instance with vm_state active and task_state None.#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.755 237285 DEBUG nova.network.neutron [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 
9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.784 237285 DEBUG oslo_concurrency.lockutils [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Releasing lock "refresh_cache-a762ea39-4184-4fba-8fd6-e4390fdf75fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.784 237285 DEBUG nova.compute.manager [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m Dec 6 05:19:16 localhost ovn_controller[131684]: 2025-12-06T10:19:16Z|00156|binding|INFO|Releasing lport 2d0b99fc-df7e-4cc1-959f-29a393d1a20b from this chassis (sb_readonly=0) Dec 6 05:19:16 localhost ovn_controller[131684]: 2025-12-06T10:19:16Z|00157|binding|INFO|Setting lport 2d0b99fc-df7e-4cc1-959f-29a393d1a20b down in Southbound Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.816 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:16 localhost kernel: device tap2d0b99fc-df left promiscuous mode Dec 6 05:19:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:16.827 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], 
port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-9b525551-ca72-4804-b27f-0c9808ee3709', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b525551-ca72-4804-b27f-0c9808ee3709', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cfaa880c4e9b463d924febc9999ed70c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ced779d2-888c-4e42-9339-20891d4d231d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2d0b99fc-df7e-4cc1-959f-29a393d1a20b) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:19:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:16.829 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 2d0b99fc-df7e-4cc1-959f-29a393d1a20b in datapath 9b525551-ca72-4804-b27f-0c9808ee3709 unbound from our chassis#033[00m Dec 6 05:19:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:16.831 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b525551-ca72-4804-b27f-0c9808ee3709, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:19:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:16.833 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[2d3be942-97d6-47c8-b9ec-7587ccfde656]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:16 localhost nova_compute[237281]: 2025-12-06 10:19:16.847 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:16 localhost systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Deactivated successfully. Dec 6 05:19:16 localhost systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Consumed 4.223s CPU time. Dec 6 05:19:16 localhost systemd-machined[68273]: Machine qemu-4-instance-0000000b terminated. Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.092 237285 INFO nova.virt.libvirt.driver [-] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Instance destroyed successfully.#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.092 237285 DEBUG nova.objects.instance [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lazy-loading 'resources' on Instance uuid a762ea39-4184-4fba-8fd6-e4390fdf75fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.108 237285 INFO nova.virt.libvirt.driver [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Deleting instance files /var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd_del#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.109 237285 INFO nova.virt.libvirt.driver [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Deletion of /var/lib/nova/instances/a762ea39-4184-4fba-8fd6-e4390fdf75fd_del complete#033[00m Dec 6 05:19:17 
localhost nova_compute[237281]: 2025-12-06 10:19:17.167 237285 INFO nova.compute.manager [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.168 237285 DEBUG oslo.service.loopingcall [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.168 237285 DEBUG nova.compute.manager [-] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.168 237285 DEBUG nova.network.neutron [-] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.572 237285 DEBUG nova.network.neutron [-] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.591 237285 DEBUG nova.network.neutron [-] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.605 237285 INFO nova.compute.manager [-] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Took 0.44 seconds to deallocate network for instance.#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.649 237285 DEBUG oslo_concurrency.lockutils [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.650 237285 DEBUG oslo_concurrency.lockutils [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.708 237285 DEBUG nova.compute.provider_tree [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.726 
237285 DEBUG nova.scheduler.client.report [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.782 237285 DEBUG oslo_concurrency.lockutils [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:17 localhost nova_compute[237281]: 2025-12-06 10:19:17.915 237285 DEBUG oslo_concurrency.lockutils [None req-aafd0eeb-b327-4831-b8c8-457f77948303 407472656c6040578cb2bdc3b4288953 9def7e7271f8404db90dc0d9d3faf8c3 - - default default] Lock "a762ea39-4184-4fba-8fd6-e4390fdf75fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:17 localhost neutron_sriov_agent[212548]: 2025-12-06 10:19:17.968 2 INFO neutron.agent.securitygroups_rpc [None req-e81421cd-019d-43a9-adb0-2955f7e15a11 cd70e7b6973049b19a5dcdd1afa595ed a4b0697ee0de4af4b6abb9510e493182 - - default default] Security group member updated ['55a68fce-3063-4eef-a6c8-e6db5322dc9c']#033[00m Dec 6 05:19:18 
localhost nova_compute[237281]: 2025-12-06 10:19:18.268 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:18 localhost nova_compute[237281]: 2025-12-06 10:19:18.907 237285 DEBUG nova.compute.manager [req-6b734b7c-9cc8-40e6-87da-2f455546d06c req-6b980c11-67de-4804-b0f2-be27ebc0a9a5 1255feb682e0467fbb328fef12859216 34a68ac8cde1445d84ddc1d3c3a0b249 - - default default] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Received event network-vif-deleted-01e51dda-8924-4663-86dd-9d9e200f064e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 6 05:19:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:19:19 localhost podman[249022]: 2025-12-06 10:19:19.548592847 +0000 UTC m=+0.083817537 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:19:19 localhost podman[249022]: 2025-12-06 10:19:19.555223272 +0000 UTC m=+0.090447942 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:19:19 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:19:19 localhost ovn_controller[131684]: 2025-12-06T10:19:19Z|00158|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:19:19 localhost nova_compute[237281]: 2025-12-06 10:19:19.789 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:20 localhost neutron_sriov_agent[212548]: 2025-12-06 10:19:20.316 2 INFO neutron.agent.securitygroups_rpc [None req-ebc2f5ff-f715-43db-949e-f161cc0fd06f cd70e7b6973049b19a5dcdd1afa595ed a4b0697ee0de4af4b6abb9510e493182 - - default default] Security group member updated ['55a68fce-3063-4eef-a6c8-e6db5322dc9c']#033[00m Dec 6 05:19:20 localhost nova_compute[237281]: 2025-12-06 10:19:20.498 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:21 localhost dnsmasq[248557]: exiting on receipt of SIGTERM Dec 6 05:19:21 localhost podman[249060]: 2025-12-06 10:19:21.169982408 +0000 UTC m=+0.068972199 container kill 4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b525551-ca72-4804-b27f-0c9808ee3709, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes 
Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:19:21 localhost systemd[1]: libpod-4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a.scope: Deactivated successfully. Dec 6 05:19:21 localhost podman[249075]: 2025-12-06 10:19:21.245053564 +0000 UTC m=+0.058828806 container died 4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b525551-ca72-4804-b27f-0c9808ee3709, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 05:19:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a-userdata-shm.mount: Deactivated successfully. Dec 6 05:19:21 localhost podman[249075]: 2025-12-06 10:19:21.27927627 +0000 UTC m=+0.093051481 container cleanup 4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b525551-ca72-4804-b27f-0c9808ee3709, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:19:21 localhost systemd[1]: libpod-conmon-4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a.scope: Deactivated successfully. 
Dec 6 05:19:21 localhost podman[249077]: 2025-12-06 10:19:21.328492078 +0000 UTC m=+0.136905014 container remove 4407916d94060b2130e0678e7ad2a92b83104c2d770a1caafa625ed453d3c10a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b525551-ca72-4804-b27f-0c9808ee3709, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 6 05:19:22 localhost systemd[1]: var-lib-containers-storage-overlay-4a4b9ec70a121a668f5379c3620a940a44c88c9e2d75299dc6ce16bedf222d64-merged.mount: Deactivated successfully. Dec 6 05:19:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:22.460 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:19:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:22.461 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:19:22 localhost nova_compute[237281]: 2025-12-06 10:19:22.504 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:22 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:22.521 219384 INFO neutron.agent.dhcp.agent [None req-4114f14a-bc86-4409-96ef-40acf4ae13bf - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:19:22 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:22.522 219384 INFO neutron.agent.dhcp.agent [None req-4114f14a-bc86-4409-96ef-40acf4ae13bf - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:19:22 localhost systemd[1]: run-netns-qdhcp\x2d9b525551\x2dca72\x2d4804\x2db27f\x2d0c9808ee3709.mount: Deactivated successfully. Dec 6 05:19:22 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:22.917 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:19:23 localhost nova_compute[237281]: 2025-12-06 10:19:23.270 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:23 localhost podman[197801]: time="2025-12-06T10:19:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:19:23 localhost podman[197801]: @ - - [06/Dec/2025:10:19:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:19:23 localhost podman[197801]: @ - - [06/Dec/2025:10:19:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15966 "" "Go-http-client/1.1" Dec 6 05:19:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7930 DF PROTO=TCP SPT=51758 DPT=9102 SEQ=3328776850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDAA3EA0000000001030307) Dec 6 
05:19:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7931 DF PROTO=TCP SPT=51758 DPT=9102 SEQ=3328776850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDAA8070000000001030307) Dec 6 05:19:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:25.464 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:19:25 localhost nova_compute[237281]: 2025-12-06 10:19:25.501 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:25 localhost ovn_controller[131684]: 2025-12-06T10:19:25Z|00159|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:19:25 localhost nova_compute[237281]: 2025-12-06 10:19:25.836 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26313 DF PROTO=TCP SPT=41586 DPT=9102 SEQ=2662004881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDAAB870000000001030307) Dec 6 05:19:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7932 DF PROTO=TCP SPT=51758 DPT=9102 SEQ=3328776850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1DDAB0070000000001030307) Dec 6 05:19:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62304 DF PROTO=TCP SPT=55600 DPT=9102 SEQ=958516846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDAB3870000000001030307) Dec 6 05:19:28 localhost neutron_sriov_agent[212548]: 2025-12-06 10:19:28.117 2 INFO neutron.agent.securitygroups_rpc [req-783b4daf-26d2-4823-b373-ff82c7eab0bb req-2359cd05-f101-4320-8bb9-73cf1d97a7fe de34ca65371d4e6a903edd70cb8e9c20 bebfb6087b6b4b7aa357cc1b142247ff - - default default] Security group member updated ['78ae70f9-ce5b-485f-8f5c-b606f7261a23']#033[00m Dec 6 05:19:28 localhost nova_compute[237281]: 2025-12-06 10:19:28.271 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:19:28 localhost podman[249106]: 2025-12-06 10:19:28.54755882 +0000 UTC m=+0.083607680 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:19:28 localhost podman[249106]: 2025-12-06 10:19:28.585982556 +0000 UTC m=+0.122031406 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller) Dec 6 05:19:28 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:19:30 localhost nova_compute[237281]: 2025-12-06 10:19:30.504 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7933 DF PROTO=TCP SPT=51758 DPT=9102 SEQ=3328776850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDABFC80000000001030307) Dec 6 05:19:32 localhost nova_compute[237281]: 2025-12-06 10:19:32.091 237285 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 6 05:19:32 localhost nova_compute[237281]: 2025-12-06 10:19:32.092 237285 INFO nova.compute.manager [-] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] VM Stopped (Lifecycle Event)#033[00m Dec 6 05:19:32 localhost nova_compute[237281]: 2025-12-06 10:19:32.156 237285 DEBUG nova.compute.manager [None req-f960d600-b7f6-4c81-9a75-0bbb7d9d0644 - - - - - -] [instance: a762ea39-4184-4fba-8fd6-e4390fdf75fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 6 05:19:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:19:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:19:33 localhost systemd[1]: tmp-crun.1mdvjZ.mount: Deactivated successfully. 
Dec 6 05:19:33 localhost podman[249132]: 2025-12-06 10:19:33.255530985 +0000 UTC m=+0.090643988 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:19:33 localhost podman[249132]: 2025-12-06 10:19:33.266195154 +0000 UTC m=+0.101308197 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:19:33 localhost nova_compute[237281]: 2025-12-06 10:19:33.272 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:33 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:19:33 localhost podman[249133]: 2025-12-06 10:19:33.362590387 +0000 UTC m=+0.193575583 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible) Dec 6 05:19:33 localhost podman[249133]: 2025-12-06 10:19:33.376227238 +0000 UTC m=+0.207212504 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 6 05:19:33 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:19:34 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:34.053 219384 INFO neutron.agent.linux.ip_lib [None req-3e29d028-fe77-448f-b3ce-09a04b479737 - - - - - -] Device tap3e85bb1e-dd cannot be used as it has no MAC address#033[00m Dec 6 05:19:34 localhost nova_compute[237281]: 2025-12-06 10:19:34.075 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:34 localhost kernel: device tap3e85bb1e-dd entered promiscuous mode Dec 6 05:19:34 localhost ovn_controller[131684]: 2025-12-06T10:19:34Z|00160|binding|INFO|Claiming lport 3e85bb1e-dd54-4070-86e2-509b4fb8404b for this chassis. Dec 6 05:19:34 localhost nova_compute[237281]: 2025-12-06 10:19:34.083 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:34 localhost ovn_controller[131684]: 2025-12-06T10:19:34Z|00161|binding|INFO|3e85bb1e-dd54-4070-86e2-509b4fb8404b: Claiming unknown Dec 6 05:19:34 localhost NetworkManager[5965]: [1765016374.0866] manager: (tap3e85bb1e-dd): new Generic device (/org/freedesktop/NetworkManager/Devices/30) Dec 6 05:19:34 localhost systemd-udevd[249185]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:19:34 localhost nova_compute[237281]: 2025-12-06 10:19:34.100 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:34 localhost neutron_sriov_agent[212548]: 2025-12-06 10:19:34.111 2 INFO neutron.agent.securitygroups_rpc [None req-e05bf322-ee7b-4c8c-b5af-740caae2668f a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:19:34 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:34.111 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3e85bb1e-dd54-4070-86e2-509b4fb8404b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:19:34 localhost 
ovn_metadata_agent[137254]: 2025-12-06 10:19:34.113 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 3e85bb1e-dd54-4070-86e2-509b4fb8404b in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:19:34 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:34.114 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:19:34 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:34.115 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[6d20c65e-b2bf-4ee2-ba01-3ad28a18d5aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:34 localhost journal[186952]: ethtool ioctl error on tap3e85bb1e-dd: No such device Dec 6 05:19:34 localhost ovn_controller[131684]: 2025-12-06T10:19:34Z|00162|binding|INFO|Setting lport 3e85bb1e-dd54-4070-86e2-509b4fb8404b ovn-installed in OVS Dec 6 05:19:34 localhost ovn_controller[131684]: 2025-12-06T10:19:34Z|00163|binding|INFO|Setting lport 3e85bb1e-dd54-4070-86e2-509b4fb8404b up in Southbound Dec 6 05:19:34 localhost nova_compute[237281]: 2025-12-06 10:19:34.119 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:34 localhost journal[186952]: ethtool ioctl error on tap3e85bb1e-dd: No such device Dec 6 05:19:34 localhost journal[186952]: ethtool ioctl error on tap3e85bb1e-dd: No such device Dec 6 05:19:34 localhost journal[186952]: ethtool ioctl error on tap3e85bb1e-dd: No such device Dec 6 05:19:34 localhost journal[186952]: ethtool ioctl error on tap3e85bb1e-dd: No such device Dec 6 05:19:34 localhost journal[186952]: ethtool ioctl error on tap3e85bb1e-dd: No such device Dec 6 05:19:34 localhost 
journal[186952]: ethtool ioctl error on tap3e85bb1e-dd: No such device Dec 6 05:19:34 localhost journal[186952]: ethtool ioctl error on tap3e85bb1e-dd: No such device Dec 6 05:19:34 localhost nova_compute[237281]: 2025-12-06 10:19:34.163 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:34 localhost nova_compute[237281]: 2025-12-06 10:19:34.195 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:34 localhost systemd[1]: tmp-crun.WcLItd.mount: Deactivated successfully. Dec 6 05:19:35 localhost podman[249255]: Dec 6 05:19:35 localhost podman[249255]: 2025-12-06 10:19:35.329190638 +0000 UTC m=+0.066181972 container create 2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 05:19:35 localhost systemd[1]: Started libpod-conmon-2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad.scope. Dec 6 05:19:35 localhost systemd[1]: tmp-crun.8oY40m.mount: Deactivated successfully. Dec 6 05:19:35 localhost systemd[1]: Started libcrun container. 
Dec 6 05:19:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b2ca8ba740b0c8dc61b64bd90fbb1e197c9eeb616911758248c484a5d70fe6a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:19:35 localhost podman[249255]: 2025-12-06 10:19:35.292406254 +0000 UTC m=+0.029397658 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:19:35 localhost podman[249255]: 2025-12-06 10:19:35.397642581 +0000 UTC m=+0.134633935 container init 2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:19:35 localhost podman[249255]: 2025-12-06 10:19:35.40445424 +0000 UTC m=+0.141445604 container start 2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 6 05:19:35 localhost dnsmasq[249273]: started, version 2.85 cachesize 150 Dec 6 05:19:35 localhost dnsmasq[249273]: DNS service limited to local subnets Dec 6 05:19:35 localhost dnsmasq[249273]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n 
IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:19:35 localhost dnsmasq[249273]: warning: no upstream servers configured Dec 6 05:19:35 localhost dnsmasq-dhcp[249273]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:19:35 localhost dnsmasq[249273]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:19:35 localhost dnsmasq-dhcp[249273]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:19:35 localhost dnsmasq-dhcp[249273]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:19:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:35.462 219384 INFO neutron.agent.dhcp.agent [None req-3e29d028-fe77-448f-b3ce-09a04b479737 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:33Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0f6fb4b2-5ba1-4f18-9ada-6840e0c9f9c8, ip_allocation=immediate, mac_address=fa:16:3e:4d:52:73, name=tempest-NetworksTestDHCPv6-831382719, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['abb0dcc1-eb74-448a-8db9-da152c500078'], tags=[], 
tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:19:32Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1011, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:19:33Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:19:35 localhost nova_compute[237281]: 2025-12-06 10:19:35.542 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:35.594 219384 INFO neutron.agent.dhcp.agent [None req-d59a35ef-6150-4359-9ec7-78e36c63c735 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:19:35 localhost neutron_sriov_agent[212548]: 2025-12-06 10:19:35.693 2 INFO neutron.agent.securitygroups_rpc [None req-2d7c5033-2d8b-487a-a9bc-6da75bf3ca81 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:19:35 localhost dnsmasq[249273]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:19:35 localhost dnsmasq-dhcp[249273]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:19:35 localhost dnsmasq-dhcp[249273]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:19:35 localhost podman[249292]: 2025-12-06 10:19:35.702779864 +0000 UTC m=+0.054584665 container kill 2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:19:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:35.913 219384 INFO neutron.agent.dhcp.agent [None req-e4d2b909-d57d-41b5-b89f-3efcbf1ea87e - - - - - -] DHCP configuration for ports {'0f6fb4b2-5ba1-4f18-9ada-6840e0c9f9c8'} is completed#033[00m Dec 6 05:19:36 localhost dnsmasq[249273]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:19:36 localhost dnsmasq-dhcp[249273]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:19:36 localhost podman[249328]: 2025-12-06 10:19:36.038664016 +0000 UTC m=+0.062152829 container kill 2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:19:36 localhost dnsmasq-dhcp[249273]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:19:36 localhost ovn_controller[131684]: 2025-12-06T10:19:36Z|00164|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:19:36 localhost nova_compute[237281]: 2025-12-06 10:19:36.505 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:37 localhost dnsmasq[249273]: exiting on receipt of SIGTERM Dec 6 05:19:37 localhost systemd[1]: libpod-2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad.scope: Deactivated successfully. Dec 6 05:19:37 localhost podman[249367]: 2025-12-06 10:19:37.521915885 +0000 UTC m=+0.063123719 container kill 2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Dec 6 05:19:37 localhost podman[249379]: 2025-12-06 10:19:37.592947406 +0000 UTC m=+0.058790775 container died 2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:19:37 localhost systemd[1]: tmp-crun.djb0dE.mount: Deactivated successfully. 
Dec 6 05:19:37 localhost podman[249379]: 2025-12-06 10:19:37.626432979 +0000 UTC m=+0.092276248 container cleanup 2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:19:37 localhost systemd[1]: libpod-conmon-2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad.scope: Deactivated successfully. Dec 6 05:19:37 localhost podman[249381]: 2025-12-06 10:19:37.679365512 +0000 UTC m=+0.136282416 container remove 2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 6 05:19:37 localhost ovn_controller[131684]: 2025-12-06T10:19:37Z|00165|binding|INFO|Releasing lport 3e85bb1e-dd54-4070-86e2-509b4fb8404b from this chassis (sb_readonly=0) Dec 6 05:19:37 localhost kernel: device tap3e85bb1e-dd left promiscuous mode Dec 6 05:19:37 localhost ovn_controller[131684]: 2025-12-06T10:19:37Z|00166|binding|INFO|Setting lport 3e85bb1e-dd54-4070-86e2-509b4fb8404b down in Southbound Dec 6 05:19:37 localhost nova_compute[237281]: 2025-12-06 10:19:37.733 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:37 localhost nova_compute[237281]: 2025-12-06 10:19:37.749 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:19:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:19:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:37.761 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3e85bb1e-dd54-4070-86e2-509b4fb8404b) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:19:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:37.762 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 3e85bb1e-dd54-4070-86e2-509b4fb8404b in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:19:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:37.763 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:19:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:37.763 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[44c7cc31-86bf-49ef-83b4-0305451ddd59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:37 localhost podman[249410]: 2025-12-06 10:19:37.830292408 +0000 UTC m=+0.071686243 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent) Dec 6 05:19:37 localhost podman[249411]: 2025-12-06 10:19:37.850703278 +0000 UTC m=+0.085608742 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 
'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:19:37 localhost podman[249411]: 2025-12-06 10:19:37.864573506 +0000 UTC m=+0.099478980 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 6 05:19:37 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:19:37 localhost podman[249410]: 2025-12-06 10:19:37.915374503 +0000 UTC m=+0.156768367 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:19:37 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:19:38 localhost nova_compute[237281]: 2025-12-06 10:19:38.274 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:38 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:38.297 219384 INFO neutron.agent.dhcp.agent [None req-ed65e753-600a-4f02-97b5-a8f858400da6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:19:38 localhost systemd[1]: var-lib-containers-storage-overlay-8b2ca8ba740b0c8dc61b64bd90fbb1e197c9eeb616911758248c484a5d70fe6a-merged.mount: Deactivated successfully. Dec 6 05:19:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2139beda6c5b2a710184aa38f317b59cf8c14b2b65095b9603ca984ebe8ea3ad-userdata-shm.mount: Deactivated successfully. Dec 6 05:19:38 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. 
Dec 6 05:19:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7934 DF PROTO=TCP SPT=51758 DPT=9102 SEQ=3328776850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDADF870000000001030307) Dec 6 05:19:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:40.100 219384 INFO neutron.agent.linux.ip_lib [None req-9683b59c-f0bd-4e46-b1a7-9c4c361b3eac - - - - - -] Device tap2e65643c-02 cannot be used as it has no MAC address#033[00m Dec 6 05:19:40 localhost nova_compute[237281]: 2025-12-06 10:19:40.125 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:40 localhost kernel: device tap2e65643c-02 entered promiscuous mode Dec 6 05:19:40 localhost NetworkManager[5965]: [1765016380.1323] manager: (tap2e65643c-02): new Generic device (/org/freedesktop/NetworkManager/Devices/31) Dec 6 05:19:40 localhost ovn_controller[131684]: 2025-12-06T10:19:40Z|00167|binding|INFO|Claiming lport 2e65643c-02bc-4dae-b54b-86831a0c139c for this chassis. Dec 6 05:19:40 localhost ovn_controller[131684]: 2025-12-06T10:19:40Z|00168|binding|INFO|2e65643c-02bc-4dae-b54b-86831a0c139c: Claiming unknown Dec 6 05:19:40 localhost nova_compute[237281]: 2025-12-06 10:19:40.133 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:40 localhost systemd-udevd[249458]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:19:40 localhost ovn_controller[131684]: 2025-12-06T10:19:40Z|00169|binding|INFO|Setting lport 2e65643c-02bc-4dae-b54b-86831a0c139c ovn-installed in OVS Dec 6 05:19:40 localhost nova_compute[237281]: 2025-12-06 10:19:40.141 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:40 localhost nova_compute[237281]: 2025-12-06 10:19:40.144 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:40 localhost ovn_controller[131684]: 2025-12-06T10:19:40Z|00170|binding|INFO|Setting lport 2e65643c-02bc-4dae-b54b-86831a0c139c up in Southbound Dec 6 05:19:40 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:40.149 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], 
requested_chassis=[], logical_port=2e65643c-02bc-4dae-b54b-86831a0c139c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:19:40 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:40.151 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 2e65643c-02bc-4dae-b54b-86831a0c139c in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:19:40 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:40.153 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:19:40 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:40.157 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[8708aef3-831b-40b4-965a-5d5b5e2a8b13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:40 localhost journal[186952]: ethtool ioctl error on tap2e65643c-02: No such device Dec 6 05:19:40 localhost nova_compute[237281]: 2025-12-06 10:19:40.165 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:40 localhost journal[186952]: ethtool ioctl error on tap2e65643c-02: No such device Dec 6 05:19:40 localhost journal[186952]: ethtool ioctl error on tap2e65643c-02: No such device Dec 6 05:19:40 localhost journal[186952]: ethtool ioctl error on tap2e65643c-02: No such device Dec 6 05:19:40 localhost journal[186952]: ethtool ioctl error on tap2e65643c-02: No such device Dec 6 05:19:40 localhost journal[186952]: ethtool ioctl error on tap2e65643c-02: No such device Dec 6 05:19:40 localhost journal[186952]: ethtool ioctl error on tap2e65643c-02: No such device Dec 6 05:19:40 localhost 
journal[186952]: ethtool ioctl error on tap2e65643c-02: No such device Dec 6 05:19:40 localhost nova_compute[237281]: 2025-12-06 10:19:40.206 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:40 localhost nova_compute[237281]: 2025-12-06 10:19:40.236 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:40 localhost neutron_sriov_agent[212548]: 2025-12-06 10:19:40.390 2 INFO neutron.agent.securitygroups_rpc [None req-69bb79e1-1f20-4e95-a34a-ed613e0dd91d a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:19:40 localhost nova_compute[237281]: 2025-12-06 10:19:40.545 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:40 localhost podman[249529]: Dec 6 05:19:41 localhost podman[249529]: 2025-12-06 10:19:41.009616182 +0000 UTC m=+0.094987781 container create 01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 6 05:19:41 localhost systemd[1]: Started libpod-conmon-01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e.scope. 
Dec 6 05:19:41 localhost podman[249529]: 2025-12-06 10:19:40.967019088 +0000 UTC m=+0.052390747 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:19:41 localhost systemd[1]: Started libcrun container. Dec 6 05:19:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b0d95332b1ee5b97b2a50bb831218a0a47bb9661314f5be56a4463bd625e2d5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:19:41 localhost podman[249529]: 2025-12-06 10:19:41.084293716 +0000 UTC m=+0.169665285 container init 01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 6 05:19:41 localhost podman[249529]: 2025-12-06 10:19:41.096907725 +0000 UTC m=+0.182279354 container start 01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:19:41 localhost dnsmasq[249547]: started, version 2.85 cachesize 150 Dec 6 05:19:41 localhost dnsmasq[249547]: DNS service limited to local subnets Dec 6 05:19:41 localhost 
dnsmasq[249547]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:19:41 localhost dnsmasq[249547]: warning: no upstream servers configured Dec 6 05:19:41 localhost dnsmasq[249547]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:19:41 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:41.160 219384 INFO neutron.agent.dhcp.agent [None req-9683b59c-f0bd-4e46-b1a7-9c4c361b3eac - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:40Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=40d86dc6-8a5e-4122-a6a1-2db78e809b28, ip_allocation=immediate, mac_address=fa:16:3e:29:da:7d, name=tempest-NetworksTestDHCPv6-879936264, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=4, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['fb2d54f2-de22-4a2c-9bf1-37d2f9ffd0b9'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:19:38Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, 
revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1019, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:19:40Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:19:41 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:41.311 219384 INFO neutron.agent.dhcp.agent [None req-11d3dc99-2290-4e95-b11c-2de08e941901 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:19:41 localhost dnsmasq[249547]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:19:41 localhost podman[249564]: 2025-12-06 10:19:41.357233186 +0000 UTC m=+0.063845240 container kill 01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:19:41 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:41.619 219384 INFO neutron.agent.dhcp.agent [None req-543c51cf-1c96-45aa-b439-22401354839f - - - - - -] DHCP configuration for ports {'40d86dc6-8a5e-4122-a6a1-2db78e809b28'} is completed#033[00m Dec 6 05:19:42 localhost neutron_sriov_agent[212548]: 2025-12-06 10:19:42.482 2 INFO neutron.agent.securitygroups_rpc [None req-20ef5a6d-c0af-4d20-a602-2ee5862bbbee a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:19:42 localhost dnsmasq[249547]: read 
/var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:19:42 localhost podman[249603]: 2025-12-06 10:19:42.686721823 +0000 UTC m=+0.059490827 container kill 01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2) Dec 6 05:19:43 localhost nova_compute[237281]: 2025-12-06 10:19:43.277 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:43 localhost nova_compute[237281]: 2025-12-06 10:19:43.834 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:19:44 localhost dnsmasq[249547]: exiting on receipt of SIGTERM Dec 6 05:19:44 localhost podman[249642]: 2025-12-06 10:19:44.185107958 +0000 UTC m=+0.058489524 container kill 01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:19:44 localhost systemd[1]: libpod-01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e.scope: Deactivated successfully. Dec 6 05:19:44 localhost podman[249656]: 2025-12-06 10:19:44.25453624 +0000 UTC m=+0.056391200 container died 01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 6 05:19:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e-userdata-shm.mount: Deactivated successfully. Dec 6 05:19:44 localhost podman[249656]: 2025-12-06 10:19:44.289907982 +0000 UTC m=+0.091762912 container cleanup 01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:19:44 localhost systemd[1]: libpod-conmon-01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e.scope: Deactivated successfully. 
Dec 6 05:19:44 localhost podman[249658]: 2025-12-06 10:19:44.331806615 +0000 UTC m=+0.126645508 container remove 01f6e7c65551f44e353c0de84fcec4edb0cb5d7db8ce7bb3403cfccab2a7022e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:19:44 localhost ovn_controller[131684]: 2025-12-06T10:19:44Z|00171|binding|INFO|Releasing lport 2e65643c-02bc-4dae-b54b-86831a0c139c from this chassis (sb_readonly=0) Dec 6 05:19:44 localhost nova_compute[237281]: 2025-12-06 10:19:44.345 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:44 localhost ovn_controller[131684]: 2025-12-06T10:19:44Z|00172|binding|INFO|Setting lport 2e65643c-02bc-4dae-b54b-86831a0c139c down in Southbound Dec 6 05:19:44 localhost kernel: device tap2e65643c-02 left promiscuous mode Dec 6 05:19:44 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:44.357 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2e65643c-02bc-4dae-b54b-86831a0c139c) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:19:44 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:44.360 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 2e65643c-02bc-4dae-b54b-86831a0c139c in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:19:44 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:44.361 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:19:44 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:44.362 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[fe6e5e45-9656-4746-abb2-5d782dc8e80f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:44 localhost nova_compute[237281]: 2025-12-06 10:19:44.365 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:45 localhost systemd[1]: 
var-lib-containers-storage-overlay-6b0d95332b1ee5b97b2a50bb831218a0a47bb9661314f5be56a4463bd625e2d5-merged.mount: Deactivated successfully. Dec 6 05:19:45 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. Dec 6 05:19:45 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:45.388 219384 INFO neutron.agent.dhcp.agent [None req-9a149413-4df8-4e8f-b6a6-a6b696b6396e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:19:45 localhost nova_compute[237281]: 2025-12-06 10:19:45.547 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:45 localhost nova_compute[237281]: 2025-12-06 10:19:45.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:19:45 localhost snmpd[56894]: empty variable list in _query Dec 6 05:19:45 localhost snmpd[56894]: empty variable list in _query Dec 6 05:19:46 localhost openstack_network_exporter[199751]: ERROR 10:19:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:19:46 localhost openstack_network_exporter[199751]: ERROR 10:19:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:19:46 localhost openstack_network_exporter[199751]: ERROR 10:19:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:19:46 localhost openstack_network_exporter[199751]: ERROR 10:19:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:19:46 localhost openstack_network_exporter[199751]: Dec 6 05:19:46 localhost openstack_network_exporter[199751]: 
ERROR 10:19:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:19:46 localhost openstack_network_exporter[199751]: Dec 6 05:19:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:19:46 localhost podman[249688]: 2025-12-06 10:19:46.559043446 +0000 UTC m=+0.080435743 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, distribution-scope=public) Dec 6 05:19:46 localhost podman[249688]: 2025-12-06 10:19:46.598156262 +0000 UTC m=+0.119548529 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6) Dec 6 05:19:46 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:19:46 localhost nova_compute[237281]: 2025-12-06 10:19:46.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:19:47 localhost neutron_sriov_agent[212548]: 2025-12-06 10:19:47.521 2 INFO neutron.agent.securitygroups_rpc [None req-8499b0df-42fe-4a91-8b04-16e2688bf2ce a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:19:47 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:47.681 219384 INFO neutron.agent.linux.ip_lib [None req-bfa20659-c2f4-48c2-9a0a-be07c63b3b22 - - - - - -] Device tape22f9593-ab cannot be used as it has no MAC address#033[00m Dec 6 05:19:47 localhost nova_compute[237281]: 2025-12-06 10:19:47.703 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:47 localhost kernel: device tape22f9593-ab entered promiscuous mode Dec 6 05:19:47 localhost NetworkManager[5965]: [1765016387.7130] manager: (tape22f9593-ab): new Generic device (/org/freedesktop/NetworkManager/Devices/32) Dec 6 05:19:47 localhost ovn_controller[131684]: 2025-12-06T10:19:47Z|00173|binding|INFO|Claiming lport e22f9593-ab8e-4691-b7fa-014aa44657b8 for this chassis. Dec 6 05:19:47 localhost ovn_controller[131684]: 2025-12-06T10:19:47Z|00174|binding|INFO|e22f9593-ab8e-4691-b7fa-014aa44657b8: Claiming unknown Dec 6 05:19:47 localhost nova_compute[237281]: 2025-12-06 10:19:47.714 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:47 localhost systemd-udevd[249718]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:19:47 localhost ovn_controller[131684]: 2025-12-06T10:19:47Z|00175|binding|INFO|Setting lport e22f9593-ab8e-4691-b7fa-014aa44657b8 ovn-installed in OVS Dec 6 05:19:47 localhost nova_compute[237281]: 2025-12-06 10:19:47.723 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:47 localhost nova_compute[237281]: 2025-12-06 10:19:47.727 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:47 localhost nova_compute[237281]: 2025-12-06 10:19:47.762 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:47 localhost nova_compute[237281]: 2025-12-06 10:19:47.795 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:47 localhost nova_compute[237281]: 2025-12-06 10:19:47.827 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:47 localhost nova_compute[237281]: 2025-12-06 10:19:47.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:19:47 localhost ovn_controller[131684]: 2025-12-06T10:19:47Z|00176|binding|INFO|Setting lport e22f9593-ab8e-4691-b7fa-014aa44657b8 up in Southbound Dec 6 05:19:47 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:47.977 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], 
port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e22f9593-ab8e-4691-b7fa-014aa44657b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:19:47 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:47.979 137259 INFO neutron.agent.ovn.metadata.agent [-] Port e22f9593-ab8e-4691-b7fa-014aa44657b8 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:19:47 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:47.981 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:19:47 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:47.982 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4aea8c-37e0-4927-80c3-c0cb4f71823e]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:48 localhost nova_compute[237281]: 2025-12-06 10:19:48.280 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:48 localhost podman[249773]: Dec 6 05:19:48 localhost podman[249773]: 2025-12-06 10:19:48.687335546 +0000 UTC m=+0.088753370 container create 08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:19:48 localhost systemd[1]: Started libpod-conmon-08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b.scope. Dec 6 05:19:48 localhost podman[249773]: 2025-12-06 10:19:48.643518393 +0000 UTC m=+0.044936247 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:19:48 localhost systemd[1]: Started libcrun container. 
Dec 6 05:19:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92e840dcec95b7706d50d2df303631933dc1da21d06730bb13ba4a9f0e9e629f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:19:48 localhost podman[249773]: 2025-12-06 10:19:48.766859459 +0000 UTC m=+0.168277373 container init 08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:19:48 localhost podman[249773]: 2025-12-06 10:19:48.775945629 +0000 UTC m=+0.177363463 container start 08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 6 05:19:48 localhost dnsmasq[249791]: started, version 2.85 cachesize 150 Dec 6 05:19:48 localhost dnsmasq[249791]: DNS service limited to local subnets Dec 6 05:19:48 localhost dnsmasq[249791]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:19:48 localhost dnsmasq[249791]: warning: no upstream servers configured Dec 
6 05:19:48 localhost dnsmasq-dhcp[249791]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:19:48 localhost dnsmasq[249791]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:19:48 localhost dnsmasq-dhcp[249791]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:19:48 localhost dnsmasq-dhcp[249791]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:19:48 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:48.833 219384 INFO neutron.agent.dhcp.agent [None req-bfa20659-c2f4-48c2-9a0a-be07c63b3b22 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:47Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a7287883-890b-41c1-9bcb-17fa73f19345, ip_allocation=immediate, mac_address=fa:16:3e:a0:13:8f, name=tempest-NetworksTestDHCPv6-2019140221, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=6, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['eca91037-a58e-49a6-98c0-bc69f07e7bea'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:19:45Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1052, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:19:47Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:19:48 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:48.990 219384 INFO neutron.agent.dhcp.agent [None req-8f87aae6-2f99-4383-a9dc-bf20b8750cbf - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:19:49 localhost podman[249811]: 2025-12-06 10:19:49.033382241 +0000 UTC m=+0.064099349 container kill 08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 05:19:49 localhost dnsmasq[249791]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:19:49 localhost dnsmasq-dhcp[249791]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:19:49 localhost dnsmasq-dhcp[249791]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:19:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:19:49 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:49.765 219384 INFO neutron.agent.dhcp.agent [None req-bdb2ca9d-697b-4a71-8a15-73ba5d0fa302 - - - - - -] DHCP configuration for ports {'a7287883-890b-41c1-9bcb-17fa73f19345'} is completed#033[00m Dec 6 05:19:49 localhost podman[249832]: 2025-12-06 10:19:49.799756004 +0000 UTC m=+0.082245789 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:19:49 localhost podman[249832]: 2025-12-06 10:19:49.832354939 +0000 UTC m=+0.114844684 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:19:49 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:19:49 localhost nova_compute[237281]: 2025-12-06 10:19:49.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:19:50 localhost neutron_sriov_agent[212548]: 2025-12-06 10:19:50.092 2 INFO neutron.agent.securitygroups_rpc [None req-c00a1113-c1bd-401d-91bd-afa7db508da3 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:19:50 localhost dnsmasq[249791]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:19:50 localhost podman[249872]: 2025-12-06 10:19:50.300162972 +0000 UTC m=+0.061180729 container kill 08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:19:50 localhost dnsmasq-dhcp[249791]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:19:50 localhost dnsmasq-dhcp[249791]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:19:50 localhost nova_compute[237281]: 2025-12-06 10:19:50.577 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:50 localhost nova_compute[237281]: 2025-12-06 10:19:50.885 237285 DEBUG oslo_service.periodic_task 
[None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:19:50 localhost nova_compute[237281]: 2025-12-06 10:19:50.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:19:52 localhost dnsmasq[249791]: exiting on receipt of SIGTERM Dec 6 05:19:52 localhost podman[249911]: 2025-12-06 10:19:52.539521697 +0000 UTC m=+0.058136384 container kill 08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:19:52 localhost systemd[1]: libpod-08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b.scope: Deactivated successfully. 
Dec 6 05:19:52 localhost podman[249925]: 2025-12-06 10:19:52.607989289 +0000 UTC m=+0.055802632 container died 08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:19:52 localhost podman[249925]: 2025-12-06 10:19:52.64399006 +0000 UTC m=+0.091803353 container cleanup 08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:19:52 localhost systemd[1]: libpod-conmon-08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b.scope: Deactivated successfully. 
Dec 6 05:19:52 localhost podman[249927]: 2025-12-06 10:19:52.689688559 +0000 UTC m=+0.129969010 container remove 08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:19:52 localhost ovn_controller[131684]: 2025-12-06T10:19:52Z|00177|binding|INFO|Releasing lport e22f9593-ab8e-4691-b7fa-014aa44657b8 from this chassis (sb_readonly=0) Dec 6 05:19:52 localhost kernel: device tape22f9593-ab left promiscuous mode Dec 6 05:19:52 localhost ovn_controller[131684]: 2025-12-06T10:19:52Z|00178|binding|INFO|Setting lport e22f9593-ab8e-4691-b7fa-014aa44657b8 down in Southbound Dec 6 05:19:52 localhost nova_compute[237281]: 2025-12-06 10:19:52.703 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:52.717 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e22f9593-ab8e-4691-b7fa-014aa44657b8) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:19:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:52.719 137259 INFO neutron.agent.ovn.metadata.agent [-] Port e22f9593-ab8e-4691-b7fa-014aa44657b8 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:19:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:52.719 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:19:52 localhost ovn_metadata_agent[137254]: 2025-12-06 10:19:52.720 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[a9fa4fb8-e9b3-4d3f-9a16-57b11593720d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:19:52 localhost nova_compute[237281]: 2025-12-06 10:19:52.722 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:52 localhost nova_compute[237281]: 2025-12-06 10:19:52.885 237285 DEBUG oslo_service.periodic_task 
[None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.191 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.192 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.193 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.194 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.282 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.294 
237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:19:53 localhost podman[197801]: time="2025-12-06T10:19:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:19:53 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:19:53.309 219384 INFO neutron.agent.dhcp.agent [None req-08385afb-d94e-4ba4-b6b8-55de33e86518 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:19:53 localhost podman[197801]: @ - - [06/Dec/2025:10:19:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:19:53 localhost podman[197801]: @ - - [06/Dec/2025:10:19:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15967 "" "Go-http-client/1.1" Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.373 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.374 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.450 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.452 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.509 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.511 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:19:53 localhost systemd[1]: var-lib-containers-storage-overlay-92e840dcec95b7706d50d2df303631933dc1da21d06730bb13ba4a9f0e9e629f-merged.mount: Deactivated successfully. Dec 6 05:19:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-08ef1f856bdd60b4a3c0aa41ddef63cf6188539c7573804f5bd7b478761a041b-userdata-shm.mount: Deactivated successfully. Dec 6 05:19:53 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.582 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.830 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.832 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12309MB free_disk=387.2665901184082GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.833 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:19:53 localhost nova_compute[237281]: 2025-12-06 10:19:53.833 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:19:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1756 DF PROTO=TCP SPT=49152 DPT=9102 SEQ=609209820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDB191A0000000001030307) Dec 6 05:19:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1757 DF PROTO=TCP SPT=49152 DPT=9102 SEQ=609209820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDB1D080000000001030307) Dec 6 05:19:55 localhost nova_compute[237281]: 2025-12-06 10:19:55.044 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance 
a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:19:55 localhost nova_compute[237281]: 2025-12-06 10:19:55.044 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:19:55 localhost nova_compute[237281]: 2025-12-06 10:19:55.045 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:19:55 localhost nova_compute[237281]: 2025-12-06 10:19:55.179 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:19:55 localhost nova_compute[237281]: 2025-12-06 10:19:55.197 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 
'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:19:55 localhost nova_compute[237281]: 2025-12-06 10:19:55.227 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:19:55 localhost nova_compute[237281]: 2025-12-06 10:19:55.228 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:19:55 localhost nova_compute[237281]: 2025-12-06 10:19:55.580 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7935 DF PROTO=TCP SPT=51758 DPT=9102 SEQ=3328776850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDB1F880000000001030307) Dec 6 05:19:56 localhost nova_compute[237281]: 2025-12-06 10:19:56.230 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:19:56 localhost nova_compute[237281]: 2025-12-06 10:19:56.230 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:19:56 localhost nova_compute[237281]: 2025-12-06 10:19:56.231 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:19:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1758 DF PROTO=TCP SPT=49152 DPT=9102 SEQ=609209820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDB25070000000001030307) Dec 6 05:19:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26314 DF PROTO=TCP SPT=41586 DPT=9102 SEQ=2662004881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDB29870000000001030307) Dec 6 05:19:58 localhost nova_compute[237281]: 2025-12-06 10:19:58.284 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:19:58 localhost nova_compute[237281]: 2025-12-06 10:19:58.811 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:19:58 localhost nova_compute[237281]: 2025-12-06 10:19:58.812 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:19:58 localhost nova_compute[237281]: 2025-12-06 10:19:58.812 237285 DEBUG nova.network.neutron [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:19:58 localhost nova_compute[237281]: 2025-12-06 10:19:58.813 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:19:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:19:59 localhost podman[249968]: 2025-12-06 10:19:59.550636653 +0000 UTC m=+0.083168047 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller) Dec 6 05:19:59 localhost podman[249968]: 2025-12-06 10:19:59.624939645 +0000 UTC m=+0.157471009 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible) Dec 6 05:19:59 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:20:00 localhost nova_compute[237281]: 2025-12-06 10:20:00.583 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1759 DF PROTO=TCP SPT=49152 DPT=9102 SEQ=609209820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDB34C70000000001030307) Dec 6 05:20:03 localhost neutron_sriov_agent[212548]: 2025-12-06 10:20:03.030 2 INFO neutron.agent.securitygroups_rpc [None req-e1140fd8-0d1d-41f0-89e6-19dc570426dd a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:20:03 localhost nova_compute[237281]: 2025-12-06 10:20:03.286 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:03 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:03.356 219384 INFO neutron.agent.linux.ip_lib [None req-f4d4ad91-98e0-4ab2-9870-03ad30c4f768 - - - - - -] Device tap7a74c0c8-b8 cannot be used as it has no MAC address#033[00m Dec 6 05:20:03 localhost nova_compute[237281]: 2025-12-06 10:20:03.380 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:03 localhost kernel: device tap7a74c0c8-b8 entered promiscuous mode Dec 6 05:20:03 localhost 
NetworkManager[5965]: [1765016403.3883] manager: (tap7a74c0c8-b8): new Generic device (/org/freedesktop/NetworkManager/Devices/33) Dec 6 05:20:03 localhost nova_compute[237281]: 2025-12-06 10:20:03.390 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:03 localhost ovn_controller[131684]: 2025-12-06T10:20:03Z|00179|binding|INFO|Claiming lport 7a74c0c8-b8a0-4350-9371-5112192162b4 for this chassis. Dec 6 05:20:03 localhost ovn_controller[131684]: 2025-12-06T10:20:03Z|00180|binding|INFO|7a74c0c8-b8a0-4350-9371-5112192162b4: Claiming unknown Dec 6 05:20:03 localhost systemd-udevd[250004]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:20:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:20:03 localhost ovn_controller[131684]: 2025-12-06T10:20:03Z|00181|binding|INFO|Setting lport 7a74c0c8-b8a0-4350-9371-5112192162b4 ovn-installed in OVS Dec 6 05:20:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:20:03 localhost nova_compute[237281]: 2025-12-06 10:20:03.411 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:03 localhost journal[186952]: ethtool ioctl error on tap7a74c0c8-b8: No such device Dec 6 05:20:03 localhost ovn_controller[131684]: 2025-12-06T10:20:03Z|00182|binding|INFO|Setting lport 7a74c0c8-b8a0-4350-9371-5112192162b4 up in Southbound Dec 6 05:20:03 localhost nova_compute[237281]: 2025-12-06 10:20:03.426 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:03 localhost journal[186952]: ethtool ioctl error on tap7a74c0c8-b8: No such device Dec 6 05:20:03 localhost journal[186952]: ethtool ioctl error on tap7a74c0c8-b8: No such device Dec 6 05:20:03 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:03.429 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], 
mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7a74c0c8-b8a0-4350-9371-5112192162b4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:03 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:03.431 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 7a74c0c8-b8a0-4350-9371-5112192162b4 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:20:03 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:03.432 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:03 localhost journal[186952]: ethtool ioctl error on tap7a74c0c8-b8: No such device Dec 6 05:20:03 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:03.435 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[caf146a0-4625-426b-ac0d-2be2981cae16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:20:03 localhost journal[186952]: ethtool ioctl error on tap7a74c0c8-b8: No such device Dec 6 05:20:03 localhost journal[186952]: ethtool ioctl error on tap7a74c0c8-b8: No such device Dec 6 05:20:03 localhost journal[186952]: ethtool ioctl error on tap7a74c0c8-b8: No such device Dec 6 05:20:03 localhost journal[186952]: ethtool ioctl error on tap7a74c0c8-b8: No such device Dec 6 05:20:03 localhost nova_compute[237281]: 2025-12-06 10:20:03.476 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:03 localhost nova_compute[237281]: 2025-12-06 10:20:03.515 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:03 localhost podman[250006]: 2025-12-06 10:20:03.520209466 +0000 UTC m=+0.105852656 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:20:03 localhost podman[250006]: 2025-12-06 10:20:03.53036789 +0000 UTC m=+0.116011090 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:20:03 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:20:03 localhost systemd[1]: tmp-crun.RoH2tC.mount: Deactivated successfully. Dec 6 05:20:03 localhost podman[250008]: 2025-12-06 10:20:03.636456493 +0000 UTC m=+0.217217683 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:20:03 localhost podman[250008]: 2025-12-06 10:20:03.67721412 +0000 UTC m=+0.257975390 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_id=multipathd) Dec 6 05:20:03 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:20:04 localhost podman[250117]: Dec 6 05:20:04 localhost podman[250117]: 2025-12-06 10:20:04.409174962 +0000 UTC m=+0.098022635 container create f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:20:04 localhost systemd[1]: Started libpod-conmon-f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9.scope. Dec 6 05:20:04 localhost podman[250117]: 2025-12-06 10:20:04.361545183 +0000 UTC m=+0.050392926 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:20:04 localhost systemd[1]: Started libcrun container. 
Dec 6 05:20:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1bf0a5fa31c5580491edeccfd544bb728919e916d3d98b2d41c22f234b8a1b2d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:20:04 localhost podman[250117]: 2025-12-06 10:20:04.48819976 +0000 UTC m=+0.177047443 container init f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:20:04 localhost podman[250117]: 2025-12-06 10:20:04.497333871 +0000 UTC m=+0.186181534 container start f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:20:04 localhost dnsmasq[250136]: started, version 2.85 cachesize 150 Dec 6 05:20:04 localhost dnsmasq[250136]: DNS service limited to local subnets Dec 6 05:20:04 localhost dnsmasq[250136]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:20:04 localhost dnsmasq[250136]: warning: no upstream servers configured Dec 
6 05:20:04 localhost dnsmasq-dhcp[250136]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:20:04 localhost dnsmasq[250136]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:04 localhost dnsmasq-dhcp[250136]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:04 localhost dnsmasq-dhcp[250136]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:04 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:04.575 219384 INFO neutron.agent.dhcp.agent [None req-f4d4ad91-98e0-4ab2-9870-03ad30c4f768 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:02Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8520f2e5-081b-47b8-af3e-be3bacf6bbd5, ip_allocation=immediate, mac_address=fa:16:3e:65:b1:7e, name=tempest-NetworksTestDHCPv6-1579947474, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=8, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['e51c22fb-4ef6-4f47-b640-9cc13fc068e9'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:00Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1060, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:02Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:20:04 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:04.769 219384 INFO neutron.agent.dhcp.agent [None req-88d60335-4263-4109-bc30-6f25dd61775c - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:20:04 localhost dnsmasq[250136]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:20:04 localhost dnsmasq-dhcp[250136]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:04 localhost podman[250155]: 2025-12-06 10:20:04.782798458 +0000 UTC m=+0.063663065 container kill f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:20:04 localhost dnsmasq-dhcp[250136]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:05 localhost nova_compute[237281]: 2025-12-06 10:20:05.626 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:06.704 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:20:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:06.705 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:20:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:06.705 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:20:07 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:07.035 219384 INFO neutron.agent.dhcp.agent [None req-d0bfe849-9c54-4f1a-b3bb-a7cc72ae12d2 - - - - - -] DHCP configuration for ports {'8520f2e5-081b-47b8-af3e-be3bacf6bbd5'} is completed#033[00m Dec 6 05:20:08 localhost nova_compute[237281]: 2025-12-06 10:20:08.288 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:08 localhost neutron_sriov_agent[212548]: 2025-12-06 10:20:08.329 2 INFO neutron.agent.securitygroups_rpc [None req-4c2ab69c-991b-4fc9-91aa-58c5597b2218 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:20:08 localhost nova_compute[237281]: 2025-12-06 10:20:08.456 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:08.458 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: 
SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:08.459 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:20:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:20:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:20:08 localhost systemd[1]: tmp-crun.3yHPEY.mount: Deactivated successfully. 
Dec 6 05:20:08 localhost podman[250178]: 2025-12-06 10:20:08.577008051 +0000 UTC m=+0.104926317 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:20:08 localhost podman[250178]: 2025-12-06 10:20:08.584294377 +0000 UTC m=+0.112212653 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:20:08 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:20:08 localhost dnsmasq[250136]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:08 localhost dnsmasq-dhcp[250136]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:08 localhost podman[250222]: 2025-12-06 10:20:08.739766933 +0000 UTC m=+0.063969804 container kill f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:20:08 localhost dnsmasq-dhcp[250136]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:08 localhost podman[250180]: 2025-12-06 10:20:08.730060294 +0000 UTC m=+0.254912166 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Dec 6 05:20:08 localhost podman[250180]: 2025-12-06 10:20:08.81454406 +0000 UTC m=+0.339395912 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute) Dec 6 05:20:08 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:20:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1760 DF PROTO=TCP SPT=49152 DPT=9102 SEQ=609209820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDB55870000000001030307) Dec 6 05:20:10 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:10.461 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:20:10 localhost nova_compute[237281]: 2025-12-06 10:20:10.656 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:11 localhost nova_compute[237281]: 2025-12-06 10:20:11.093 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:20:11 localhost nova_compute[237281]: 2025-12-06 10:20:11.139 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:20:11 localhost nova_compute[237281]: 2025-12-06 10:20:11.140 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:20:11 localhost nova_compute[237281]: 2025-12-06 10:20:11.140 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:20:11 localhost dnsmasq[250136]: exiting on receipt of SIGTERM Dec 6 05:20:11 localhost podman[250270]: 2025-12-06 10:20:11.460989315 +0000 UTC m=+0.059715143 container kill f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:20:11 localhost systemd[1]: libpod-f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9.scope: Deactivated successfully. Dec 6 05:20:11 localhost podman[250283]: 2025-12-06 10:20:11.52855063 +0000 UTC m=+0.056957088 container died f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:20:11 localhost systemd[1]: tmp-crun.nh11ce.mount: Deactivated successfully. Dec 6 05:20:11 localhost podman[250283]: 2025-12-06 10:20:11.612946203 +0000 UTC m=+0.141352621 container cleanup f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Dec 6 05:20:11 localhost systemd[1]: libpod-conmon-f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9.scope: Deactivated successfully. 
Dec 6 05:20:11 localhost podman[250290]: 2025-12-06 10:20:11.640571525 +0000 UTC m=+0.152149754 container remove f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 05:20:11 localhost ovn_controller[131684]: 2025-12-06T10:20:11Z|00183|binding|INFO|Releasing lport 7a74c0c8-b8a0-4350-9371-5112192162b4 from this chassis (sb_readonly=0) Dec 6 05:20:11 localhost ovn_controller[131684]: 2025-12-06T10:20:11Z|00184|binding|INFO|Setting lport 7a74c0c8-b8a0-4350-9371-5112192162b4 down in Southbound Dec 6 05:20:11 localhost nova_compute[237281]: 2025-12-06 10:20:11.654 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:11 localhost kernel: device tap7a74c0c8-b8 left promiscuous mode Dec 6 05:20:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:11.664 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7a74c0c8-b8a0-4350-9371-5112192162b4) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:11.666 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 7a74c0c8-b8a0-4350-9371-5112192162b4 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:20:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:11.668 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:11.669 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[b0040f06-d34b-4843-be20-cd83d2bb811d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:20:11 localhost nova_compute[237281]: 2025-12-06 10:20:11.718 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:12 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:12.031 219384 INFO neutron.agent.dhcp.agent 
[None req-b9a17910-651d-4ab8-999a-d872cd3fdc7f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:20:12 localhost systemd[1]: var-lib-containers-storage-overlay-1bf0a5fa31c5580491edeccfd544bb728919e916d3d98b2d41c22f234b8a1b2d-merged.mount: Deactivated successfully. Dec 6 05:20:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f26e52549f1f59c6f4cc9bc610b222f8b7d1f83fdc8ce4dbcfeec835ff4a09a9-userdata-shm.mount: Deactivated successfully. Dec 6 05:20:12 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. Dec 6 05:20:13 localhost nova_compute[237281]: 2025-12-06 10:20:13.313 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:13 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:13.937 219384 INFO neutron.agent.linux.ip_lib [None req-d8ab895f-4143-4289-b62a-7af6fdbbac6a - - - - - -] Device tap2b375181-a0 cannot be used as it has no MAC address#033[00m Dec 6 05:20:13 localhost nova_compute[237281]: 2025-12-06 10:20:13.962 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:13 localhost kernel: device tap2b375181-a0 entered promiscuous mode Dec 6 05:20:13 localhost nova_compute[237281]: 2025-12-06 10:20:13.969 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:13 localhost ovn_controller[131684]: 2025-12-06T10:20:13Z|00185|binding|INFO|Claiming lport 2b375181-a08c-4ba6-a02f-ec8d20419ca0 for this chassis. 
Dec 6 05:20:13 localhost ovn_controller[131684]: 2025-12-06T10:20:13Z|00186|binding|INFO|2b375181-a08c-4ba6-a02f-ec8d20419ca0: Claiming unknown Dec 6 05:20:13 localhost NetworkManager[5965]: [1765016413.9701] manager: (tap2b375181-a0): new Generic device (/org/freedesktop/NetworkManager/Devices/34) Dec 6 05:20:13 localhost systemd-udevd[250323]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:20:13 localhost ovn_controller[131684]: 2025-12-06T10:20:13Z|00187|binding|INFO|Setting lport 2b375181-a08c-4ba6-a02f-ec8d20419ca0 ovn-installed in OVS Dec 6 05:20:13 localhost ovn_controller[131684]: 2025-12-06T10:20:13Z|00188|binding|INFO|Setting lport 2b375181-a08c-4ba6-a02f-ec8d20419ca0 up in Southbound Dec 6 05:20:13 localhost nova_compute[237281]: 2025-12-06 10:20:13.981 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:13.980 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 
'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2b375181-a08c-4ba6-a02f-ec8d20419ca0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:13.984 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 2b375181-a08c-4ba6-a02f-ec8d20419ca0 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:20:13 localhost nova_compute[237281]: 2025-12-06 10:20:13.985 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:13.986 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:13.987 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[2f02c6f9-783b-4393-8d28-36b8a4145d63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:20:13 localhost journal[186952]: ethtool ioctl error on tap2b375181-a0: No such device Dec 6 05:20:14 localhost nova_compute[237281]: 2025-12-06 10:20:14.006 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:14 localhost journal[186952]: ethtool ioctl error on tap2b375181-a0: No such device Dec 6 05:20:14 localhost journal[186952]: ethtool ioctl error on tap2b375181-a0: No such device Dec 6 05:20:14 localhost 
journal[186952]: ethtool ioctl error on tap2b375181-a0: No such device Dec 6 05:20:14 localhost journal[186952]: ethtool ioctl error on tap2b375181-a0: No such device Dec 6 05:20:14 localhost journal[186952]: ethtool ioctl error on tap2b375181-a0: No such device Dec 6 05:20:14 localhost journal[186952]: ethtool ioctl error on tap2b375181-a0: No such device Dec 6 05:20:14 localhost journal[186952]: ethtool ioctl error on tap2b375181-a0: No such device Dec 6 05:20:14 localhost nova_compute[237281]: 2025-12-06 10:20:14.050 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:14 localhost nova_compute[237281]: 2025-12-06 10:20:14.078 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:14 localhost neutron_sriov_agent[212548]: 2025-12-06 10:20:14.147 2 INFO neutron.agent.securitygroups_rpc [None req-46f2de04-4c62-4442-8f09-1e7220141e6b a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:20:14 localhost podman[250394]: Dec 6 05:20:14 localhost podman[250394]: 2025-12-06 10:20:14.939014704 +0000 UTC m=+0.094091153 container create 85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS) Dec 6 05:20:14 localhost systemd[1]: Started 
libpod-conmon-85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a.scope. Dec 6 05:20:14 localhost systemd[1]: Started libcrun container. Dec 6 05:20:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2ed36f105f856ad4fdffd30040ccb720a6c6dc63faaa2362b91dca6b7a67771/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:20:14 localhost podman[250394]: 2025-12-06 10:20:14.894960976 +0000 UTC m=+0.050037495 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:20:15 localhost podman[250394]: 2025-12-06 10:20:15.005418323 +0000 UTC m=+0.160494752 container init 85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true) Dec 6 05:20:15 localhost podman[250394]: 2025-12-06 10:20:15.012738489 +0000 UTC m=+0.167814908 container start 85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:20:15 localhost dnsmasq[250412]: started, version 2.85 cachesize 150 Dec 6 05:20:15 localhost 
dnsmasq[250412]: DNS service limited to local subnets Dec 6 05:20:15 localhost dnsmasq[250412]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:20:15 localhost dnsmasq[250412]: warning: no upstream servers configured Dec 6 05:20:15 localhost dnsmasq[250412]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:15 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:15.060 219384 INFO neutron.agent.dhcp.agent [None req-d8ab895f-4143-4289-b62a-7af6fdbbac6a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:13Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=cc55d0d2-bcc6-428d-bf93-0f484ac93fc6, ip_allocation=immediate, mac_address=fa:16:3e:17:05:bb, name=tempest-NetworksTestDHCPv6-683628897, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=10, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['d4be7343-fd6e-43cf-96ac-77d8242f284a'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:12Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, 
project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1078, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:13Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:20:15 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:15.137 219384 INFO neutron.agent.dhcp.agent [None req-204aa483-1c46-4c29-b63b-d1b572ab44a9 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:20:15 localhost podman[250430]: 2025-12-06 10:20:15.238554425 +0000 UTC m=+0.052809619 container kill 85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 6 05:20:15 localhost dnsmasq[250412]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:20:15 localhost neutron_sriov_agent[212548]: 2025-12-06 10:20:15.404 2 INFO neutron.agent.securitygroups_rpc [None req-332c459c-40ad-4fe4-9c82-a281d71981b2 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:20:15 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:15.465 219384 INFO neutron.agent.dhcp.agent [None req-fbb97ea4-f835-4454-b0f2-51133d6a1742 - - - - - -] DHCP configuration for ports 
{'cc55d0d2-bcc6-428d-bf93-0f484ac93fc6'} is completed#033[00m Dec 6 05:20:15 localhost dnsmasq[250412]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:15 localhost podman[250469]: 2025-12-06 10:20:15.576608935 +0000 UTC m=+0.047361713 container kill 85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:20:15 localhost nova_compute[237281]: 2025-12-06 10:20:15.659 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:15 localhost nova_compute[237281]: 2025-12-06 10:20:15.792 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:20:16 localhost openstack_network_exporter[199751]: ERROR 10:20:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:20:16 localhost openstack_network_exporter[199751]: ERROR 10:20:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:20:16 localhost openstack_network_exporter[199751]: ERROR 10:20:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:20:16 localhost openstack_network_exporter[199751]: ERROR 10:20:16 
appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:20:16 localhost openstack_network_exporter[199751]: Dec 6 05:20:16 localhost openstack_network_exporter[199751]: ERROR 10:20:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:20:16 localhost openstack_network_exporter[199751]: Dec 6 05:20:16 localhost dnsmasq[250412]: exiting on receipt of SIGTERM Dec 6 05:20:16 localhost podman[250507]: 2025-12-06 10:20:16.617021552 +0000 UTC m=+0.066619906 container kill 85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:20:16 localhost systemd[1]: tmp-crun.H2aVd5.mount: Deactivated successfully. Dec 6 05:20:16 localhost systemd[1]: libpod-85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a.scope: Deactivated successfully. Dec 6 05:20:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:20:16 localhost podman[250520]: 2025-12-06 10:20:16.697611928 +0000 UTC m=+0.065596144 container died 85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:20:16 localhost podman[250520]: 2025-12-06 10:20:16.733352881 +0000 UTC m=+0.101337057 container cleanup 85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:20:16 localhost systemd[1]: libpod-conmon-85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a.scope: Deactivated successfully. 
Dec 6 05:20:16 localhost podman[250525]: 2025-12-06 10:20:16.776139381 +0000 UTC m=+0.134398458 container remove 85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:20:16 localhost kernel: device tap2b375181-a0 left promiscuous mode Dec 6 05:20:16 localhost ovn_controller[131684]: 2025-12-06T10:20:16Z|00189|binding|INFO|Releasing lport 2b375181-a08c-4ba6-a02f-ec8d20419ca0 from this chassis (sb_readonly=0) Dec 6 05:20:16 localhost ovn_controller[131684]: 2025-12-06T10:20:16Z|00190|binding|INFO|Setting lport 2b375181-a08c-4ba6-a02f-ec8d20419ca0 down in Southbound Dec 6 05:20:16 localhost nova_compute[237281]: 2025-12-06 10:20:16.825 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:16.840 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2b375181-a08c-4ba6-a02f-ec8d20419ca0) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:16.844 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 2b375181-a08c-4ba6-a02f-ec8d20419ca0 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:20:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:16.846 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:16 localhost nova_compute[237281]: 2025-12-06 10:20:16.847 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:16.847 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[22f8e8f7-9c4d-49f3-9bca-8305308400cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:20:16 localhost podman[250528]: 2025-12-06 10:20:16.856706596 +0000 UTC m=+0.206267614 container 
health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6) Dec 6 05:20:16 localhost podman[250528]: 2025-12-06 10:20:16.873401312 +0000 UTC m=+0.222962390 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public) Dec 6 05:20:16 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:20:16 localhost systemd[1]: var-lib-containers-storage-overlay-b2ed36f105f856ad4fdffd30040ccb720a6c6dc63faaa2362b91dca6b7a67771-merged.mount: Deactivated successfully. Dec 6 05:20:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-85644b86d3f49aded93506fbbfab3baad294c2d47e224d5654036217611b418a-userdata-shm.mount: Deactivated successfully. Dec 6 05:20:17 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. 
Dec 6 05:20:17 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:17.098 219384 INFO neutron.agent.dhcp.agent [None req-d9dd1be4-7402-4d09-ada8-d7ec2cc947b1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:20:18 localhost nova_compute[237281]: 2025-12-06 10:20:18.316 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:18.585 219384 INFO neutron.agent.linux.ip_lib [None req-8256325a-5056-45f2-b582-f767ce2f226c - - - - - -] Device tap75447bad-f3 cannot be used as it has no MAC address#033[00m Dec 6 05:20:18 localhost nova_compute[237281]: 2025-12-06 10:20:18.611 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:18 localhost kernel: device tap75447bad-f3 entered promiscuous mode Dec 6 05:20:18 localhost NetworkManager[5965]: [1765016418.6185] manager: (tap75447bad-f3): new Generic device (/org/freedesktop/NetworkManager/Devices/35) Dec 6 05:20:18 localhost ovn_controller[131684]: 2025-12-06T10:20:18Z|00191|binding|INFO|Claiming lport 75447bad-f3ac-46f6-aed8-814407a3906c for this chassis. Dec 6 05:20:18 localhost ovn_controller[131684]: 2025-12-06T10:20:18Z|00192|binding|INFO|75447bad-f3ac-46f6-aed8-814407a3906c: Claiming unknown Dec 6 05:20:18 localhost nova_compute[237281]: 2025-12-06 10:20:18.619 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:18 localhost systemd-udevd[250582]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:20:18 localhost ovn_controller[131684]: 2025-12-06T10:20:18Z|00193|binding|INFO|Setting lport 75447bad-f3ac-46f6-aed8-814407a3906c ovn-installed in OVS Dec 6 05:20:18 localhost nova_compute[237281]: 2025-12-06 10:20:18.629 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:18 localhost journal[186952]: ethtool ioctl error on tap75447bad-f3: No such device Dec 6 05:20:18 localhost journal[186952]: ethtool ioctl error on tap75447bad-f3: No such device Dec 6 05:20:18 localhost nova_compute[237281]: 2025-12-06 10:20:18.652 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:18 localhost journal[186952]: ethtool ioctl error on tap75447bad-f3: No such device Dec 6 05:20:18 localhost journal[186952]: ethtool ioctl error on tap75447bad-f3: No such device Dec 6 05:20:18 localhost journal[186952]: ethtool ioctl error on tap75447bad-f3: No such device Dec 6 05:20:18 localhost journal[186952]: ethtool ioctl error on tap75447bad-f3: No such device Dec 6 05:20:18 localhost journal[186952]: ethtool ioctl error on tap75447bad-f3: No such device Dec 6 05:20:18 localhost journal[186952]: ethtool ioctl error on tap75447bad-f3: No such device Dec 6 05:20:18 localhost nova_compute[237281]: 2025-12-06 10:20:18.692 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:18 localhost nova_compute[237281]: 2025-12-06 10:20:18.717 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:18 localhost ovn_controller[131684]: 2025-12-06T10:20:18Z|00194|binding|INFO|Setting lport 75447bad-f3ac-46f6-aed8-814407a3906c up in Southbound Dec 6 05:20:18 localhost neutron_sriov_agent[212548]: 2025-12-06 
10:20:18.754 2 INFO neutron.agent.securitygroups_rpc [None req-34a802de-b859-4f12-bd7b-7c6ca92e2ec4 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:20:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:18.755 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=75447bad-f3ac-46f6-aed8-814407a3906c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:18.759 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 75447bad-f3ac-46f6-aed8-814407a3906c in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:20:18 localhost ovn_metadata_agent[137254]: 
2025-12-06 10:20:18.761 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:18.762 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[9ea2c37f-eecd-4295-ba6f-5f1b9ada8292]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:20:19 localhost podman[250653]: Dec 6 05:20:19 localhost podman[250653]: 2025-12-06 10:20:19.545885029 +0000 UTC m=+0.074768798 container create 9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:20:19 localhost systemd[1]: Started libpod-conmon-9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e.scope. Dec 6 05:20:19 localhost systemd[1]: tmp-crun.UWMD4K.mount: Deactivated successfully. Dec 6 05:20:19 localhost podman[250653]: 2025-12-06 10:20:19.507431862 +0000 UTC m=+0.036315691 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:20:19 localhost systemd[1]: Started libcrun container. 
Dec 6 05:20:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/722a0a7d4e46b8cae8eee602934ff41f334ae92586fb0549955dea93994f7785/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:20:19 localhost podman[250653]: 2025-12-06 10:20:19.63279651 +0000 UTC m=+0.161680279 container init 9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:20:19 localhost podman[250653]: 2025-12-06 10:20:19.642630483 +0000 UTC m=+0.171514242 container start 9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:20:19 localhost dnsmasq[250671]: started, version 2.85 cachesize 150 Dec 6 05:20:19 localhost dnsmasq[250671]: DNS service limited to local subnets Dec 6 05:20:19 localhost dnsmasq[250671]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:20:19 localhost dnsmasq[250671]: warning: no upstream servers configured Dec 
6 05:20:19 localhost dnsmasq-dhcp[250671]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:20:19 localhost dnsmasq[250671]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:19 localhost dnsmasq-dhcp[250671]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:19 localhost dnsmasq-dhcp[250671]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:19.709 219384 INFO neutron.agent.dhcp.agent [None req-8256325a-5056-45f2-b582-f767ce2f226c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:18Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=068b56cd-e049-4ecb-a4ab-5b4fe6b2bc9f, ip_allocation=immediate, mac_address=fa:16:3e:3f:62:7e, name=tempest-NetworksTestDHCPv6-13854274, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=12, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['22820099-425a-483b-9686-ce93ed4f86ae'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:17Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1126, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:18Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:20:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:19.869 219384 INFO neutron.agent.dhcp.agent [None req-4ed280f2-36f8-41c5-a18e-f3ca4f552870 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:20:19 localhost dnsmasq[250671]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:20:19 localhost dnsmasq-dhcp[250671]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:19 localhost dnsmasq-dhcp[250671]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:19 localhost podman[250689]: 2025-12-06 10:20:19.910027763 +0000 UTC m=+0.058627650 container kill 9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:20:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:20.233 219384 INFO neutron.agent.dhcp.agent [None req-ebf195e3-a699-4b1e-a607-40a12688497f - - - - - -] DHCP configuration for ports {'068b56cd-e049-4ecb-a4ab-5b4fe6b2bc9f'} is completed#033[00m Dec 6 05:20:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:20:20 localhost neutron_sriov_agent[212548]: 2025-12-06 10:20:20.522 2 INFO neutron.agent.securitygroups_rpc [None req-fe71ae50-5ae7-405e-ba62-fa7fb3f3e76a a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:20:20 localhost podman[250711]: 2025-12-06 10:20:20.569846459 +0000 UTC m=+0.097385245 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:20:20 localhost podman[250711]: 2025-12-06 10:20:20.605016724 +0000 UTC m=+0.132555440 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:20:20 localhost systemd[1]: 
979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:20:20 localhost nova_compute[237281]: 2025-12-06 10:20:20.705 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:20 localhost dnsmasq[250671]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:20 localhost dnsmasq-dhcp[250671]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:20 localhost dnsmasq-dhcp[250671]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:20 localhost podman[250750]: 2025-12-06 10:20:20.781163508 +0000 UTC m=+0.061590613 container kill 9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:20:21 localhost dnsmasq[250671]: exiting on receipt of SIGTERM Dec 6 05:20:21 localhost podman[250788]: 2025-12-06 10:20:21.895090594 +0000 UTC m=+0.058503876 container kill 9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:20:21 localhost systemd[1]: libpod-9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e.scope: Deactivated successfully. Dec 6 05:20:21 localhost podman[250801]: 2025-12-06 10:20:21.967631071 +0000 UTC m=+0.056158794 container died 9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 05:20:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e-userdata-shm.mount: Deactivated successfully. Dec 6 05:20:22 localhost podman[250801]: 2025-12-06 10:20:22.001161155 +0000 UTC m=+0.089688848 container cleanup 9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:20:22 localhost systemd[1]: libpod-conmon-9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e.scope: Deactivated successfully. 
Dec 6 05:20:22 localhost podman[250803]: 2025-12-06 10:20:22.050776036 +0000 UTC m=+0.130891209 container remove 9612c0879d89f022ea9ef779d49546c1cd4c8f935f978059ea90830ef8de317e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:20:22 localhost kernel: device tap75447bad-f3 left promiscuous mode Dec 6 05:20:22 localhost ovn_controller[131684]: 2025-12-06T10:20:22Z|00195|binding|INFO|Releasing lport 75447bad-f3ac-46f6-aed8-814407a3906c from this chassis (sb_readonly=0) Dec 6 05:20:22 localhost nova_compute[237281]: 2025-12-06 10:20:22.093 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:22 localhost ovn_controller[131684]: 2025-12-06T10:20:22Z|00196|binding|INFO|Setting lport 75447bad-f3ac-46f6-aed8-814407a3906c down in Southbound Dec 6 05:20:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:22.103 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=75447bad-f3ac-46f6-aed8-814407a3906c) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:22.106 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 75447bad-f3ac-46f6-aed8-814407a3906c in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:20:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:22.107 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:22.108 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[47fd47f0-6fa5-419a-8c49-95c40684e054]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:20:22 localhost nova_compute[237281]: 2025-12-06 10:20:22.117 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:22 localhost nova_compute[237281]: 2025-12-06 10:20:22.118 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:22 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:22.475 219384 INFO neutron.agent.dhcp.agent [None req-35b3c07c-0e5a-4242-b90c-eff690c943f8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:20:22 localhost systemd[1]: var-lib-containers-storage-overlay-722a0a7d4e46b8cae8eee602934ff41f334ae92586fb0549955dea93994f7785-merged.mount: Deactivated successfully. Dec 6 05:20:22 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. Dec 6 05:20:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:22.994 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:20:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:22.995 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.056 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.057 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b76388c1-a332-43e1-a07e-08099ae53c4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:20:22.996375', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2cb2188e-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.20648932, 'message_signature': 
'30837e2f675f5caf98de60f95a65e8b0b7fd111015fa044d85c1ec959d7fd867'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:20:22.996375', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2cb224a0-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.20648932, 'message_signature': 'f8bb87c28b37b8d5c3fe9b107751b7692be66bb35507a12ce5f37626979f64bf'}]}, 'timestamp': '2025-12-06 10:20:23.058056', '_unique_id': '91874b5cb75246a6949cc7033dcf27fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:20:23.059 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.059 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.063 12 DEBUG ceilometer.compute.pollsters [-] 
a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45323ba3-8396-410d-9306-d974872559e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:20:23.059882', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '2cb31f5e-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.269991558, 'message_signature': '3ee3b93136441b16ba55d7aaed9499f130ec7b6f9903173a984f643e74937e85'}]}, 'timestamp': '2025-12-06 
10:20:23.064579', '_unique_id': '90c799635bf34d5d96428cbbe3b2522d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 
6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:20:23.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.065 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.066 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.080 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 18680000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0b2e649-434b-4b52-b1b5-f1ed74024c4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18680000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:20:23.066565', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '2cb597e8-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.290391888, 'message_signature': '68a4394135df62c7a77afde34c99bfb8cbaef8f9d39aac3537ef5c879aec41c9'}]}, 'timestamp': '2025-12-06 10:20:23.080699', '_unique_id': '97ff89544aa444ed8de48183d71c44e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.081 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.082 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3ee4d307-af96-4262-9eba-213b9c06cfd3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:20:23.082244', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '2cb5e004-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.269991558, 'message_signature': '24224576d9ed67210888a343bfae4a97b1982e6365e75d8181c86248dca7313b'}]}, 'timestamp': '2025-12-06 10:20:23.082544', '_unique_id': '9e06b0acca91417ebab7e9e231c8470c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.083 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.096 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.097 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a625766d-e3ba-40b2-94ec-c3c3fc70ec70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:20:23.083923', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2cb82242-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.29401827, 'message_signature': '6c6a9dbb7f13a1f0c4b5a5b269a1d398b5e61475d8a3468921294fe4f5504520'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:20:23.083923', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2cb82dfa-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.29401827, 'message_signature': '1c69e5b42db79a906cd4e5187314393891922b8265e5f259b589804a55156099'}]}, 'timestamp': '2025-12-06 10:20:23.097629', '_unique_id': '317c1066e3b348bcbc40f95afb880fb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.098 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.099 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.099 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c675017-ed64-45ad-a5e4-2ce376cc19a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:20:23.099097', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '2cb87274-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.269991558, 'message_signature': '13308893faa09fa09d462d6a0adec809db63c72df6b140d26d419a21073fee57'}]}, 'timestamp': '2025-12-06 10:20:23.099400', '_unique_id': '6c44e5347a75431c81ea6f616f4f0753'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0a5670a-ac26-45a9-97b9-b01fbff50a35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:20:23.100760', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '2cb8b464-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.269991558, 'message_signature': 'd30f0881a102f3cbdd29e3f21708966f6de9716c72e0b17b6c1b5fa36747ad86'}]}, 'timestamp': '2025-12-06 10:20:23.101090', '_unique_id': '762d129481344b63baadff9d6ab69846'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.101 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.102 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b69630a6-6ebf-4106-8863-688c07db88c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:20:23.102503', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '2cb8f7c6-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.269991558, 'message_signature': '773d17fc6dbadffbc093a735c68735af07892184881afd3f95ff60f7d5efd051'}]}, 'timestamp': '2025-12-06 10:20:23.102815', '_unique_id': '0c76f7b326e548c7b8c04a5c87ff40af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.103 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.104 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.104 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c8eff332-3bcd-4384-a265-f1e111dc1f80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:20:23.104220', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2cb93a56-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.20648932, 'message_signature': '0acb1725d7eab760c3a47ceb4ec4a08e700a68a17a189c02ac5f1c827d3f67d5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:20:23.104220', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2cb944f6-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.20648932, 'message_signature': '8dc3260518063e09cc387c2cea6ba9d3dd42ddf43764cf4e25c24cf22be32dfe'}]}, 'timestamp': '2025-12-06 10:20:23.104768', '_unique_id': 'e553dd112d264418a45a4f1c459145d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.105 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.106 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.106 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
notification to notifications. Payload={'message_id': '6c3cd720-ed26-4e32-b946-938b2b1b01d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:20:23.106237', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2cb98934-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.29401827, 'message_signature': '368ddd775645c9f293d852ca0dc0fe02ba982b2b3029dc36ff91a1d689c96b8f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:20:23.106237', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2cb993ca-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.29401827, 'message_signature': '802112dd349a8f7272c25343a5b03bd33cd6125ebe4b1b5fa7c67ffd3387cb20'}]}, 'timestamp': '2025-12-06 10:20:23.106786', '_unique_id': '2c5004e7399a4cdcb6e59299c22c0bee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.108 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b28ffa5d-6248-4dc4-ae89-f1e2ebc984cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:20:23.108222', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '2cb9d68c-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.269991558, 'message_signature': '0da41cccd83654accfe3387cf933c5d2209c6c13f37d58b1cc73f255eef05dc6'}]}, 'timestamp': '2025-12-06 10:20:23.108515', '_unique_id': 'b9750e13545e40faabc52d153805806a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b493904b-71e3-41a3-83b7-9b7de0b07641', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:20:23.109884', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': 
None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '2cba17aa-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.269991558, 'message_signature': '24864ec892181d0a8c50cd0534f13bdd8240af301701bf01fe6008c21eddcb53'}]}, 'timestamp': '2025-12-06 10:20:23.110217', '_unique_id': '7de7d54345bf4b40818103c75e66d104'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.111 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.111 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2fdff4da-6c5f-4653-9004-91084592865b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:20:23.111653', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '2cba5c56-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.269991558, 'message_signature': '3d33eb34ef68738159bee203677f6c61003d1a30835fac6e02e3df83ffa8a51d'}]}, 'timestamp': '2025-12-06 10:20:23.111962', '_unique_id': 'a6bf6eec48354a1cb79371bb05dd3475'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.113 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.113 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a87b7eca-b337-49cc-940e-01e5f34eef76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:20:23.113296', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2cba9d6a-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.20648932, 'message_signature': '407eacf7659708daa94a171eaaa47045a0c3fa8ccf9126a7b85b13794e7387f9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:20:23.113296', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2cbaa8aa-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.20648932, 'message_signature': '64c53b9eae53a57047d46148979213b1eccb9fb1085e5800d356ff0181b265da'}]}, 'timestamp': '2025-12-06 10:20:23.113896', '_unique_id': '6b3f8caa84684eda8ae4ec215d5b3b62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.115 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.115 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d4f4207-de1f-40c5-aceb-421ec21dda65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:20:23.115292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2cbaea7c-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.20648932, 'message_signature': 'd0b373a771bff38f1eaa8258ca43ce0163e816b94558d19788b6445ef6d95b8e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:20:23.115292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2cbaf490-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.20648932, 'message_signature': 'a9d1cb50ed400091911d7be77baf8c7b5fa29ed34be6a278b6178e1bee081f2b'}]}, 'timestamp': '2025-12-06 10:20:23.115816', '_unique_id': '4a87060303da4fb5ba7c4aeaa9ad9414'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6721fbdb-bb2e-417b-b487-f24f4c188038', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:20:23.117182', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': 
{'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '2cbb3554-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.269991558, 'message_signature': 'aed76a537287de356e841b1abd3106d832a4cfb71c13554c4d7504ed7b4704e6'}]}, 'timestamp': '2025-12-06 10:20:23.117512', '_unique_id': '39523554690945d7a91250a8c24d1c2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:20:23.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.118 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b4b39569-a9a6-4a61-8f15-ab1580a0bde0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:20:23.118823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '2cbb75c8-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.290391888, 'message_signature': '620b60e4f4eaaab7f0071b6d164cb182b1c52eb2a00b4ba30cc4c5fae19697fc'}]}, 'timestamp': '2025-12-06 10:20:23.119145', '_unique_id': 'd496ed8d96f343378baa3caa5558c104'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors 
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging 
conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.119 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.120 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify 
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.120 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.120 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8358beea-2786-49e2-aa5c-e0614971767a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:20:23.120596', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2cbbba42-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.20648932, 'message_signature': '5f85f1d21f9244a6d1cb9ad75a83410ec72cfdf14b531b34ce4388184a074190'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:20:23.120596', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2cbbc582-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.20648932, 'message_signature': '3323e466af5c5d404bdd91af48d69a70db3b0c4022d52b41a155173756b72c2e'}]}, 'timestamp': '2025-12-06 10:20:23.121164', '_unique_id': 'c35c7b88ac704e4fbb474b197ab1fce7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.122 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '03ebb0cf-c2f0-468d-9306-a901ec15a1a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:20:23.122692', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '2cbc0bbe-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.269991558, 'message_signature': '698f609e20ab531e602c87c5acb7f7ff92ff9f9e7ebaf54bc172ed4f8ec36448'}]}, 'timestamp': '2025-12-06 10:20:23.123019', '_unique_id': 'ecaf43e0b6414387afce297b572a3461'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.123 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.124 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.124 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.124 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08248218-9ee8-424e-835c-104262e5d093', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:20:23.124339', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2cbc4bec-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.20648932, 'message_signature': 'e8093c3a3d399d3df5d84658c03a08c8cbba837c4c483185c1549f10acec28c5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:20:23.124339', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2cbc5614-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.20648932, 'message_signature': '0088d3151d59f1ade4ae98e4349955456146eba61ea8465c1654f93739d11b2e'}]}, 'timestamp': '2025-12-06 10:20:23.124892', '_unique_id': 'a5b4f20990264dcbb089077b70630f8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.125 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.126 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.126 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.126 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f52f22d-f6a9-4757-98c5-42ec0634cb81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:20:23.126275', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2cbc97e6-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.29401827, 'message_signature': 'd28e6fb1cccfbda4dd55bb5ccd71a6c19dc16b11887a978029853a0d571d1b41'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:20:23.126275', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2cbca1a0-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12810.29401827, 'message_signature': 'f59397f738b6ef7c5e99a16a68923c8e4db113a935bb8ebadd750ae0976b3fd6'}]}, 'timestamp': '2025-12-06 10:20:23.126799', '_unique_id': '109a394c1f944d8884d8023568dfe906'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:20:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:20:23.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:20:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:20:23.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:20:23 localhost podman[197801]: time="2025-12-06T10:20:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:20:23 localhost podman[197801]: @ - - [06/Dec/2025:10:20:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:20:23 localhost nova_compute[237281]: 2025-12-06 10:20:23.365 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:23 localhost podman[197801]: @ - - [06/Dec/2025:10:20:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15963 "" "Go-http-client/1.1" Dec 6 05:20:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45064 DF PROTO=TCP SPT=35404 DPT=9102 SEQ=405234843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDB8E4A0000000001030307) Dec 6 05:20:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45065 DF PROTO=TCP SPT=35404 DPT=9102 SEQ=405234843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDB92470000000001030307) Dec 6 05:20:25 localhost neutron_dhcp_agent[219380]: 
2025-12-06 10:20:25.462 219384 INFO neutron.agent.linux.ip_lib [None req-19dfc035-3ffa-451e-a5b0-ed43a4558ca2 - - - - - -] Device tap81963768-05 cannot be used as it has no MAC address#033[00m Dec 6 05:20:25 localhost nova_compute[237281]: 2025-12-06 10:20:25.529 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:25 localhost kernel: device tap81963768-05 entered promiscuous mode Dec 6 05:20:25 localhost ovn_controller[131684]: 2025-12-06T10:20:25Z|00197|binding|INFO|Claiming lport 81963768-052d-46f6-956e-b9d93dadc130 for this chassis. Dec 6 05:20:25 localhost ovn_controller[131684]: 2025-12-06T10:20:25Z|00198|binding|INFO|81963768-052d-46f6-956e-b9d93dadc130: Claiming unknown Dec 6 05:20:25 localhost NetworkManager[5965]: [1765016425.5372] manager: (tap81963768-05): new Generic device (/org/freedesktop/NetworkManager/Devices/36) Dec 6 05:20:25 localhost systemd-udevd[250838]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:20:25 localhost nova_compute[237281]: 2025-12-06 10:20:25.542 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:25 localhost ovn_controller[131684]: 2025-12-06T10:20:25Z|00199|binding|INFO|Setting lport 81963768-052d-46f6-956e-b9d93dadc130 ovn-installed in OVS Dec 6 05:20:25 localhost nova_compute[237281]: 2025-12-06 10:20:25.547 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:25 localhost nova_compute[237281]: 2025-12-06 10:20:25.550 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:25 localhost ovn_controller[131684]: 2025-12-06T10:20:25Z|00200|binding|INFO|Setting lport 81963768-052d-46f6-956e-b9d93dadc130 up in Southbound Dec 6 05:20:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:25.556 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 
'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=81963768-052d-46f6-956e-b9d93dadc130) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:25.558 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 81963768-052d-46f6-956e-b9d93dadc130 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:20:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:25.559 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:25.560 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[752273c9-678d-4b1b-b1fb-231bcaa2e217]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:20:25 localhost journal[186952]: ethtool ioctl error on tap81963768-05: No such device Dec 6 05:20:25 localhost journal[186952]: ethtool ioctl error on tap81963768-05: No such device Dec 6 05:20:25 localhost nova_compute[237281]: 2025-12-06 10:20:25.580 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:25 localhost journal[186952]: ethtool ioctl error on tap81963768-05: No such device Dec 6 05:20:25 localhost journal[186952]: ethtool ioctl error on tap81963768-05: No such device Dec 6 05:20:25 localhost journal[186952]: ethtool ioctl error on tap81963768-05: No such device Dec 6 05:20:25 localhost 
journal[186952]: ethtool ioctl error on tap81963768-05: No such device Dec 6 05:20:25 localhost journal[186952]: ethtool ioctl error on tap81963768-05: No such device Dec 6 05:20:25 localhost journal[186952]: ethtool ioctl error on tap81963768-05: No such device Dec 6 05:20:25 localhost nova_compute[237281]: 2025-12-06 10:20:25.622 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:25 localhost nova_compute[237281]: 2025-12-06 10:20:25.651 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:25 localhost nova_compute[237281]: 2025-12-06 10:20:25.706 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:25 localhost neutron_sriov_agent[212548]: 2025-12-06 10:20:25.744 2 INFO neutron.agent.securitygroups_rpc [None req-d105417e-a570-4d01-a974-b4c3358ab13b a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:20:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1761 DF PROTO=TCP SPT=49152 DPT=9102 SEQ=609209820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDB95880000000001030307) Dec 6 05:20:26 localhost podman[250909]: Dec 6 05:20:26 localhost podman[250909]: 2025-12-06 10:20:26.4491986 +0000 UTC m=+0.096960992 container create 9d35f5f7d21cd80723ce636caf7dae5683060072fe6d88edb87ae2dd66f3207c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125) Dec 6 05:20:26 localhost systemd[1]: Started libpod-conmon-9d35f5f7d21cd80723ce636caf7dae5683060072fe6d88edb87ae2dd66f3207c.scope. Dec 6 05:20:26 localhost podman[250909]: 2025-12-06 10:20:26.398075603 +0000 UTC m=+0.045838035 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:20:26 localhost systemd[1]: Started libcrun container. Dec 6 05:20:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/574b2203cfae7ba57e326b6834f85c5a60292005767d95dd6cb37293a40aad24/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:20:26 localhost podman[250909]: 2025-12-06 10:20:26.530790477 +0000 UTC m=+0.178552869 container init 9d35f5f7d21cd80723ce636caf7dae5683060072fe6d88edb87ae2dd66f3207c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 05:20:26 localhost podman[250909]: 2025-12-06 10:20:26.543874561 +0000 UTC m=+0.191637033 container start 9d35f5f7d21cd80723ce636caf7dae5683060072fe6d88edb87ae2dd66f3207c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, 
org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:20:26 localhost dnsmasq[250928]: started, version 2.85 cachesize 150 Dec 6 05:20:26 localhost dnsmasq[250928]: DNS service limited to local subnets Dec 6 05:20:26 localhost dnsmasq[250928]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:20:26 localhost dnsmasq[250928]: warning: no upstream servers configured Dec 6 05:20:26 localhost dnsmasq-dhcp[250928]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:20:26 localhost dnsmasq[250928]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:26 localhost dnsmasq-dhcp[250928]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:26 localhost dnsmasq-dhcp[250928]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:26 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:26.605 219384 INFO neutron.agent.dhcp.agent [None req-19dfc035-3ffa-451e-a5b0-ed43a4558ca2 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:25Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=99fa1c6d-8d85-453e-a03e-0c8ec5866cef, ip_allocation=immediate, mac_address=fa:16:3e:d6:4b:7a, name=tempest-NetworksTestDHCPv6-1187873314, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, 
ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=14, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['342c7add-e8b3-4a31-b7ea-83dbb63e60d6'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:24Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1139, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:25Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:20:26 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:26.755 219384 INFO neutron.agent.dhcp.agent [None req-53c3f939-eb69-415b-bb76-3dbc5e8e385e - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:20:26 localhost podman[250947]: 2025-12-06 10:20:26.803731128 +0000 UTC m=+0.061960962 container kill 9d35f5f7d21cd80723ce636caf7dae5683060072fe6d88edb87ae2dd66f3207c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2) Dec 6 05:20:26 localhost dnsmasq[250928]: read 
/var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:20:26 localhost dnsmasq-dhcp[250928]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:26 localhost dnsmasq-dhcp[250928]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:27.016 219384 INFO neutron.agent.dhcp.agent [None req-9e9f9cef-4347-45f7-8249-4e614fdcd5d1 - - - - - -] DHCP configuration for ports {'99fa1c6d-8d85-453e-a03e-0c8ec5866cef'} is completed#033[00m Dec 6 05:20:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45066 DF PROTO=TCP SPT=35404 DPT=9102 SEQ=405234843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDB9A470000000001030307) Dec 6 05:20:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7936 DF PROTO=TCP SPT=51758 DPT=9102 SEQ=3328776850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDB9D880000000001030307) Dec 6 05:20:28 localhost neutron_sriov_agent[212548]: 2025-12-06 10:20:28.313 2 INFO neutron.agent.securitygroups_rpc [None req-e7b4d987-53be-4c78-96fb-f7ae06118b5d a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:20:28 localhost nova_compute[237281]: 2025-12-06 10:20:28.411 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:28 localhost dnsmasq[250928]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:28 localhost dnsmasq-dhcp[250928]: read 
/var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:28 localhost dnsmasq-dhcp[250928]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:28 localhost podman[250983]: 2025-12-06 10:20:28.557165743 +0000 UTC m=+0.059666663 container kill 9d35f5f7d21cd80723ce636caf7dae5683060072fe6d88edb87ae2dd66f3207c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:20:29 localhost dnsmasq[250928]: exiting on receipt of SIGTERM Dec 6 05:20:29 localhost podman[251019]: 2025-12-06 10:20:29.951668983 +0000 UTC m=+0.066166902 container kill 9d35f5f7d21cd80723ce636caf7dae5683060072fe6d88edb87ae2dd66f3207c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:20:29 localhost systemd[1]: libpod-9d35f5f7d21cd80723ce636caf7dae5683060072fe6d88edb87ae2dd66f3207c.scope: Deactivated successfully. Dec 6 05:20:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:20:30 localhost podman[251035]: 2025-12-06 10:20:30.02772534 +0000 UTC m=+0.051728367 container died 9d35f5f7d21cd80723ce636caf7dae5683060072fe6d88edb87ae2dd66f3207c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:20:30 localhost systemd[1]: tmp-crun.hA2nq0.mount: Deactivated successfully. Dec 6 05:20:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d35f5f7d21cd80723ce636caf7dae5683060072fe6d88edb87ae2dd66f3207c-userdata-shm.mount: Deactivated successfully. Dec 6 05:20:30 localhost podman[251035]: 2025-12-06 10:20:30.074788431 +0000 UTC m=+0.098791488 container remove 9d35f5f7d21cd80723ce636caf7dae5683060072fe6d88edb87ae2dd66f3207c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:20:30 localhost nova_compute[237281]: 2025-12-06 10:20:30.124 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:30 localhost ovn_controller[131684]: 2025-12-06T10:20:30Z|00201|binding|INFO|Releasing lport 81963768-052d-46f6-956e-b9d93dadc130 from this chassis (sb_readonly=0) Dec 6 
05:20:30 localhost ovn_controller[131684]: 2025-12-06T10:20:30Z|00202|binding|INFO|Setting lport 81963768-052d-46f6-956e-b9d93dadc130 down in Southbound Dec 6 05:20:30 localhost kernel: device tap81963768-05 left promiscuous mode Dec 6 05:20:30 localhost nova_compute[237281]: 2025-12-06 10:20:30.133 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:30 localhost systemd[1]: libpod-conmon-9d35f5f7d21cd80723ce636caf7dae5683060072fe6d88edb87ae2dd66f3207c.scope: Deactivated successfully. Dec 6 05:20:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:30.135 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=81963768-052d-46f6-956e-b9d93dadc130) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:30.137 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 81963768-052d-46f6-956e-b9d93dadc130 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:20:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:30.139 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:30.140 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[b6113e5d-2742-4a15-a8d8-edeff8f2ba65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:20:30 localhost podman[251040]: 2025-12-06 10:20:30.144188873 +0000 UTC m=+0.160839213 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller) Dec 6 05:20:30 localhost nova_compute[237281]: 2025-12-06 10:20:30.147 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:30 localhost podman[251040]: 2025-12-06 10:20:30.184745334 +0000 UTC m=+0.201395654 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:20:30 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:20:30 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:30.534 219384 INFO neutron.agent.dhcp.agent [None req-d584e02c-b5fe-4630-97ab-e538b5a06a75 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:20:30 localhost nova_compute[237281]: 2025-12-06 10:20:30.709 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:30 localhost systemd[1]: var-lib-containers-storage-overlay-574b2203cfae7ba57e326b6834f85c5a60292005767d95dd6cb37293a40aad24-merged.mount: Deactivated successfully. Dec 6 05:20:30 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. 
Dec 6 05:20:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45067 DF PROTO=TCP SPT=35404 DPT=9102 SEQ=405234843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDBAA080000000001030307) Dec 6 05:20:32 localhost nova_compute[237281]: 2025-12-06 10:20:32.291 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:32 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:32.441 219384 INFO neutron.agent.linux.ip_lib [None req-3c66160e-8271-49b4-87f4-cf7d3bc95129 - - - - - -] Device tapeb1a2ef9-44 cannot be used as it has no MAC address#033[00m Dec 6 05:20:32 localhost nova_compute[237281]: 2025-12-06 10:20:32.467 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:32 localhost kernel: device tapeb1a2ef9-44 entered promiscuous mode Dec 6 05:20:32 localhost NetworkManager[5965]: [1765016432.4760] manager: (tapeb1a2ef9-44): new Generic device (/org/freedesktop/NetworkManager/Devices/37) Dec 6 05:20:32 localhost ovn_controller[131684]: 2025-12-06T10:20:32Z|00203|binding|INFO|Claiming lport eb1a2ef9-449d-4683-9fd1-158cd31219f0 for this chassis. Dec 6 05:20:32 localhost ovn_controller[131684]: 2025-12-06T10:20:32Z|00204|binding|INFO|eb1a2ef9-449d-4683-9fd1-158cd31219f0: Claiming unknown Dec 6 05:20:32 localhost systemd-udevd[251094]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:20:32 localhost nova_compute[237281]: 2025-12-06 10:20:32.478 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:32 localhost ovn_controller[131684]: 2025-12-06T10:20:32Z|00205|binding|INFO|Setting lport eb1a2ef9-449d-4683-9fd1-158cd31219f0 ovn-installed in OVS Dec 6 05:20:32 localhost ovn_controller[131684]: 2025-12-06T10:20:32Z|00206|binding|INFO|Setting lport eb1a2ef9-449d-4683-9fd1-158cd31219f0 up in Southbound Dec 6 05:20:32 localhost nova_compute[237281]: 2025-12-06 10:20:32.487 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:32.488 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], 
requested_chassis=[], logical_port=eb1a2ef9-449d-4683-9fd1-158cd31219f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:32 localhost nova_compute[237281]: 2025-12-06 10:20:32.490 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:32.491 137259 INFO neutron.agent.ovn.metadata.agent [-] Port eb1a2ef9-449d-4683-9fd1-158cd31219f0 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:20:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:32.493 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:32.494 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d3e83d-6043-43c8-a36a-ea5717b16db8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:20:32 localhost journal[186952]: ethtool ioctl error on tapeb1a2ef9-44: No such device Dec 6 05:20:32 localhost nova_compute[237281]: 2025-12-06 10:20:32.512 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:32 localhost journal[186952]: ethtool ioctl error on tapeb1a2ef9-44: No such device Dec 6 05:20:32 localhost journal[186952]: ethtool ioctl error on tapeb1a2ef9-44: No such device Dec 6 05:20:32 localhost journal[186952]: ethtool ioctl error on tapeb1a2ef9-44: No such device Dec 6 05:20:32 localhost journal[186952]: ethtool ioctl error on tapeb1a2ef9-44: No such device Dec 6 05:20:32 
localhost journal[186952]: ethtool ioctl error on tapeb1a2ef9-44: No such device Dec 6 05:20:32 localhost journal[186952]: ethtool ioctl error on tapeb1a2ef9-44: No such device Dec 6 05:20:32 localhost journal[186952]: ethtool ioctl error on tapeb1a2ef9-44: No such device Dec 6 05:20:32 localhost nova_compute[237281]: 2025-12-06 10:20:32.544 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:32 localhost nova_compute[237281]: 2025-12-06 10:20:32.569 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:33 localhost podman[251166]: Dec 6 05:20:33 localhost podman[251166]: 2025-12-06 10:20:33.353255203 +0000 UTC m=+0.082655131 container create d3ff926b40324cd2f41cd5242ddbb0596dbf4d91754b0d48f0640736d0b933a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:20:33 localhost systemd[1]: Started libpod-conmon-d3ff926b40324cd2f41cd5242ddbb0596dbf4d91754b0d48f0640736d0b933a6.scope. Dec 6 05:20:33 localhost podman[251166]: 2025-12-06 10:20:33.307976686 +0000 UTC m=+0.037376604 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:20:33 localhost systemd[1]: Started libcrun container. 
Dec 6 05:20:33 localhost nova_compute[237281]: 2025-12-06 10:20:33.412 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7118c2c865c843008cc0773e41ad4c6ef9af2a68978e5cfa88c06efc3cdadd8e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:20:33 localhost podman[251166]: 2025-12-06 10:20:33.428105302 +0000 UTC m=+0.157505190 container init d3ff926b40324cd2f41cd5242ddbb0596dbf4d91754b0d48f0640736d0b933a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 6 05:20:33 localhost podman[251166]: 2025-12-06 10:20:33.4390556 +0000 UTC m=+0.168455488 container start d3ff926b40324cd2f41cd5242ddbb0596dbf4d91754b0d48f0640736d0b933a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:20:33 localhost dnsmasq[251184]: started, version 2.85 cachesize 150 Dec 6 05:20:33 localhost dnsmasq[251184]: DNS service limited to local subnets Dec 6 05:20:33 localhost dnsmasq[251184]: compile time options: IPv6 
GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:20:33 localhost dnsmasq[251184]: warning: no upstream servers configured Dec 6 05:20:33 localhost dnsmasq-dhcp[251184]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:20:33 localhost dnsmasq[251184]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:33 localhost dnsmasq-dhcp[251184]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:33 localhost dnsmasq-dhcp[251184]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:33 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:33.607 219384 INFO neutron.agent.dhcp.agent [None req-8af4cb4c-8859-43b5-bf4b-09dfd1ba9e8a - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:20:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:20:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:20:34 localhost podman[251185]: 2025-12-06 10:20:34.314045784 +0000 UTC m=+0.096701985 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:20:34 localhost podman[251186]: 2025-12-06 10:20:34.360557978 +0000 UTC m=+0.143205998 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 05:20:34 localhost podman[251186]: 2025-12-06 10:20:34.374414276 +0000 UTC m=+0.157062296 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 05:20:34 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:20:34 localhost podman[251185]: 2025-12-06 10:20:34.424939375 +0000 UTC m=+0.207595576 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:20:34 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:20:34 localhost podman[251247]: 2025-12-06 10:20:34.938119356 +0000 UTC m=+0.059582598 container kill d3ff926b40324cd2f41cd5242ddbb0596dbf4d91754b0d48f0640736d0b933a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:20:34 localhost dnsmasq[251184]: exiting on receipt of SIGTERM Dec 6 05:20:34 localhost systemd[1]: libpod-d3ff926b40324cd2f41cd5242ddbb0596dbf4d91754b0d48f0640736d0b933a6.scope: Deactivated successfully. Dec 6 05:20:35 localhost podman[251259]: 2025-12-06 10:20:35.007226949 +0000 UTC m=+0.056622808 container died d3ff926b40324cd2f41cd5242ddbb0596dbf4d91754b0d48f0640736d0b933a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 6 05:20:35 localhost podman[251259]: 2025-12-06 10:20:35.03674583 +0000 UTC m=+0.086141689 container cleanup d3ff926b40324cd2f41cd5242ddbb0596dbf4d91754b0d48f0640736d0b933a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 6 05:20:35 localhost systemd[1]: libpod-conmon-d3ff926b40324cd2f41cd5242ddbb0596dbf4d91754b0d48f0640736d0b933a6.scope: Deactivated successfully. Dec 6 05:20:35 localhost podman[251261]: 2025-12-06 10:20:35.09412398 +0000 UTC m=+0.128352111 container remove d3ff926b40324cd2f41cd5242ddbb0596dbf4d91754b0d48f0640736d0b933a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2) Dec 6 05:20:35 localhost ovn_controller[131684]: 2025-12-06T10:20:35Z|00207|binding|INFO|Releasing lport eb1a2ef9-449d-4683-9fd1-158cd31219f0 from this chassis (sb_readonly=0) Dec 6 05:20:35 localhost ovn_controller[131684]: 2025-12-06T10:20:35Z|00208|binding|INFO|Setting lport eb1a2ef9-449d-4683-9fd1-158cd31219f0 down in Southbound Dec 6 05:20:35 localhost kernel: device tapeb1a2ef9-44 left promiscuous mode Dec 6 05:20:35 localhost nova_compute[237281]: 2025-12-06 10:20:35.153 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:35.165 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], 
virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=eb1a2ef9-449d-4683-9fd1-158cd31219f0) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:35.167 137259 INFO neutron.agent.ovn.metadata.agent [-] Port eb1a2ef9-449d-4683-9fd1-158cd31219f0 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:20:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:35.168 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:35.169 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e8e661-292f-4302-be21-79186a2eaa3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m 
Dec 6 05:20:35 localhost nova_compute[237281]: 2025-12-06 10:20:35.176 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:35 localhost systemd[1]: var-lib-containers-storage-overlay-7118c2c865c843008cc0773e41ad4c6ef9af2a68978e5cfa88c06efc3cdadd8e-merged.mount: Deactivated successfully. Dec 6 05:20:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d3ff926b40324cd2f41cd5242ddbb0596dbf4d91754b0d48f0640736d0b933a6-userdata-shm.mount: Deactivated successfully. Dec 6 05:20:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:35.690 219384 INFO neutron.agent.dhcp.agent [None req-c866e0a7-8a75-4b71-9e3b-f1c9a7b04658 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:20:35 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. Dec 6 05:20:35 localhost nova_compute[237281]: 2025-12-06 10:20:35.712 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:37 localhost nova_compute[237281]: 2025-12-06 10:20:37.081 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:37.485 219384 INFO neutron.agent.linux.ip_lib [None req-3251aa56-79f8-4a6d-abfe-efad12939db5 - - - - - -] Device tapfe62dcac-e1 cannot be used as it has no MAC address#033[00m Dec 6 05:20:37 localhost nova_compute[237281]: 2025-12-06 10:20:37.550 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:37 localhost kernel: device tapfe62dcac-e1 entered promiscuous mode Dec 6 05:20:37 localhost NetworkManager[5965]: [1765016437.5604] manager: 
(tapfe62dcac-e1): new Generic device (/org/freedesktop/NetworkManager/Devices/38) Dec 6 05:20:37 localhost ovn_controller[131684]: 2025-12-06T10:20:37Z|00209|binding|INFO|Claiming lport fe62dcac-e165-4e15-aaae-61d669e5540e for this chassis. Dec 6 05:20:37 localhost ovn_controller[131684]: 2025-12-06T10:20:37Z|00210|binding|INFO|fe62dcac-e165-4e15-aaae-61d669e5540e: Claiming unknown Dec 6 05:20:37 localhost nova_compute[237281]: 2025-12-06 10:20:37.561 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:37 localhost systemd-udevd[251299]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:20:37 localhost nova_compute[237281]: 2025-12-06 10:20:37.568 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:37 localhost ovn_controller[131684]: 2025-12-06T10:20:37Z|00211|binding|INFO|Setting lport fe62dcac-e165-4e15-aaae-61d669e5540e ovn-installed in OVS Dec 6 05:20:37 localhost ovn_controller[131684]: 2025-12-06T10:20:37Z|00212|binding|INFO|Setting lport fe62dcac-e165-4e15-aaae-61d669e5540e up in Southbound Dec 6 05:20:37 localhost nova_compute[237281]: 2025-12-06 10:20:37.570 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:37.571 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 
'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fe62dcac-e165-4e15-aaae-61d669e5540e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:37 localhost nova_compute[237281]: 2025-12-06 10:20:37.572 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:37.573 137259 INFO neutron.agent.ovn.metadata.agent [-] Port fe62dcac-e165-4e15-aaae-61d669e5540e in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:20:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:37.574 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:37.574 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[45b61cc1-eaf2-42a3-a9d4-11309cb1d1b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 
05:20:37 localhost journal[186952]: ethtool ioctl error on tapfe62dcac-e1: No such device Dec 6 05:20:37 localhost nova_compute[237281]: 2025-12-06 10:20:37.597 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:37 localhost journal[186952]: ethtool ioctl error on tapfe62dcac-e1: No such device Dec 6 05:20:37 localhost journal[186952]: ethtool ioctl error on tapfe62dcac-e1: No such device Dec 6 05:20:37 localhost journal[186952]: ethtool ioctl error on tapfe62dcac-e1: No such device Dec 6 05:20:37 localhost journal[186952]: ethtool ioctl error on tapfe62dcac-e1: No such device Dec 6 05:20:37 localhost journal[186952]: ethtool ioctl error on tapfe62dcac-e1: No such device Dec 6 05:20:37 localhost journal[186952]: ethtool ioctl error on tapfe62dcac-e1: No such device Dec 6 05:20:37 localhost journal[186952]: ethtool ioctl error on tapfe62dcac-e1: No such device Dec 6 05:20:37 localhost nova_compute[237281]: 2025-12-06 10:20:37.635 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:37 localhost nova_compute[237281]: 2025-12-06 10:20:37.658 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:38 localhost nova_compute[237281]: 2025-12-06 10:20:38.414 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:38 localhost podman[251370]: Dec 6 05:20:38 localhost podman[251370]: 2025-12-06 10:20:38.467897911 +0000 UTC m=+0.092321539 container create f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Dec 6 05:20:38 localhost systemd[1]: Started libpod-conmon-f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb.scope. Dec 6 05:20:38 localhost systemd[1]: tmp-crun.si2mIt.mount: Deactivated successfully. Dec 6 05:20:38 localhost podman[251370]: 2025-12-06 10:20:38.423062018 +0000 UTC m=+0.047485626 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:20:38 localhost systemd[1]: Started libcrun container. Dec 6 05:20:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67fdb2894bac801e95a0164c02b241b2904873d6967052101279c35fefb0cc54/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:20:38 localhost podman[251370]: 2025-12-06 10:20:38.539616254 +0000 UTC m=+0.164039862 container init f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:20:38 localhost podman[251370]: 2025-12-06 10:20:38.548525539 +0000 UTC m=+0.172949157 container start f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:20:38 localhost dnsmasq[251389]: started, version 2.85 cachesize 150 Dec 6 05:20:38 localhost dnsmasq[251389]: DNS service limited to local subnets Dec 6 05:20:38 localhost dnsmasq[251389]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:20:38 localhost dnsmasq[251389]: warning: no upstream servers configured Dec 6 05:20:38 localhost dnsmasq-dhcp[251389]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:20:38 localhost dnsmasq[251389]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:38 localhost dnsmasq-dhcp[251389]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:38 localhost dnsmasq-dhcp[251389]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:38 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:38.622 219384 INFO neutron.agent.dhcp.agent [None req-3251aa56-79f8-4a6d-abfe-efad12939db5 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:38Z, description=, device_id=f611570e-7502-454c-8334-ff71ceb67d41, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ed52a7d2-9bf2-4718-bf36-4658ae24be73, ip_allocation=immediate, mac_address=fa:16:3e:2e:58:43, name=, 
network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['5dbe6888-a661-4de3-ab65-c00e1f4e1ef3'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:36Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=False, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1186, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:38Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:20:38 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:38.774 219384 INFO neutron.agent.dhcp.agent [None req-607f7fc4-a490-4ab5-a846-6fd055013f5a - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:20:38 localhost dnsmasq[251389]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:20:38 localhost dnsmasq-dhcp[251389]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:38 localhost podman[251406]: 2025-12-06 10:20:38.809391877 +0000 UTC m=+0.063595743 container kill f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 05:20:38 localhost dnsmasq-dhcp[251389]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:20:38 localhost podman[251420]: 2025-12-06 10:20:38.925794258 +0000 UTC m=+0.089024678 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:20:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:20:38 localhost podman[251420]: 2025-12-06 10:20:38.954754311 +0000 UTC m=+0.117984721 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible) Dec 6 05:20:38 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:20:39 localhost podman[251445]: 2025-12-06 10:20:39.048393721 +0000 UTC m=+0.084857530 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:20:39 localhost podman[251445]: 2025-12-06 10:20:39.060630648 +0000 UTC m=+0.097094477 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:20:39 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:20:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45068 DF PROTO=TCP SPT=35404 DPT=9102 SEQ=405234843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDBC9870000000001030307) Dec 6 05:20:39 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:39.300 219384 INFO neutron.agent.dhcp.agent [None req-8d2dcdec-7686-4551-afd8-adcac676f900 - - - - - -] DHCP configuration for ports {'ed52a7d2-9bf2-4718-bf36-4658ae24be73'} is completed#033[00m Dec 6 05:20:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:40.135 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:38Z, description=, device_id=f611570e-7502-454c-8334-ff71ceb67d41, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ed52a7d2-9bf2-4718-bf36-4658ae24be73, ip_allocation=immediate, mac_address=fa:16:3e:2e:58:43, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, 
id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['5dbe6888-a661-4de3-ab65-c00e1f4e1ef3'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:36Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=False, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1186, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:38Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:20:40 localhost dnsmasq[251389]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:20:40 localhost podman[251481]: 2025-12-06 10:20:40.331952088 +0000 UTC m=+0.061484157 container kill f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:20:40 localhost dnsmasq-dhcp[251389]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:40 localhost dnsmasq-dhcp[251389]: read 
/var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:40 localhost nova_compute[237281]: 2025-12-06 10:20:40.399 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:40.594 219384 INFO neutron.agent.dhcp.agent [None req-ec0b45d5-049e-42cb-af1f-824cfb0f8a3e - - - - - -] DHCP configuration for ports {'ed52a7d2-9bf2-4718-bf36-4658ae24be73'} is completed#033[00m Dec 6 05:20:40 localhost nova_compute[237281]: 2025-12-06 10:20:40.715 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:42 localhost nova_compute[237281]: 2025-12-06 10:20:42.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:20:43 localhost neutron_sriov_agent[212548]: 2025-12-06 10:20:43.361 2 INFO neutron.agent.securitygroups_rpc [None req-f7582a6f-c5bf-4f7b-9650-804f9ad8352c 31b90aa8a6d5449db6ebebe8231dba07 a62b4b216e144e1bab883788cb7a7a11 - - default default] Security group member updated ['6b1ee0ca-85df-403b-a559-f6353d6d9742']#033[00m Dec 6 05:20:43 localhost nova_compute[237281]: 2025-12-06 10:20:43.417 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:43 localhost dnsmasq[251389]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:43 localhost podman[251518]: 2025-12-06 10:20:43.753135544 +0000 UTC m=+0.059099514 container kill f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 6 05:20:43 localhost dnsmasq-dhcp[251389]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:20:43 localhost dnsmasq-dhcp[251389]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:20:43 localhost ovn_controller[131684]: 2025-12-06T10:20:43Z|00213|binding|INFO|Releasing lport fe62dcac-e165-4e15-aaae-61d669e5540e from this chassis (sb_readonly=0) Dec 6 05:20:43 localhost kernel: device tapfe62dcac-e1 left promiscuous mode Dec 6 05:20:43 localhost nova_compute[237281]: 2025-12-06 10:20:43.970 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:43 localhost ovn_controller[131684]: 2025-12-06T10:20:43Z|00214|binding|INFO|Setting lport fe62dcac-e165-4e15-aaae-61d669e5540e down in Southbound Dec 6 05:20:44 localhost nova_compute[237281]: 2025-12-06 10:20:44.003 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:44 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:44.048 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], 
requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fe62dcac-e165-4e15-aaae-61d669e5540e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:44 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:44.050 137259 INFO neutron.agent.ovn.metadata.agent [-] Port fe62dcac-e165-4e15-aaae-61d669e5540e in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:20:44 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:44.052 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:44 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:44.053 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[14c34938-7427-4293-bc14-8b869c64f336]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:20:44 localhost neutron_sriov_agent[212548]: 
2025-12-06 10:20:44.535 2 INFO neutron.agent.securitygroups_rpc [None req-aa6fed3a-f7de-40eb-9f93-042e6aefb4b0 31b90aa8a6d5449db6ebebe8231dba07 a62b4b216e144e1bab883788cb7a7a11 - - default default] Security group member updated ['6b1ee0ca-85df-403b-a559-f6353d6d9742']#033[00m Dec 6 05:20:45 localhost nova_compute[237281]: 2025-12-06 10:20:45.718 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:45 localhost dnsmasq[251389]: exiting on receipt of SIGTERM Dec 6 05:20:45 localhost podman[251558]: 2025-12-06 10:20:45.883185437 +0000 UTC m=+0.062265462 container kill f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:20:45 localhost systemd[1]: libpod-f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb.scope: Deactivated successfully. 
Dec 6 05:20:45 localhost nova_compute[237281]: 2025-12-06 10:20:45.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:20:45 localhost podman[251573]: 2025-12-06 10:20:45.954358843 +0000 UTC m=+0.054790222 container died f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:20:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb-userdata-shm.mount: Deactivated successfully. Dec 6 05:20:45 localhost systemd[1]: var-lib-containers-storage-overlay-67fdb2894bac801e95a0164c02b241b2904873d6967052101279c35fefb0cc54-merged.mount: Deactivated successfully. 
Dec 6 05:20:45 localhost podman[251573]: 2025-12-06 10:20:45.977303571 +0000 UTC m=+0.077734920 container cleanup f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:20:45 localhost systemd[1]: libpod-conmon-f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb.scope: Deactivated successfully. Dec 6 05:20:46 localhost podman[251574]: 2025-12-06 10:20:46.019299566 +0000 UTC m=+0.114591686 container remove f00cac17d383effc93456efb18d523ba574e2e4ccb3029c547e6caa234ebf1eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 6 05:20:46 localhost openstack_network_exporter[199751]: ERROR 10:20:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:20:46 localhost openstack_network_exporter[199751]: ERROR 10:20:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:20:46 localhost openstack_network_exporter[199751]: ERROR 10:20:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd 
Dec 6 05:20:46 localhost openstack_network_exporter[199751]: ERROR 10:20:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:20:46 localhost openstack_network_exporter[199751]: Dec 6 05:20:46 localhost openstack_network_exporter[199751]: ERROR 10:20:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:20:46 localhost openstack_network_exporter[199751]: Dec 6 05:20:46 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:46.313 219384 INFO neutron.agent.dhcp.agent [None req-73e754e6-e9be-4286-b307-f5181e57a60d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:20:46 localhost ovn_controller[131684]: 2025-12-06T10:20:46Z|00215|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:20:46 localhost nova_compute[237281]: 2025-12-06 10:20:46.708 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:46 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. Dec 6 05:20:46 localhost nova_compute[237281]: 2025-12-06 10:20:46.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:20:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:20:47 localhost systemd[1]: tmp-crun.nSI7fW.mount: Deactivated successfully. 
Dec 6 05:20:47 localhost podman[251600]: 2025-12-06 10:20:47.551833306 +0000 UTC m=+0.088821022 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64) Dec 6 05:20:47 localhost podman[251600]: 2025-12-06 10:20:47.56332883 +0000 UTC m=+0.100316566 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, vcs-type=git, 
build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, config_id=edpm, version=9.6, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 6 05:20:47 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:20:48 localhost nova_compute[237281]: 2025-12-06 10:20:48.457 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:49 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:49.371 219384 INFO neutron.agent.linux.ip_lib [None req-8934c285-d237-4f54-b969-a8dc37408e6c - - - - - -] Device tapf6f13e22-c8 cannot be used as it has no MAC address#033[00m Dec 6 05:20:49 localhost nova_compute[237281]: 2025-12-06 10:20:49.396 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:49 localhost kernel: device tapf6f13e22-c8 entered promiscuous mode Dec 6 05:20:49 localhost NetworkManager[5965]: [1765016449.4033] manager: (tapf6f13e22-c8): new Generic device (/org/freedesktop/NetworkManager/Devices/39) Dec 6 05:20:49 localhost ovn_controller[131684]: 2025-12-06T10:20:49Z|00216|binding|INFO|Claiming lport f6f13e22-c849-4b15-80a6-34b7dc55e6f6 for this chassis. Dec 6 05:20:49 localhost nova_compute[237281]: 2025-12-06 10:20:49.402 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:49 localhost systemd-udevd[251630]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:20:49 localhost ovn_controller[131684]: 2025-12-06T10:20:49Z|00217|binding|INFO|f6f13e22-c849-4b15-80a6-34b7dc55e6f6: Claiming unknown Dec 6 05:20:49 localhost journal[186952]: ethtool ioctl error on tapf6f13e22-c8: No such device Dec 6 05:20:49 localhost ovn_controller[131684]: 2025-12-06T10:20:49Z|00218|binding|INFO|Setting lport f6f13e22-c849-4b15-80a6-34b7dc55e6f6 ovn-installed in OVS Dec 6 05:20:49 localhost nova_compute[237281]: 2025-12-06 10:20:49.447 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:49 localhost journal[186952]: ethtool ioctl error on tapf6f13e22-c8: No such device Dec 6 05:20:49 localhost journal[186952]: ethtool ioctl error on tapf6f13e22-c8: No such device Dec 6 05:20:49 localhost journal[186952]: ethtool ioctl error on tapf6f13e22-c8: No such device Dec 6 05:20:49 localhost ovn_controller[131684]: 2025-12-06T10:20:49Z|00219|binding|INFO|Setting lport f6f13e22-c849-4b15-80a6-34b7dc55e6f6 up in Southbound Dec 6 05:20:49 localhost journal[186952]: ethtool ioctl error on tapf6f13e22-c8: No such device Dec 6 05:20:49 localhost journal[186952]: ethtool ioctl error on tapf6f13e22-c8: No such device Dec 6 05:20:49 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:49.476 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f6f13e22-c849-4b15-80a6-34b7dc55e6f6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:49 localhost journal[186952]: ethtool ioctl error on tapf6f13e22-c8: No such device Dec 6 05:20:49 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:49.480 137259 INFO neutron.agent.ovn.metadata.agent [-] Port f6f13e22-c849-4b15-80a6-34b7dc55e6f6 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:20:49 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:49.483 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:49 localhost journal[186952]: ethtool ioctl error on tapf6f13e22-c8: No such device Dec 6 05:20:49 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:49.485 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[8649d490-660e-42a1-a38b-028fd42eddf6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:20:49 localhost nova_compute[237281]: 2025-12-06 10:20:49.489 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 
05:20:49 localhost nova_compute[237281]: 2025-12-06 10:20:49.560 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:49 localhost nova_compute[237281]: 2025-12-06 10:20:49.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:20:49 localhost nova_compute[237281]: 2025-12-06 10:20:49.884 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:20:50 localhost podman[251701]: Dec 6 05:20:50 localhost podman[251701]: 2025-12-06 10:20:50.371343879 +0000 UTC m=+0.094514127 container create 4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125) Dec 6 05:20:50 localhost systemd[1]: Started libpod-conmon-4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118.scope. Dec 6 05:20:50 localhost podman[251701]: 2025-12-06 10:20:50.324514324 +0000 UTC m=+0.047684612 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:20:50 localhost systemd[1]: Started libcrun container. 
Dec 6 05:20:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44f592a84a64837b0c68a11517c1cf42588b99472c5236f7c237c338d4a9df85/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:20:50 localhost podman[251701]: 2025-12-06 10:20:50.442489453 +0000 UTC m=+0.165659701 container init 4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:20:50 localhost podman[251701]: 2025-12-06 10:20:50.451943115 +0000 UTC m=+0.175113363 container start 4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:20:50 localhost dnsmasq[251719]: started, version 2.85 cachesize 150 Dec 6 05:20:50 localhost dnsmasq[251719]: DNS service limited to local subnets Dec 6 05:20:50 localhost dnsmasq[251719]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:20:50 localhost dnsmasq[251719]: warning: no upstream servers configured Dec 
6 05:20:50 localhost dnsmasq[251719]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:50 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:50.510 219384 INFO neutron.agent.dhcp.agent [None req-8934c285-d237-4f54-b969-a8dc37408e6c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:49Z, description=, device_id=7d068536-7d52-44cb-940e-0004d4b1d326, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=26a9cd30-9a43-44fb-a544-507fb445212e, ip_allocation=immediate, mac_address=fa:16:3e:4c:a0:b2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['53599b9f-0482-4148-b4ad-ca9877dc707b'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:48Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=False, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1222, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:50Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:20:50 
localhost dnsmasq[251719]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:20:50 localhost podman[251738]: 2025-12-06 10:20:50.709987295 +0000 UTC m=+0.059294700 container kill 4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Dec 6 05:20:50 localhost nova_compute[237281]: 2025-12-06 10:20:50.720 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:50 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:50.739 219384 INFO neutron.agent.dhcp.agent [None req-6120fd65-38e3-4d03-a688-3ff5b2ae35fb - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:20:51 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:51.015 219384 INFO neutron.agent.dhcp.agent [None req-a504e03c-ad7f-4a84-b62c-7c08093261be - - - - - -] DHCP configuration for ports {'26a9cd30-9a43-44fb-a544-507fb445212e'} is completed#033[00m Dec 6 05:20:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:20:51 localhost podman[251760]: 2025-12-06 10:20:51.2948835 +0000 UTC m=+0.080689920 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:20:51 localhost podman[251760]: 2025-12-06 10:20:51.305208949 +0000 UTC m=+0.091015419 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:20:51 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:20:51 localhost nova_compute[237281]: 2025-12-06 10:20:51.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:20:51 localhost nova_compute[237281]: 2025-12-06 10:20:51.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:20:52 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:52.695 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:49Z, description=, device_id=7d068536-7d52-44cb-940e-0004d4b1d326, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=26a9cd30-9a43-44fb-a544-507fb445212e, ip_allocation=immediate, mac_address=fa:16:3e:4c:a0:b2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['53599b9f-0482-4148-b4ad-ca9877dc707b'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:48Z, 
vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=False, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1222, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:20:50Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:20:52 localhost dnsmasq[251719]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:20:52 localhost podman[251802]: 2025-12-06 10:20:52.85657481 +0000 UTC m=+0.038474069 container kill 4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:20:52 localhost systemd[1]: tmp-crun.CYzaU0.mount: Deactivated successfully. 
Dec 6 05:20:53 localhost nova_compute[237281]: 2025-12-06 10:20:53.124 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:53 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:53.137 219384 INFO neutron.agent.dhcp.agent [None req-b8b1a007-c17b-4d1a-abf5-9ac2a103ed72 - - - - - -] DHCP configuration for ports {'26a9cd30-9a43-44fb-a544-507fb445212e'} is completed#033[00m Dec 6 05:20:53 localhost podman[197801]: time="2025-12-06T10:20:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:20:53 localhost podman[197801]: @ - - [06/Dec/2025:10:20:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 145869 "" "Go-http-client/1.1" Dec 6 05:20:53 localhost podman[197801]: @ - - [06/Dec/2025:10:20:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16430 "" "Go-http-client/1.1" Dec 6 05:20:53 localhost nova_compute[237281]: 2025-12-06 10:20:53.459 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4883 DF PROTO=TCP SPT=36284 DPT=9102 SEQ=1379431737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC03790000000001030307) Dec 6 05:20:54 localhost nova_compute[237281]: 2025-12-06 10:20:54.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:20:54 localhost nova_compute[237281]: 2025-12-06 10:20:54.886 237285 DEBUG 
nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:20:54 localhost nova_compute[237281]: 2025-12-06 10:20:54.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:20:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4884 DF PROTO=TCP SPT=36284 DPT=9102 SEQ=1379431737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC07870000000001030307) Dec 6 05:20:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45069 DF PROTO=TCP SPT=35404 DPT=9102 SEQ=405234843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC09870000000001030307) Dec 6 05:20:55 localhost nova_compute[237281]: 2025-12-06 10:20:55.701 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:20:55 localhost nova_compute[237281]: 2025-12-06 10:20:55.701 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:20:55 localhost nova_compute[237281]: 2025-12-06 10:20:55.702 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: 
a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:20:55 localhost nova_compute[237281]: 2025-12-06 10:20:55.702 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:20:55 localhost nova_compute[237281]: 2025-12-06 10:20:55.722 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:56 localhost dnsmasq[251719]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:20:56 localhost systemd[1]: tmp-crun.0pWvnq.mount: Deactivated successfully. Dec 6 05:20:56 localhost podman[251839]: 2025-12-06 10:20:56.108757031 +0000 UTC m=+0.060770326 container kill 4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:20:56 localhost ovn_controller[131684]: 2025-12-06T10:20:56Z|00220|binding|INFO|Releasing lport f6f13e22-c849-4b15-80a6-34b7dc55e6f6 from this chassis (sb_readonly=0) Dec 6 05:20:56 localhost ovn_controller[131684]: 2025-12-06T10:20:56Z|00221|binding|INFO|Setting lport f6f13e22-c849-4b15-80a6-34b7dc55e6f6 down in Southbound Dec 6 05:20:56 localhost nova_compute[237281]: 2025-12-06 
10:20:56.295 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:56 localhost kernel: device tapf6f13e22-c8 left promiscuous mode Dec 6 05:20:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:56.306 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f6f13e22-c849-4b15-80a6-34b7dc55e6f6) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:20:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:56.308 137259 INFO neutron.agent.ovn.metadata.agent [-] Port f6f13e22-c849-4b15-80a6-34b7dc55e6f6 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:20:56 localhost 
ovn_metadata_agent[137254]: 2025-12-06 10:20:56.310 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f47279f6-9d96-4d9c-849b-5ff8c250556a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:20:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:20:56.311 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[6c241ab7-085b-45ea-a0f0-9d7fdbcb598e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:20:56 localhost nova_compute[237281]: 2025-12-06 10:20:56.321 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:56 localhost ovn_controller[131684]: 2025-12-06T10:20:56Z|00222|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:20:56 localhost nova_compute[237281]: 2025-12-06 10:20:56.646 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4885 DF PROTO=TCP SPT=36284 DPT=9102 SEQ=1379431737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC0F870000000001030307) Dec 6 05:20:57 localhost dnsmasq[251719]: exiting on receipt of SIGTERM Dec 6 05:20:57 localhost podman[251877]: 2025-12-06 10:20:57.843455527 +0000 UTC m=+0.059803927 container kill 4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:20:57 localhost systemd[1]: libpod-4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118.scope: Deactivated successfully. Dec 6 05:20:57 localhost podman[251890]: 2025-12-06 10:20:57.909131402 +0000 UTC m=+0.054597534 container died 4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:20:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:20:57 localhost podman[251890]: 2025-12-06 10:20:57.951894872 +0000 UTC m=+0.097360964 container cleanup 4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:20:57 localhost systemd[1]: libpod-conmon-4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118.scope: Deactivated successfully. Dec 6 05:20:58 localhost podman[251897]: 2025-12-06 10:20:58.004227996 +0000 UTC m=+0.134541381 container remove 4106b097d6eb76a9f173828ade5ac840db784b1353157ca191770405c57ad118 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:20:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1762 DF PROTO=TCP SPT=49152 DPT=9102 SEQ=609209820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC13870000000001030307) Dec 6 05:20:58 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:20:58.414 219384 INFO neutron.agent.dhcp.agent [None req-e7de568a-ca42-481d-89ce-40ca56a52ea8 - - - - - -] Network not 
present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:20:58 localhost nova_compute[237281]: 2025-12-06 10:20:58.495 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:20:58 localhost systemd[1]: var-lib-containers-storage-overlay-44f592a84a64837b0c68a11517c1cf42588b99472c5236f7c237c338d4a9df85-merged.mount: Deactivated successfully. Dec 6 05:20:58 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. Dec 6 05:21:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:21:00 localhost podman[251920]: 2025-12-06 10:21:00.54916449 +0000 UTC m=+0.080728412 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:00 localhost podman[251920]: 2025-12-06 10:21:00.61335455 +0000 UTC m=+0.144918442 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller) Dec 6 05:21:00 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:21:00 localhost nova_compute[237281]: 2025-12-06 10:21:00.725 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4886 DF PROTO=TCP SPT=36284 DPT=9102 SEQ=1379431737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC1F470000000001030307) Dec 6 05:21:01 localhost nova_compute[237281]: 2025-12-06 10:21:01.918 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, 
"devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:21:01 localhost nova_compute[237281]: 2025-12-06 10:21:01.939 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:21:01 localhost nova_compute[237281]: 2025-12-06 10:21:01.940 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:21:01 localhost nova_compute[237281]: 2025-12-06 10:21:01.940 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:21:01 localhost nova_compute[237281]: 2025-12-06 10:21:01.941 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.046 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.047 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.047 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.048 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.136 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.210 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.211 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.282 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.283 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.359 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.361 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.434 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.630 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.631 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12323MB free_disk=387.2666015625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.631 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.632 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.665 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.728 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.728 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.728 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.833 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.853 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:21:02 localhost 
nova_compute[237281]: 2025-12-06 10:21:02.855 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:21:02 localhost nova_compute[237281]: 2025-12-06 10:21:02.856 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:21:03 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:03.420 219384 INFO neutron.agent.linux.ip_lib [None req-467a1cee-8c33-429f-bc00-f8a3eabc2fd2 - - - - - -] Device tapa96361b7-ed cannot be used as it has no MAC address#033[00m Dec 6 05:21:03 localhost nova_compute[237281]: 2025-12-06 10:21:03.478 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:03 localhost kernel: device tapa96361b7-ed entered promiscuous mode Dec 6 05:21:03 localhost NetworkManager[5965]: [1765016463.4863] manager: (tapa96361b7-ed): new Generic device (/org/freedesktop/NetworkManager/Devices/40) Dec 6 05:21:03 localhost ovn_controller[131684]: 2025-12-06T10:21:03Z|00223|binding|INFO|Claiming lport a96361b7-edd6-42bd-9bdd-478eb48d3d85 for this chassis. 
Dec 6 05:21:03 localhost ovn_controller[131684]: 2025-12-06T10:21:03Z|00224|binding|INFO|a96361b7-edd6-42bd-9bdd-478eb48d3d85: Claiming unknown Dec 6 05:21:03 localhost nova_compute[237281]: 2025-12-06 10:21:03.488 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:03 localhost systemd-udevd[251967]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:21:03 localhost nova_compute[237281]: 2025-12-06 10:21:03.497 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:03 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:03.499 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe86:98b/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=a96361b7-edd6-42bd-9bdd-478eb48d3d85) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:03 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:03.501 137259 INFO neutron.agent.ovn.metadata.agent [-] Port a96361b7-edd6-42bd-9bdd-478eb48d3d85 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:21:03 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:03.505 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port ac80fb78-2239-486e-8337-d77631e5f653 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:21:03 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:03.506 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:03 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:03.508 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[41a0ea91-f82f-4bd2-b5f1-c2779eb5f799]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:03 localhost journal[186952]: ethtool ioctl error on tapa96361b7-ed: No such device Dec 6 05:21:03 localhost ovn_controller[131684]: 2025-12-06T10:21:03Z|00225|binding|INFO|Setting lport a96361b7-edd6-42bd-9bdd-478eb48d3d85 ovn-installed in OVS Dec 6 05:21:03 localhost ovn_controller[131684]: 2025-12-06T10:21:03Z|00226|binding|INFO|Setting lport a96361b7-edd6-42bd-9bdd-478eb48d3d85 up in Southbound Dec 6 05:21:03 localhost nova_compute[237281]: 2025-12-06 10:21:03.523 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:03 
localhost journal[186952]: ethtool ioctl error on tapa96361b7-ed: No such device Dec 6 05:21:03 localhost journal[186952]: ethtool ioctl error on tapa96361b7-ed: No such device Dec 6 05:21:03 localhost journal[186952]: ethtool ioctl error on tapa96361b7-ed: No such device Dec 6 05:21:03 localhost journal[186952]: ethtool ioctl error on tapa96361b7-ed: No such device Dec 6 05:21:03 localhost journal[186952]: ethtool ioctl error on tapa96361b7-ed: No such device Dec 6 05:21:03 localhost journal[186952]: ethtool ioctl error on tapa96361b7-ed: No such device Dec 6 05:21:03 localhost journal[186952]: ethtool ioctl error on tapa96361b7-ed: No such device Dec 6 05:21:03 localhost nova_compute[237281]: 2025-12-06 10:21:03.567 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:03 localhost nova_compute[237281]: 2025-12-06 10:21:03.589 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:04 localhost podman[252038]: Dec 6 05:21:04 localhost podman[252038]: 2025-12-06 10:21:04.367026422 +0000 UTC m=+0.094193747 container create c0a9b6cc718002873ca9aee8d57488f0f96b754b0e2172d97eb186519f9c05df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 6 05:21:04 localhost systemd[1]: Started libpod-conmon-c0a9b6cc718002873ca9aee8d57488f0f96b754b0e2172d97eb186519f9c05df.scope. 
Dec 6 05:21:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:21:04 localhost systemd[1]: Started libcrun container. Dec 6 05:21:04 localhost podman[252038]: 2025-12-06 10:21:04.319361172 +0000 UTC m=+0.046528497 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:21:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/30a8e14b370e6c2feba5df5b9a6e1b77a15e553145999835ded092a092165301/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:21:04 localhost podman[252038]: 2025-12-06 10:21:04.434324858 +0000 UTC m=+0.161492183 container init c0a9b6cc718002873ca9aee8d57488f0f96b754b0e2172d97eb186519f9c05df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Dec 6 05:21:04 localhost podman[252038]: 2025-12-06 10:21:04.444002417 +0000 UTC m=+0.171169752 container start c0a9b6cc718002873ca9aee8d57488f0f96b754b0e2172d97eb186519f9c05df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125) Dec 6 05:21:04 localhost dnsmasq[252065]: 
started, version 2.85 cachesize 150 Dec 6 05:21:04 localhost dnsmasq[252065]: DNS service limited to local subnets Dec 6 05:21:04 localhost dnsmasq[252065]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:21:04 localhost dnsmasq[252065]: warning: no upstream servers configured Dec 6 05:21:04 localhost dnsmasq[252065]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:21:04 localhost podman[252054]: 2025-12-06 10:21:04.503004047 +0000 UTC m=+0.085951983 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:04 localhost podman[252054]: 2025-12-06 10:21:04.518230247 +0000 UTC m=+0.101178183 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:04 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:21:04 localhost podman[252068]: 2025-12-06 10:21:04.604952211 +0000 UTC m=+0.130722984 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:21:04 localhost podman[252068]: 2025-12-06 10:21:04.610369859 +0000 UTC m=+0.136140651 container exec_died 
4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:21:04 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:21:04 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:04.672 219384 INFO neutron.agent.dhcp.agent [None req-4c0d4148-626f-4942-8648-f3ab2432445e - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:21:04 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:04.758 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:91:81 10.100.0.2 2001:db8::f816:3eff:fe4e:9181'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3ba22961-ba53-4fab-b867-7a59008889f5) old=Port_Binding(mac=['fa:16:3e:4e:91:81 2001:db8::f816:3eff:fe4e:9181'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 
'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:04 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:04.761 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3ba22961-ba53-4fab-b867-7a59008889f5 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a updated#033[00m Dec 6 05:21:04 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:04.763 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port ac80fb78-2239-486e-8337-d77631e5f653 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:21:04 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:04.763 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:04 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:04.764 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[05fb21a0-2399-4914-a241-a2da148bad47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:04 localhost dnsmasq[252065]: exiting on receipt of SIGTERM Dec 6 05:21:04 localhost podman[252118]: 2025-12-06 10:21:04.837369992 +0000 UTC m=+0.060290921 container kill c0a9b6cc718002873ca9aee8d57488f0f96b754b0e2172d97eb186519f9c05df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS 
Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:21:04 localhost systemd[1]: libpod-c0a9b6cc718002873ca9aee8d57488f0f96b754b0e2172d97eb186519f9c05df.scope: Deactivated successfully. Dec 6 05:21:04 localhost podman[252132]: 2025-12-06 10:21:04.910212489 +0000 UTC m=+0.056795843 container died c0a9b6cc718002873ca9aee8d57488f0f96b754b0e2172d97eb186519f9c05df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:21:04 localhost podman[252132]: 2025-12-06 10:21:04.940796493 +0000 UTC m=+0.087379817 container cleanup c0a9b6cc718002873ca9aee8d57488f0f96b754b0e2172d97eb186519f9c05df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:04 localhost systemd[1]: libpod-conmon-c0a9b6cc718002873ca9aee8d57488f0f96b754b0e2172d97eb186519f9c05df.scope: Deactivated successfully. 
Dec 6 05:21:04 localhost podman[252133]: 2025-12-06 10:21:04.989084733 +0000 UTC m=+0.128255748 container remove c0a9b6cc718002873ca9aee8d57488f0f96b754b0e2172d97eb186519f9c05df (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125) Dec 6 05:21:05 localhost systemd[1]: tmp-crun.3tLWIW.mount: Deactivated successfully. Dec 6 05:21:05 localhost systemd[1]: var-lib-containers-storage-overlay-30a8e14b370e6c2feba5df5b9a6e1b77a15e553145999835ded092a092165301-merged.mount: Deactivated successfully. Dec 6 05:21:05 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c0a9b6cc718002873ca9aee8d57488f0f96b754b0e2172d97eb186519f9c05df-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:21:05 localhost nova_compute[237281]: 2025-12-06 10:21:05.754 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:05 localhost nova_compute[237281]: 2025-12-06 10:21:05.946 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:06 localhost podman[252215]: Dec 6 05:21:06 localhost podman[252215]: 2025-12-06 10:21:06.475631003 +0000 UTC m=+0.094716213 container create bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:21:06 localhost systemd[1]: Started libpod-conmon-bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b.scope. Dec 6 05:21:06 localhost podman[252215]: 2025-12-06 10:21:06.430773069 +0000 UTC m=+0.049858299 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:21:06 localhost systemd[1]: Started libcrun container. 
Dec 6 05:21:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/319e26b4cb81b78706272b3be148ef78865ccaf816a91bd2044285d44ae9d6b4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:21:06 localhost podman[252215]: 2025-12-06 10:21:06.55332449 +0000 UTC m=+0.172409710 container init bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 05:21:06 localhost podman[252215]: 2025-12-06 10:21:06.561432361 +0000 UTC m=+0.180517561 container start bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0) Dec 6 05:21:06 localhost dnsmasq[252234]: started, version 2.85 cachesize 150 Dec 6 05:21:06 localhost dnsmasq[252234]: DNS service limited to local subnets Dec 6 05:21:06 localhost dnsmasq[252234]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:21:06 localhost dnsmasq[252234]: warning: no upstream servers configured Dec 
6 05:21:06 localhost dnsmasq-dhcp[252234]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:21:06 localhost dnsmasq[252234]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:06 localhost dnsmasq-dhcp[252234]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:06 localhost dnsmasq-dhcp[252234]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:06 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:06.571 2 INFO neutron.agent.securitygroups_rpc [None req-a1805b01-0f93-447d-8bef-3a61b09040f0 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:21:06 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:06.644 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:06Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=764fb1fd-9dc7-4021-b023-b88970f54c49, ip_allocation=immediate, mac_address=fa:16:3e:d8:36:bf, name=tempest-NetworksTestDHCPv6-452012037, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=23, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, 
subnets=['3018cc39-ce2a-4bec-b2d0-797e3f232c57', 'e967c349-79fb-41d8-8f1a-7bdfae0b9b93'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:21:03Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1288, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:21:06Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:21:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:06.705 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:21:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:06.706 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:21:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:06.707 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:21:06 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:06.866 219384 INFO neutron.agent.dhcp.agent [None req-8a943a0c-505d-42ae-8b2e-280915fef7b0 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', 'a96361b7-edd6-42bd-9bdd-478eb48d3d85'} is completed#033[00m Dec 6 05:21:06 localhost dnsmasq[252234]: read 
/var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 2 addresses Dec 6 05:21:06 localhost dnsmasq-dhcp[252234]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:06 localhost dnsmasq-dhcp[252234]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:06 localhost podman[252253]: 2025-12-06 10:21:06.95492827 +0000 UTC m=+0.064324966 container kill bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:07 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:07.220 219384 INFO neutron.agent.dhcp.agent [None req-79684d04-a658-48f4-8d8b-28869144ef1d - - - - - -] DHCP configuration for ports {'764fb1fd-9dc7-4021-b023-b88970f54c49'} is completed#033[00m Dec 6 05:21:07 localhost ovn_controller[131684]: 2025-12-06T10:21:07Z|00227|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:21:07 localhost nova_compute[237281]: 2025-12-06 10:21:07.758 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:08 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:08.496 2 INFO neutron.agent.securitygroups_rpc [None req-b030e3e4-2fc9-422a-8235-0be7d6761eb8 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:21:08 localhost 
nova_compute[237281]: 2025-12-06 10:21:08.499 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:08 localhost dnsmasq[252234]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:08 localhost dnsmasq-dhcp[252234]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:08 localhost podman[252292]: 2025-12-06 10:21:08.742962773 +0000 UTC m=+0.063280064 container kill bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:21:08 localhost dnsmasq-dhcp[252234]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:08 localhost ovn_controller[131684]: 2025-12-06T10:21:08Z|00228|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:21:08 localhost nova_compute[237281]: 2025-12-06 10:21:08.897 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:09.035 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 
'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:09.036 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:21:09 localhost nova_compute[237281]: 2025-12-06 10:21:09.037 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4887 DF PROTO=TCP SPT=36284 DPT=9102 SEQ=1379431737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC3F870000000001030307) Dec 6 05:21:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:21:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:21:09 localhost podman[252315]: 2025-12-06 10:21:09.552694413 +0000 UTC m=+0.087299965 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125) Dec 6 05:21:09 localhost podman[252315]: 2025-12-06 10:21:09.561209356 +0000 UTC m=+0.095814928 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 05:21:09 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:21:09 localhost podman[252316]: 2025-12-06 10:21:09.606181113 +0000 UTC m=+0.135826082 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible) Dec 6 05:21:09 localhost podman[252316]: 2025-12-06 10:21:09.643604907 +0000 UTC m=+0.173249856 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:09 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:21:10 localhost dnsmasq[252234]: exiting on receipt of SIGTERM Dec 6 05:21:10 localhost podman[252371]: 2025-12-06 10:21:10.018359049 +0000 UTC m=+0.038197650 container kill bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 6 05:21:10 localhost systemd[1]: libpod-bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b.scope: Deactivated successfully. 
Dec 6 05:21:10 localhost podman[252384]: 2025-12-06 10:21:10.062288604 +0000 UTC m=+0.036396604 container died bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:10 localhost systemd[1]: tmp-crun.L0e2YB.mount: Deactivated successfully. Dec 6 05:21:10 localhost podman[252384]: 2025-12-06 10:21:10.089173473 +0000 UTC m=+0.063281413 container cleanup bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:21:10 localhost systemd[1]: libpod-conmon-bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b.scope: Deactivated successfully. 
Dec 6 05:21:10 localhost podman[252387]: 2025-12-06 10:21:10.148038719 +0000 UTC m=+0.113885525 container remove bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:10 localhost nova_compute[237281]: 2025-12-06 10:21:10.757 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:10 localhost podman[252461]: Dec 6 05:21:10 localhost podman[252461]: 2025-12-06 10:21:10.911377239 +0000 UTC m=+0.087714198 container create 92d9d83b1594f992a5351c7c1db3cea51e142032901162a0d333d91510ce77b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2) Dec 6 05:21:10 localhost systemd[1]: Started libpod-conmon-92d9d83b1594f992a5351c7c1db3cea51e142032901162a0d333d91510ce77b4.scope. Dec 6 05:21:10 localhost systemd[1]: Started libcrun container. 
Dec 6 05:21:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea39e04a8ba9d8c1cea5be05bb97c0eba072d9d0755a6526549ad20e47b9d207/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:21:10 localhost podman[252461]: 2025-12-06 10:21:10.869759605 +0000 UTC m=+0.046096584 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:21:10 localhost podman[252461]: 2025-12-06 10:21:10.971921026 +0000 UTC m=+0.148257985 container init 92d9d83b1594f992a5351c7c1db3cea51e142032901162a0d333d91510ce77b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 6 05:21:10 localhost podman[252461]: 2025-12-06 10:21:10.980861482 +0000 UTC m=+0.157198441 container start 92d9d83b1594f992a5351c7c1db3cea51e142032901162a0d333d91510ce77b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:10 localhost dnsmasq[252480]: started, version 2.85 cachesize 150 Dec 6 05:21:10 localhost dnsmasq[252480]: DNS service limited to local subnets Dec 6 05:21:10 localhost dnsmasq[252480]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:21:10 localhost dnsmasq[252480]: warning: no upstream servers configured Dec 6 05:21:10 localhost dnsmasq[252480]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:11 localhost systemd[1]: var-lib-containers-storage-overlay-319e26b4cb81b78706272b3be148ef78865ccaf816a91bd2044285d44ae9d6b4-merged.mount: Deactivated successfully. Dec 6 05:21:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bcfc8f1af7c6de11d062cdc675a1616b064904c78b7d7a14016830fea0f60f7b-userdata-shm.mount: Deactivated successfully. Dec 6 05:21:11 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:11.209 219384 INFO neutron.agent.linux.ip_lib [None req-0b70bfe4-a7a4-47fe-9e76-c1ab3476a9cb - - - - - -] Device tapd644ff8b-ad cannot be used as it has no MAC address#033[00m Dec 6 05:21:11 localhost nova_compute[237281]: 2025-12-06 10:21:11.233 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:11 localhost kernel: device tapd644ff8b-ad entered promiscuous mode Dec 6 05:21:11 localhost NetworkManager[5965]: [1765016471.2429] manager: (tapd644ff8b-ad): new Generic device (/org/freedesktop/NetworkManager/Devices/41) Dec 6 05:21:11 localhost nova_compute[237281]: 2025-12-06 10:21:11.243 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:11 localhost ovn_controller[131684]: 2025-12-06T10:21:11Z|00229|binding|INFO|Claiming lport d644ff8b-ad4c-42f7-9705-c4a493964de5 for this chassis. 
Dec 6 05:21:11 localhost ovn_controller[131684]: 2025-12-06T10:21:11Z|00230|binding|INFO|d644ff8b-ad4c-42f7-9705-c4a493964de5: Claiming unknown Dec 6 05:21:11 localhost systemd-udevd[252508]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:21:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:11.253 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2d28aee19a34a88ba22a95c1e6f9ff4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ebf0c687-30ca-4f18-95f7-e1d37c56c3db, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d644ff8b-ad4c-42f7-9705-c4a493964de5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:11.255 137259 INFO neutron.agent.ovn.metadata.agent [-] Port d644ff8b-ad4c-42f7-9705-c4a493964de5 in datapath c61da4f7-8a8f-4ebf-ae35-82f9bd37360a bound to our chassis#033[00m Dec 6 05:21:11 localhost ovn_metadata_agent[137254]: 2025-12-06 
10:21:11.257 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c61da4f7-8a8f-4ebf-ae35-82f9bd37360a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:21:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:11.265 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[afbcce36-b44b-4e1d-a83e-1fb9766e3365]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:11 localhost ovn_controller[131684]: 2025-12-06T10:21:11Z|00231|binding|INFO|Setting lport d644ff8b-ad4c-42f7-9705-c4a493964de5 ovn-installed in OVS Dec 6 05:21:11 localhost ovn_controller[131684]: 2025-12-06T10:21:11Z|00232|binding|INFO|Setting lport d644ff8b-ad4c-42f7-9705-c4a493964de5 up in Southbound Dec 6 05:21:11 localhost nova_compute[237281]: 2025-12-06 10:21:11.292 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:11 localhost dnsmasq[252480]: exiting on receipt of SIGTERM Dec 6 05:21:11 localhost podman[252505]: 2025-12-06 10:21:11.328878929 +0000 UTC m=+0.076325666 container kill 92d9d83b1594f992a5351c7c1db3cea51e142032901162a0d333d91510ce77b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:21:11 localhost nova_compute[237281]: 2025-12-06 10:21:11.328 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:11 localhost systemd[1]: libpod-92d9d83b1594f992a5351c7c1db3cea51e142032901162a0d333d91510ce77b4.scope: Deactivated successfully. Dec 6 05:21:11 localhost nova_compute[237281]: 2025-12-06 10:21:11.355 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:11 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:11.381 219384 INFO neutron.agent.dhcp.agent [None req-7d354526-bd56-44da-8612-14f29173bccb - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', 'a96361b7-edd6-42bd-9bdd-478eb48d3d85'} is completed#033[00m Dec 6 05:21:11 localhost podman[252527]: 2025-12-06 10:21:11.394901795 +0000 UTC m=+0.056547565 container died 92d9d83b1594f992a5351c7c1db3cea51e142032901162a0d333d91510ce77b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:11 localhost podman[252527]: 2025-12-06 10:21:11.430785992 +0000 UTC m=+0.092431712 container cleanup 92d9d83b1594f992a5351c7c1db3cea51e142032901162a0d333d91510ce77b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:21:11 localhost systemd[1]: libpod-conmon-92d9d83b1594f992a5351c7c1db3cea51e142032901162a0d333d91510ce77b4.scope: Deactivated successfully. Dec 6 05:21:11 localhost podman[252536]: 2025-12-06 10:21:11.467874466 +0000 UTC m=+0.115837184 container remove 92d9d83b1594f992a5351c7c1db3cea51e142032901162a0d333d91510ce77b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:21:11 localhost nova_compute[237281]: 2025-12-06 10:21:11.480 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:11 localhost ovn_controller[131684]: 2025-12-06T10:21:11Z|00233|binding|INFO|Releasing lport a96361b7-edd6-42bd-9bdd-478eb48d3d85 from this chassis (sb_readonly=0) Dec 6 05:21:11 localhost ovn_controller[131684]: 2025-12-06T10:21:11Z|00234|binding|INFO|Setting lport a96361b7-edd6-42bd-9bdd-478eb48d3d85 down in Southbound Dec 6 05:21:11 localhost kernel: device tapa96361b7-ed left promiscuous mode Dec 6 05:21:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:11.490 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe86:98b/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a96361b7-edd6-42bd-9bdd-478eb48d3d85) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:11.492 137259 INFO neutron.agent.ovn.metadata.agent [-] Port a96361b7-edd6-42bd-9bdd-478eb48d3d85 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:21:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:11.494 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:11.495 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[2b438110-b931-4b66-863f-fcef9e52737e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:11 localhost nova_compute[237281]: 2025-12-06 10:21:11.501 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:11 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:11.758 219384 INFO neutron.agent.dhcp.agent [None req-16c48205-f438-441d-b53c-59c13c3ec0c9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:21:12 localhost systemd[1]: var-lib-containers-storage-overlay-ea39e04a8ba9d8c1cea5be05bb97c0eba072d9d0755a6526549ad20e47b9d207-merged.mount: Deactivated successfully. Dec 6 05:21:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-92d9d83b1594f992a5351c7c1db3cea51e142032901162a0d333d91510ce77b4-userdata-shm.mount: Deactivated successfully. Dec 6 05:21:12 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. Dec 6 05:21:12 localhost podman[252602]: Dec 6 05:21:12 localhost podman[252602]: 2025-12-06 10:21:12.137823005 +0000 UTC m=+0.089690198 container create 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:12 localhost systemd[1]: Started libpod-conmon-9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560.scope. Dec 6 05:21:12 localhost podman[252602]: 2025-12-06 10:21:12.093384564 +0000 UTC m=+0.045251797 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:21:12 localhost systemd[1]: tmp-crun.IRTJNE.mount: Deactivated successfully. 
Dec 6 05:21:12 localhost systemd[1]: Started libcrun container. Dec 6 05:21:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b43e4bde78f23e8f354a607cb74de1564f2096a4acc8668c3207e7ad83cbf59e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:21:12 localhost podman[252602]: 2025-12-06 10:21:12.212283682 +0000 UTC m=+0.164150855 container init 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 05:21:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:12.220 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:91:81 2001:db8::f816:3eff:fe4e:9181'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3ba22961-ba53-4fab-b867-7a59008889f5) old=Port_Binding(mac=['fa:16:3e:4e:91:81 10.100.0.2 2001:db8::f816:3eff:fe4e:9181'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:12.222 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3ba22961-ba53-4fab-b867-7a59008889f5 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a updated#033[00m Dec 6 05:21:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:12.224 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:12.225 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ea7029-6a31-4d7f-9cb0-d532a71865a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:12 localhost podman[252602]: 2025-12-06 10:21:12.22745288 +0000 UTC 
m=+0.179320093 container start 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:21:12 localhost dnsmasq[252620]: started, version 2.85 cachesize 150 Dec 6 05:21:12 localhost dnsmasq[252620]: DNS service limited to local subnets Dec 6 05:21:12 localhost dnsmasq[252620]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:21:12 localhost dnsmasq[252620]: warning: no upstream servers configured Dec 6 05:21:12 localhost dnsmasq-dhcp[252620]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:21:12 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 0 addresses Dec 6 05:21:12 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host Dec 6 05:21:12 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts Dec 6 05:21:12 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:12.365 219384 INFO neutron.agent.dhcp.agent [None req-2facb260-44ae-47fd-9fff-eedd25590ffe - - - - - -] DHCP configuration for ports {'c8ab6a88-16ab-4ee9-9bdc-4af2a2c9e333'} is completed#033[00m Dec 6 05:21:12 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:12.988 219384 INFO neutron.agent.linux.ip_lib [None req-7ffddaec-5777-4db2-b5ba-090cc3e38e08 - - - - - -] Device tap5fb81146-9b cannot be used as it has no MAC address#033[00m Dec 
6 05:21:13 localhost nova_compute[237281]: 2025-12-06 10:21:13.042 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:13 localhost kernel: device tap5fb81146-9b entered promiscuous mode Dec 6 05:21:13 localhost systemd-udevd[252515]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:21:13 localhost ovn_controller[131684]: 2025-12-06T10:21:13Z|00235|binding|INFO|Claiming lport 5fb81146-9b50-4e89-8376-e79195601f2f for this chassis. Dec 6 05:21:13 localhost ovn_controller[131684]: 2025-12-06T10:21:13Z|00236|binding|INFO|5fb81146-9b50-4e89-8376-e79195601f2f: Claiming unknown Dec 6 05:21:13 localhost NetworkManager[5965]: [1765016473.0504] manager: (tap5fb81146-9b): new Generic device (/org/freedesktop/NetworkManager/Devices/42) Dec 6 05:21:13 localhost nova_compute[237281]: 2025-12-06 10:21:13.050 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:13 localhost ovn_controller[131684]: 2025-12-06T10:21:13Z|00237|binding|INFO|Setting lport 5fb81146-9b50-4e89-8376-e79195601f2f ovn-installed in OVS Dec 6 05:21:13 localhost ovn_controller[131684]: 2025-12-06T10:21:13Z|00238|binding|INFO|Setting lport 5fb81146-9b50-4e89-8376-e79195601f2f up in Southbound Dec 6 05:21:13 localhost nova_compute[237281]: 2025-12-06 10:21:13.058 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:13 localhost nova_compute[237281]: 2025-12-06 10:21:13.062 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:13.061 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), 
table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feda:30c0/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5fb81146-9b50-4e89-8376-e79195601f2f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:13.064 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 5fb81146-9b50-4e89-8376-e79195601f2f in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:21:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:13.067 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 240e255f-e913-45f5-b0b3-d87c1bb02010 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:21:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:13.067 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were 
found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:13.068 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[47c1c582-032e-470e-b5eb-58da9038e4f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:13 localhost journal[186952]: ethtool ioctl error on tap5fb81146-9b: No such device Dec 6 05:21:13 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:13.077 2 INFO neutron.agent.securitygroups_rpc [None req-938150d1-32fe-4275-8689-1f53752d3c15 0fdc93b4fbdc4a0489b2d32163fd23bd c2d28aee19a34a88ba22a95c1e6f9ff4 - - default default] Security group member updated ['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f']#033[00m Dec 6 05:21:13 localhost nova_compute[237281]: 2025-12-06 10:21:13.078 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:13 localhost journal[186952]: ethtool ioctl error on tap5fb81146-9b: No such device Dec 6 05:21:13 localhost journal[186952]: ethtool ioctl error on tap5fb81146-9b: No such device Dec 6 05:21:13 localhost journal[186952]: ethtool ioctl error on tap5fb81146-9b: No such device Dec 6 05:21:13 localhost journal[186952]: ethtool ioctl error on tap5fb81146-9b: No such device Dec 6 05:21:13 localhost journal[186952]: ethtool ioctl error on tap5fb81146-9b: No such device Dec 6 05:21:13 localhost journal[186952]: ethtool ioctl error on tap5fb81146-9b: No such device Dec 6 05:21:13 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:13.106 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, 
created_at=2025-12-06T10:21:12Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=287d40f7-88ca-4c97-8bf4-54d9b058f08c, ip_allocation=immediate, mac_address=fa:16:3e:fb:39:03, name=tempest-AllowedAddressPairTestJSON-1810973838, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:08Z, description=, dns_domain=, id=c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-1983574260, port_security_enabled=True, project_id=c2d28aee19a34a88ba22a95c1e6f9ff4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57985, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1299, status=ACTIVE, subnets=['db532af1-d0a5-4220-9df7-37ebb7b4db86'], tags=[], tenant_id=c2d28aee19a34a88ba22a95c1e6f9ff4, updated_at=2025-12-06T10:21:10Z, vlan_transparent=None, network_id=c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, port_security_enabled=True, project_id=c2d28aee19a34a88ba22a95c1e6f9ff4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f'], standard_attr_id=1313, status=DOWN, tags=[], tenant_id=c2d28aee19a34a88ba22a95c1e6f9ff4, updated_at=2025-12-06T10:21:12Z on network c61da4f7-8a8f-4ebf-ae35-82f9bd37360a#033[00m Dec 6 05:21:13 localhost journal[186952]: ethtool ioctl error on tap5fb81146-9b: No such device Dec 6 05:21:13 localhost nova_compute[237281]: 2025-12-06 10:21:13.123 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:13 localhost nova_compute[237281]: 2025-12-06 10:21:13.152 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:13 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 1 addresses Dec 6 05:21:13 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host Dec 6 05:21:13 localhost podman[252676]: 2025-12-06 10:21:13.361189106 +0000 UTC m=+0.062393246 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 6 05:21:13 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts Dec 6 05:21:13 localhost nova_compute[237281]: 2025-12-06 10:21:13.500 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:13 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:13.622 219384 INFO neutron.agent.dhcp.agent [None req-71179644-3ac2-4bd9-a8c4-f4d8789e692f - - - - - -] DHCP configuration for ports {'287d40f7-88ca-4c97-8bf4-54d9b058f08c'} is completed#033[00m Dec 6 05:21:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:14.040 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:91:81 10.100.0.2 2001:db8::f816:3eff:fe4e:9181'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], 
options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3ba22961-ba53-4fab-b867-7a59008889f5) old=Port_Binding(mac=['fa:16:3e:4e:91:81 2001:db8::f816:3eff:fe4e:9181'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:14.042 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3ba22961-ba53-4fab-b867-7a59008889f5 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a updated#033[00m Dec 6 05:21:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:14.045 137259 DEBUG 
neutron.agent.ovn.metadata.agent [-] Port 240e255f-e913-45f5-b0b3-d87c1bb02010 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:21:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:14.045 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:14 localhost podman[252737]: Dec 6 05:21:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:14.046 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[4a1b0cab-3ede-4520-ba68-3ded3d94ea8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:14 localhost podman[252737]: 2025-12-06 10:21:14.059159349 +0000 UTC m=+0.101521723 container create 40531b08c85bad648a06622e953a92ed2433a1e8cc14354dde13c07718d55323 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 05:21:14 localhost systemd[1]: Started libpod-conmon-40531b08c85bad648a06622e953a92ed2433a1e8cc14354dde13c07718d55323.scope. Dec 6 05:21:14 localhost podman[252737]: 2025-12-06 10:21:14.013228922 +0000 UTC m=+0.055591316 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:21:14 localhost systemd[1]: Started libcrun container. 
Dec 6 05:21:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7857c60b42c56cc8959310f328cc89a830362c4d634a221b84be63676dda586e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:21:14 localhost podman[252737]: 2025-12-06 10:21:14.12921948 +0000 UTC m=+0.171581854 container init 40531b08c85bad648a06622e953a92ed2433a1e8cc14354dde13c07718d55323 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 05:21:14 localhost podman[252737]: 2025-12-06 10:21:14.137467514 +0000 UTC m=+0.179829878 container start 40531b08c85bad648a06622e953a92ed2433a1e8cc14354dde13c07718d55323 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Dec 6 05:21:14 localhost dnsmasq[252756]: started, version 2.85 cachesize 150 Dec 6 05:21:14 localhost dnsmasq[252756]: DNS service limited to local subnets Dec 6 05:21:14 localhost dnsmasq[252756]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:21:14 localhost dnsmasq[252756]: warning: no upstream servers configured Dec 
6 05:21:14 localhost dnsmasq-dhcp[252756]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:21:14 localhost dnsmasq[252756]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:14 localhost dnsmasq-dhcp[252756]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:14 localhost dnsmasq-dhcp[252756]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:14 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:14.193 2 INFO neutron.agent.securitygroups_rpc [None req-4a96682c-2b66-4bd4-80b8-b4d9f774bc85 0fdc93b4fbdc4a0489b2d32163fd23bd c2d28aee19a34a88ba22a95c1e6f9ff4 - - default default] Security group member updated ['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f']#033[00m Dec 6 05:21:14 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:14.299 219384 INFO neutron.agent.dhcp.agent [None req-537cbbb9-f486-43ff-a738-75063a4b9b51 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:21:14 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:14.433 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:13Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=05091806-ca8f-4d53-b8ad-723e3058b49c, ip_allocation=immediate, mac_address=fa:16:3e:28:5c:8e, name=tempest-AllowedAddressPairTestJSON-1644363862, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:08Z, description=, dns_domain=, id=c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-1983574260, 
port_security_enabled=True, project_id=c2d28aee19a34a88ba22a95c1e6f9ff4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57985, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1299, status=ACTIVE, subnets=['db532af1-d0a5-4220-9df7-37ebb7b4db86'], tags=[], tenant_id=c2d28aee19a34a88ba22a95c1e6f9ff4, updated_at=2025-12-06T10:21:10Z, vlan_transparent=None, network_id=c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, port_security_enabled=True, project_id=c2d28aee19a34a88ba22a95c1e6f9ff4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f'], standard_attr_id=1320, status=DOWN, tags=[], tenant_id=c2d28aee19a34a88ba22a95c1e6f9ff4, updated_at=2025-12-06T10:21:14Z on network c61da4f7-8a8f-4ebf-ae35-82f9bd37360a#033[00m Dec 6 05:21:14 localhost dnsmasq[252756]: exiting on receipt of SIGTERM Dec 6 05:21:14 localhost podman[252785]: 2025-12-06 10:21:14.592927346 +0000 UTC m=+0.059623330 container kill 40531b08c85bad648a06622e953a92ed2433a1e8cc14354dde13c07718d55323 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 05:21:14 localhost systemd[1]: libpod-40531b08c85bad648a06622e953a92ed2433a1e8cc14354dde13c07718d55323.scope: Deactivated successfully. 
Dec 6 05:21:14 localhost podman[252809]: 2025-12-06 10:21:14.652032549 +0000 UTC m=+0.047342101 container died 40531b08c85bad648a06622e953a92ed2433a1e8cc14354dde13c07718d55323 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:21:14 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 2 addresses Dec 6 05:21:14 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host Dec 6 05:21:14 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts Dec 6 05:21:14 localhost podman[252802]: 2025-12-06 10:21:14.67542011 +0000 UTC m=+0.077663856 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:14 localhost podman[252809]: 2025-12-06 10:21:14.784770794 +0000 UTC m=+0.180080266 container cleanup 40531b08c85bad648a06622e953a92ed2433a1e8cc14354dde13c07718d55323 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:21:14 localhost systemd[1]: libpod-conmon-40531b08c85bad648a06622e953a92ed2433a1e8cc14354dde13c07718d55323.scope: Deactivated successfully. Dec 6 05:21:14 localhost podman[252816]: 2025-12-06 10:21:14.807888188 +0000 UTC m=+0.192293674 container remove 40531b08c85bad648a06622e953a92ed2433a1e8cc14354dde13c07718d55323 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:21:14 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:14.984 219384 INFO neutron.agent.dhcp.agent [None req-37133866-54ae-4896-964d-f27a779222c0 - - - - - -] DHCP configuration for ports {'05091806-ca8f-4d53-b8ad-723e3058b49c'} is completed#033[00m Dec 6 05:21:15 localhost systemd[1]: var-lib-containers-storage-overlay-7857c60b42c56cc8959310f328cc89a830362c4d634a221b84be63676dda586e-merged.mount: Deactivated successfully. Dec 6 05:21:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40531b08c85bad648a06622e953a92ed2433a1e8cc14354dde13c07718d55323-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:21:15 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:15.103 2 INFO neutron.agent.securitygroups_rpc [None req-781831ed-3f4b-4cef-953d-35c0abe1e430 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:21:15 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:15.733 2 INFO neutron.agent.securitygroups_rpc [None req-026a3613-5595-40d2-aca4-668938d401ab 0fdc93b4fbdc4a0489b2d32163fd23bd c2d28aee19a34a88ba22a95c1e6f9ff4 - - default default] Security group member updated ['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f']#033[00m Dec 6 05:21:15 localhost nova_compute[237281]: 2025-12-06 10:21:15.782 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:15 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 1 addresses Dec 6 05:21:15 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host Dec 6 05:21:15 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts Dec 6 05:21:15 localhost podman[252902]: 2025-12-06 10:21:15.992573015 +0000 UTC m=+0.061365743 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:21:16 localhost podman[252938]: Dec 6 05:21:16 localhost podman[252938]: 2025-12-06 
10:21:16.160752633 +0000 UTC m=+0.088198711 container create 9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0) Dec 6 05:21:16 localhost openstack_network_exporter[199751]: ERROR 10:21:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:21:16 localhost openstack_network_exporter[199751]: ERROR 10:21:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:21:16 localhost openstack_network_exporter[199751]: ERROR 10:21:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:21:16 localhost openstack_network_exporter[199751]: ERROR 10:21:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:21:16 localhost openstack_network_exporter[199751]: Dec 6 05:21:16 localhost openstack_network_exporter[199751]: ERROR 10:21:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:21:16 localhost openstack_network_exporter[199751]: Dec 6 05:21:16 localhost systemd[1]: Started libpod-conmon-9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9.scope. Dec 6 05:21:16 localhost systemd[1]: Started libcrun container. 
Dec 6 05:21:16 localhost podman[252938]: 2025-12-06 10:21:16.117494029 +0000 UTC m=+0.044940137 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:21:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b15930ed84cec7c0f8cd5be25421a7d17a7aa6eaf601ab329e4e8d817b9f565/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:21:16 localhost podman[252938]: 2025-12-06 10:21:16.231778295 +0000 UTC m=+0.159224413 container init 9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:21:16 localhost sshd[252961]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:21:16 localhost podman[252938]: 2025-12-06 10:21:16.24166091 +0000 UTC m=+0.169106988 container start 9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 05:21:16 localhost dnsmasq[252963]: started, version 2.85 cachesize 150 Dec 6 05:21:16 localhost dnsmasq[252963]: DNS service limited to local subnets Dec 6 05:21:16 
localhost dnsmasq[252963]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:21:16 localhost dnsmasq[252963]: warning: no upstream servers configured Dec 6 05:21:16 localhost dnsmasq-dhcp[252963]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:21:16 localhost dnsmasq-dhcp[252963]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:21:16 localhost dnsmasq[252963]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:16 localhost dnsmasq-dhcp[252963]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:16 localhost dnsmasq-dhcp[252963]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:16 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:16.303 219384 INFO neutron.agent.dhcp.agent [None req-2f478e88-2c60-40f0-a447-430a9ab22e0d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:14Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=a9ab972d-e54a-458f-bae1-8c704bcdc096, ip_allocation=immediate, mac_address=fa:16:3e:37:4f:7d, name=tempest-NetworksTestDHCPv6-1078996116, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, 
qos_policy_id=None, revision_number=27, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['1af4b293-e827-4bbc-9446-f50d8a2b14f0', '6fb5db6b-e613-46c3-9a1f-6c3f117ae3d1'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:21:12Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1326, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:21:15Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:21:16 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:16.439 219384 INFO neutron.agent.dhcp.agent [None req-63d86c1b-33cc-424d-a233-bc51821011ce - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', '5fb81146-9b50-4e89-8376-e79195601f2f'} is completed#033[00m Dec 6 05:21:16 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:16.515 2 INFO neutron.agent.securitygroups_rpc [None req-62b7a6f4-7c15-44ca-a3ba-41387072fa92 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:21:16 localhost dnsmasq[252963]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 2 addresses Dec 6 05:21:16 localhost podman[252981]: 2025-12-06 10:21:16.543096669 +0000 UTC m=+0.057646760 container kill 9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 6 05:21:16 localhost dnsmasq-dhcp[252963]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:16 localhost dnsmasq-dhcp[252963]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:16 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:16.776 219384 INFO neutron.agent.dhcp.agent [None req-c62c1a4c-8538-4960-9eb3-5c3b033a2465 - - - - - -] DHCP configuration for ports {'a9ab972d-e54a-458f-bae1-8c704bcdc096'} is completed#033[00m Dec 6 05:21:16 localhost dnsmasq[252963]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:16 localhost dnsmasq-dhcp[252963]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:16 localhost dnsmasq-dhcp[252963]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:16 localhost podman[253020]: 2025-12-06 10:21:16.893742477 +0000 UTC m=+0.044615508 container kill 9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:21:16 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:16.981 2 INFO neutron.agent.securitygroups_rpc [None req-e653e95c-d8e4-4982-9d63-8def139f2c7f 0fdc93b4fbdc4a0489b2d32163fd23bd c2d28aee19a34a88ba22a95c1e6f9ff4 - - default default] Security group member updated 
['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f']#033[00m Dec 6 05:21:17 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:17.011 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:16Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0acb1843-e770-4d40-9d9c-39b0fc48397a, ip_allocation=immediate, mac_address=fa:16:3e:a9:ca:b3, name=tempest-AllowedAddressPairTestJSON-1519448146, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:08Z, description=, dns_domain=, id=c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-1983574260, port_security_enabled=True, project_id=c2d28aee19a34a88ba22a95c1e6f9ff4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57985, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1299, status=ACTIVE, subnets=['db532af1-d0a5-4220-9df7-37ebb7b4db86'], tags=[], tenant_id=c2d28aee19a34a88ba22a95c1e6f9ff4, updated_at=2025-12-06T10:21:10Z, vlan_transparent=None, network_id=c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, port_security_enabled=True, project_id=c2d28aee19a34a88ba22a95c1e6f9ff4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f'], standard_attr_id=1334, status=DOWN, tags=[], tenant_id=c2d28aee19a34a88ba22a95c1e6f9ff4, updated_at=2025-12-06T10:21:16Z on network c61da4f7-8a8f-4ebf-ae35-82f9bd37360a#033[00m Dec 6 05:21:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:17.038 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction 
[-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:21:17 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 2 addresses Dec 6 05:21:17 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host Dec 6 05:21:17 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts Dec 6 05:21:17 localhost podman[253058]: 2025-12-06 10:21:17.224917723 +0000 UTC m=+0.063942562 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true) Dec 6 05:21:17 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:17.441 219384 INFO neutron.agent.dhcp.agent [None req-d375a264-8663-46aa-afd4-2c485448d5de - - - - - -] DHCP configuration for ports {'0acb1843-e770-4d40-9d9c-39b0fc48397a'} is completed#033[00m Dec 6 05:21:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:21:18 localhost nova_compute[237281]: 2025-12-06 10:21:18.541 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:18 localhost systemd[1]: tmp-crun.dR9STE.mount: Deactivated successfully. Dec 6 05:21:18 localhost podman[253080]: 2025-12-06 10:21:18.602798612 +0000 UTC m=+0.127767922 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, release=1755695350) Dec 6 05:21:18 localhost podman[253080]: 2025-12-06 10:21:18.648198182 +0000 UTC m=+0.173167522 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 6 05:21:18 localhost dnsmasq[252963]: exiting on receipt of SIGTERM Dec 6 05:21:18 localhost podman[253107]: 2025-12-06 10:21:18.653374452 +0000 UTC m=+0.074593742 container kill 9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:21:18 localhost systemd[1]: tmp-crun.xNiaFu.mount: Deactivated successfully. Dec 6 05:21:18 localhost systemd[1]: libpod-9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9.scope: Deactivated successfully. Dec 6 05:21:18 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:21:18 localhost podman[253128]: 2025-12-06 10:21:18.72365908 +0000 UTC m=+0.052880462 container died 9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:21:18 localhost podman[253128]: 2025-12-06 10:21:18.7547611 +0000 UTC m=+0.083982432 container cleanup 9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:21:18 localhost systemd[1]: libpod-conmon-9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9.scope: Deactivated successfully. 
Dec 6 05:21:18 localhost podman[253130]: 2025-12-06 10:21:18.805187225 +0000 UTC m=+0.128382541 container remove 9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:21:18 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:18.826 2 INFO neutron.agent.securitygroups_rpc [None req-5ec22cb9-789f-440d-a042-33abf1302327 0fdc93b4fbdc4a0489b2d32163fd23bd c2d28aee19a34a88ba22a95c1e6f9ff4 - - default default] Security group member updated ['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f']#033[00m Dec 6 05:21:19 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 1 addresses Dec 6 05:21:19 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host Dec 6 05:21:19 localhost podman[253189]: 2025-12-06 10:21:19.08733646 +0000 UTC m=+0.061055535 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:21:19 localhost dnsmasq-dhcp[252620]: read 
/var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts Dec 6 05:21:19 localhost systemd[1]: var-lib-containers-storage-overlay-5b15930ed84cec7c0f8cd5be25421a7d17a7aa6eaf601ab329e4e8d817b9f565-merged.mount: Deactivated successfully. Dec 6 05:21:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d841012260a290aca5762c3d6ede5f056c4983a5c6d1e8cf7c7db05ad175aa9-userdata-shm.mount: Deactivated successfully. Dec 6 05:21:19 localhost podman[253247]: Dec 6 05:21:19 localhost podman[253247]: 2025-12-06 10:21:19.656329274 +0000 UTC m=+0.058532527 container create d624d2be4c6d823fbd00e5aef707235ff98bc70185689f984d884549109efe31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:21:19 localhost systemd[1]: Started libpod-conmon-d624d2be4c6d823fbd00e5aef707235ff98bc70185689f984d884549109efe31.scope. Dec 6 05:21:19 localhost systemd[1]: Started libcrun container. 
Dec 6 05:21:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89c7312b716e18ddf94e9fb27597a3d335412d5cedec9db9416b85ec6a46aab1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:21:19 localhost podman[253247]: 2025-12-06 10:21:19.716228082 +0000 UTC m=+0.118431335 container init d624d2be4c6d823fbd00e5aef707235ff98bc70185689f984d884549109efe31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 05:21:19 localhost podman[253247]: 2025-12-06 10:21:19.726294052 +0000 UTC m=+0.128497315 container start d624d2be4c6d823fbd00e5aef707235ff98bc70185689f984d884549109efe31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0) Dec 6 05:21:19 localhost podman[253247]: 2025-12-06 10:21:19.626893876 +0000 UTC m=+0.029097149 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:21:19 localhost dnsmasq[253265]: started, version 2.85 cachesize 150 Dec 6 05:21:19 localhost dnsmasq[253265]: DNS service limited to local subnets Dec 6 05:21:19 localhost dnsmasq[253265]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:21:19 localhost dnsmasq[253265]: warning: no upstream servers configured Dec 6 05:21:19 localhost dnsmasq-dhcp[253265]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:21:19 localhost dnsmasq[253265]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:19 localhost dnsmasq-dhcp[253265]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:19 localhost dnsmasq-dhcp[253265]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:20.048 219384 INFO neutron.agent.dhcp.agent [None req-2050cf86-f345-41e9-b176-057413284205 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', '5fb81146-9b50-4e89-8376-e79195601f2f'} is completed#033[00m Dec 6 05:21:20 localhost dnsmasq[253265]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:20 localhost dnsmasq-dhcp[253265]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:20 localhost dnsmasq-dhcp[253265]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:20 localhost podman[253283]: 2025-12-06 10:21:20.195910721 +0000 UTC m=+0.047493857 container kill d624d2be4c6d823fbd00e5aef707235ff98bc70185689f984d884549109efe31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:21:20 
localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:20.334 2 INFO neutron.agent.securitygroups_rpc [None req-a5a7092d-b070-4860-afdb-235189b2878e 0fdc93b4fbdc4a0489b2d32163fd23bd c2d28aee19a34a88ba22a95c1e6f9ff4 - - default default] Security group member updated ['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f']#033[00m Dec 6 05:21:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:20.388 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:20Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9bb52dc0-aedc-41d7-bab3-580578c1b05f, ip_allocation=immediate, mac_address=fa:16:3e:f7:9a:33, name=tempest-AllowedAddressPairTestJSON-1544077602, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:08Z, description=, dns_domain=, id=c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-1983574260, port_security_enabled=True, project_id=c2d28aee19a34a88ba22a95c1e6f9ff4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57985, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1299, status=ACTIVE, subnets=['db532af1-d0a5-4220-9df7-37ebb7b4db86'], tags=[], tenant_id=c2d28aee19a34a88ba22a95c1e6f9ff4, updated_at=2025-12-06T10:21:10Z, vlan_transparent=None, network_id=c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, port_security_enabled=True, project_id=c2d28aee19a34a88ba22a95c1e6f9ff4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f'], standard_attr_id=1341, status=DOWN, 
tags=[], tenant_id=c2d28aee19a34a88ba22a95c1e6f9ff4, updated_at=2025-12-06T10:21:20Z on network c61da4f7-8a8f-4ebf-ae35-82f9bd37360a#033[00m Dec 6 05:21:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:20.510 219384 INFO neutron.agent.dhcp.agent [None req-bfd195ec-7ad1-4b3d-ae81-50b6f08299bc - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', '5fb81146-9b50-4e89-8376-e79195601f2f'} is completed#033[00m Dec 6 05:21:20 localhost podman[253318]: 2025-12-06 10:21:20.576946556 +0000 UTC m=+0.038828689 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:20 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 2 addresses Dec 6 05:21:20 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host Dec 6 05:21:20 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts Dec 6 05:21:20 localhost systemd[1]: tmp-crun.r0P0bP.mount: Deactivated successfully. 
Dec 6 05:21:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:20.784 219384 INFO neutron.agent.dhcp.agent [None req-20961b59-a676-4f0b-8309-5e3123561bb1 - - - - - -] DHCP configuration for ports {'9bb52dc0-aedc-41d7-bab3-580578c1b05f'} is completed#033[00m Dec 6 05:21:20 localhost nova_compute[237281]: 2025-12-06 10:21:20.785 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:20 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:20.810 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:91:81 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3ba22961-ba53-4fab-b867-7a59008889f5) old=Port_Binding(mac=['fa:16:3e:4e:91:81 10.100.0.2 2001:db8::f816:3eff:fe4e:9181'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 
'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:20 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:20.811 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3ba22961-ba53-4fab-b867-7a59008889f5 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a updated#033[00m Dec 6 05:21:20 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:20.813 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 240e255f-e913-45f5-b0b3-d87c1bb02010 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:21:20 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:20.813 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:20 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:20.814 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[0690c7c7-f215-44f9-a2f8-0c3155938229]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:21:21 localhost podman[253338]: 2025-12-06 10:21:21.549358365 +0000 UTC m=+0.082763825 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:21:21 localhost podman[253338]: 2025-12-06 10:21:21.5605426 +0000 UTC m=+0.093948050 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:21:21 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:21:22 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:22.556 2 INFO neutron.agent.securitygroups_rpc [None req-78b1ece2-2c2d-48b7-b054-e35f4b68d02d 0fdc93b4fbdc4a0489b2d32163fd23bd c2d28aee19a34a88ba22a95c1e6f9ff4 - - default default] Security group member updated ['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f']#033[00m Dec 6 05:21:22 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 1 addresses Dec 6 05:21:22 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host Dec 6 05:21:22 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts Dec 6 05:21:22 localhost podman[253378]: 2025-12-06 10:21:22.81207614 +0000 UTC m=+0.060646382 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:21:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:22.886 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:91:81 10.100.0.2 2001:db8::f816:3eff:fe4e:9181'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 
'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3ba22961-ba53-4fab-b867-7a59008889f5) old=Port_Binding(mac=['fa:16:3e:4e:91:81 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:22.888 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3ba22961-ba53-4fab-b867-7a59008889f5 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a updated#033[00m Dec 6 05:21:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:22.890 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 240e255f-e913-45f5-b0b3-d87c1bb02010 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 
6 05:21:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:22.890 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:22.891 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[171f796a-4ec6-4819-a996-4f6587a2c8c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:23 localhost podman[197801]: time="2025-12-06T10:21:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:21:23 localhost podman[197801]: @ - - [06/Dec/2025:10:21:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147798 "" "Go-http-client/1.1" Dec 6 05:21:23 localhost podman[197801]: @ - - [06/Dec/2025:10:21:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16903 "" "Go-http-client/1.1" Dec 6 05:21:23 localhost nova_compute[237281]: 2025-12-06 10:21:23.543 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:23 localhost dnsmasq[253265]: exiting on receipt of SIGTERM Dec 6 05:21:23 localhost podman[253415]: 2025-12-06 10:21:23.644963705 +0000 UTC m=+0.065531323 container kill d624d2be4c6d823fbd00e5aef707235ff98bc70185689f984d884549109efe31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:21:23 localhost systemd[1]: libpod-d624d2be4c6d823fbd00e5aef707235ff98bc70185689f984d884549109efe31.scope: Deactivated successfully. Dec 6 05:21:23 localhost podman[253429]: 2025-12-06 10:21:23.713029564 +0000 UTC m=+0.053088669 container died d624d2be4c6d823fbd00e5aef707235ff98bc70185689f984d884549109efe31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:21:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d624d2be4c6d823fbd00e5aef707235ff98bc70185689f984d884549109efe31-userdata-shm.mount: Deactivated successfully. Dec 6 05:21:23 localhost podman[253429]: 2025-12-06 10:21:23.744718963 +0000 UTC m=+0.084778028 container cleanup d624d2be4c6d823fbd00e5aef707235ff98bc70185689f984d884549109efe31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:21:23 localhost systemd[1]: libpod-conmon-d624d2be4c6d823fbd00e5aef707235ff98bc70185689f984d884549109efe31.scope: Deactivated successfully. 
Dec 6 05:21:23 localhost podman[253431]: 2025-12-06 10:21:23.787241744 +0000 UTC m=+0.122015435 container remove d624d2be4c6d823fbd00e5aef707235ff98bc70185689f984d884549109efe31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:21:23 localhost systemd[1]: var-lib-containers-storage-overlay-89c7312b716e18ddf94e9fb27597a3d335412d5cedec9db9416b85ec6a46aab1-merged.mount: Deactivated successfully. Dec 6 05:21:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31706 DF PROTO=TCP SPT=50314 DPT=9102 SEQ=3962457516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC78AA0000000001030307) Dec 6 05:21:24 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:24.046 2 INFO neutron.agent.securitygroups_rpc [None req-8d4cd66c-57ea-4071-b692-5f24c237cb37 0fdc93b4fbdc4a0489b2d32163fd23bd c2d28aee19a34a88ba22a95c1e6f9ff4 - - default default] Security group member updated ['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f']#033[00m Dec 6 05:21:24 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:24.086 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:23Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], 
id=f7a80a35-4cdd-41ab-b3aa-ddbaedbf4681, ip_allocation=immediate, mac_address=fa:16:3e:de:ae:69, name=tempest-AllowedAddressPairTestJSON-340964024, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:08Z, description=, dns_domain=, id=c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-1983574260, port_security_enabled=True, project_id=c2d28aee19a34a88ba22a95c1e6f9ff4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57985, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1299, status=ACTIVE, subnets=['db532af1-d0a5-4220-9df7-37ebb7b4db86'], tags=[], tenant_id=c2d28aee19a34a88ba22a95c1e6f9ff4, updated_at=2025-12-06T10:21:10Z, vlan_transparent=None, network_id=c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, port_security_enabled=True, project_id=c2d28aee19a34a88ba22a95c1e6f9ff4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f'], standard_attr_id=1355, status=DOWN, tags=[], tenant_id=c2d28aee19a34a88ba22a95c1e6f9ff4, updated_at=2025-12-06T10:21:23Z on network c61da4f7-8a8f-4ebf-ae35-82f9bd37360a#033[00m Dec 6 05:21:24 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 2 addresses Dec 6 05:21:24 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host Dec 6 05:21:24 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts Dec 6 05:21:24 localhost podman[253472]: 2025-12-06 10:21:24.327646296 +0000 UTC m=+0.054027858 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:24 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:24.570 219384 INFO neutron.agent.dhcp.agent [None req-7f867975-d4bd-418b-90f4-21b50511f774 - - - - - -] DHCP configuration for ports {'f7a80a35-4cdd-41ab-b3aa-ddbaedbf4681'} is completed#033[00m Dec 6 05:21:24 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:24.709 2 INFO neutron.agent.securitygroups_rpc [None req-4f185159-53b0-4d5c-b2c2-78d129b80850 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:21:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31707 DF PROTO=TCP SPT=50314 DPT=9102 SEQ=3962457516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC7CC70000000001030307) Dec 6 05:21:25 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:25.098 2 INFO neutron.agent.securitygroups_rpc [None req-8ce96594-182f-4511-83b0-21f522230e8b 0fdc93b4fbdc4a0489b2d32163fd23bd c2d28aee19a34a88ba22a95c1e6f9ff4 - - default default] Security group member updated ['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f']#033[00m Dec 6 05:21:25 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:25.127 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, 
created_at=2025-12-06T10:21:24Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=af1692ae-5fde-4447-b8e3-0f596d9b52d2, ip_allocation=immediate, mac_address=fa:16:3e:85:22:12, name=tempest-AllowedAddressPairTestJSON-1402934545, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:08Z, description=, dns_domain=, id=c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-1983574260, port_security_enabled=True, project_id=c2d28aee19a34a88ba22a95c1e6f9ff4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57985, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1299, status=ACTIVE, subnets=['db532af1-d0a5-4220-9df7-37ebb7b4db86'], tags=[], tenant_id=c2d28aee19a34a88ba22a95c1e6f9ff4, updated_at=2025-12-06T10:21:10Z, vlan_transparent=None, network_id=c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, port_security_enabled=True, project_id=c2d28aee19a34a88ba22a95c1e6f9ff4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f'], standard_attr_id=1357, status=DOWN, tags=[], tenant_id=c2d28aee19a34a88ba22a95c1e6f9ff4, updated_at=2025-12-06T10:21:24Z on network c61da4f7-8a8f-4ebf-ae35-82f9bd37360a#033[00m Dec 6 05:21:25 localhost podman[253557]: Dec 6 05:21:25 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 3 addresses Dec 6 05:21:25 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host Dec 6 05:21:25 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts Dec 6 05:21:25 localhost podman[253569]: 2025-12-06 10:21:25.384565022 +0000 UTC 
m=+0.058357141 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:21:25 localhost podman[253557]: 2025-12-06 10:21:25.428670953 +0000 UTC m=+0.139658489 container create 9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:25 localhost podman[253557]: 2025-12-06 10:21:25.33650389 +0000 UTC m=+0.047491476 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:21:25 localhost systemd[1]: Started libpod-conmon-9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2.scope. Dec 6 05:21:25 localhost systemd[1]: Started libcrun container. 
Dec 6 05:21:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0625d8a3bff2264a76d03acbf572f9592c519739a4a02c653a608a1c6a29adef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 6 05:21:25 localhost podman[253557]: 2025-12-06 10:21:25.499831598 +0000 UTC m=+0.210819144 container init 9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Dec 6 05:21:25 localhost podman[253557]: 2025-12-06 10:21:25.506181734 +0000 UTC m=+0.217169270 container start 9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 6 05:21:25 localhost dnsmasq[253590]: started, version 2.85 cachesize 150
Dec 6 05:21:25 localhost dnsmasq[253590]: DNS service limited to local subnets
Dec 6 05:21:25 localhost dnsmasq[253590]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 6 05:21:25 localhost dnsmasq[253590]: warning: no upstream servers configured
Dec 6 05:21:25 localhost dnsmasq-dhcp[253590]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 6 05:21:25 localhost dnsmasq[253590]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses
Dec 6 05:21:25 localhost dnsmasq-dhcp[253590]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host
Dec 6 05:21:25 localhost dnsmasq-dhcp[253590]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts
Dec 6 05:21:25 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:25.573 219384 INFO neutron.agent.dhcp.agent [None req-c11fbef3-55a7-404f-bdce-2fc09e82a65d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:24Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=2bc80345-56a3-4e4e-b228-ec0d87144482, ip_allocation=immediate, mac_address=fa:16:3e:fe:92:69, name=tempest-NetworksTestDHCPv6-1965361961, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=31, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['326de2c5-75f2-438b-86ca-3316649d46a3', 'be7c4edb-0e59-4a95-ba6d-d106533ef0c3'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:21:21Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1356, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:21:24Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m
Dec 6 05:21:25 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:25.704 219384 INFO neutron.agent.dhcp.agent [None req-4beb44fd-2f64-4663-a827-c1586baf12d6 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', '5fb81146-9b50-4e89-8376-e79195601f2f'} is completed#033[00m
Dec 6 05:21:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4888 DF PROTO=TCP SPT=36284 DPT=9102 SEQ=1379431737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC7F880000000001030307)
Dec 6 05:21:25 localhost nova_compute[237281]: 2025-12-06 10:21:25.824 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:21:25 localhost dnsmasq[253590]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 2 addresses
Dec 6 05:21:25 localhost dnsmasq-dhcp[253590]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host
Dec 6 05:21:25 localhost podman[253615]: 2025-12-06 10:21:25.844714408 +0000 UTC m=+0.094140025 container kill 9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:21:25 localhost dnsmasq-dhcp[253590]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts
Dec 6 05:21:25 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:25.861 219384 INFO neutron.agent.dhcp.agent [None req-bdb7a921-9688-40f3-9d92-948f06d02f7b - - - - - -] DHCP configuration for ports {'af1692ae-5fde-4447-b8e3-0f596d9b52d2'} is completed#033[00m
Dec 6 05:21:26 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:26.107 219384 INFO neutron.agent.dhcp.agent [None req-13acafed-cbde-4618-be97-fae935174d7e - - - - - -] DHCP configuration for ports {'2bc80345-56a3-4e4e-b228-ec0d87144482'} is completed#033[00m
Dec 6 05:21:26 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:26.156 2 INFO neutron.agent.securitygroups_rpc [None req-f2c79d77-eeaf-44a7-b2af-bc012b3da7e3 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m
Dec 6 05:21:26 localhost dnsmasq[253590]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses
Dec 6 05:21:26 localhost podman[253656]: 2025-12-06 10:21:26.403068864 +0000 UTC m=+0.057789844 container kill 9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 6 05:21:26 localhost dnsmasq-dhcp[253590]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host
Dec 6 05:21:26 localhost dnsmasq-dhcp[253590]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts
Dec 6 05:21:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31708 DF PROTO=TCP SPT=50314 DPT=9102 SEQ=3962457516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC84C70000000001030307)
Dec 6 05:21:27 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:27.328 2 INFO neutron.agent.securitygroups_rpc [None req-26f2fb28-55d8-41ce-a061-175f4bc5233d 0fdc93b4fbdc4a0489b2d32163fd23bd c2d28aee19a34a88ba22a95c1e6f9ff4 - - default default] Security group member updated ['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f']#033[00m
Dec 6 05:21:27 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 2 addresses
Dec 6 05:21:27 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host
Dec 6 05:21:27 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts
Dec 6 05:21:27 localhost podman[253694]: 2025-12-06 10:21:27.57310397 +0000 UTC m=+0.066492402 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 6 05:21:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45070 DF PROTO=TCP SPT=35404 DPT=9102 SEQ=405234843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC87870000000001030307)
Dec 6 05:21:27 localhost dnsmasq[253590]: exiting on receipt of SIGTERM
Dec 6 05:21:27 localhost podman[253732]: 2025-12-06 10:21:27.996912465 +0000 UTC m=+0.059306941 container kill 9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 6 05:21:27 localhost systemd[1]: libpod-9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2.scope: Deactivated successfully.
Dec 6 05:21:28 localhost podman[253744]: 2025-12-06 10:21:28.067956677 +0000 UTC m=+0.058460885 container died 9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:21:28 localhost podman[253744]: 2025-12-06 10:21:28.099122528 +0000 UTC m=+0.089626706 container cleanup 9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 6 05:21:28 localhost systemd[1]: libpod-conmon-9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2.scope: Deactivated successfully.
Dec 6 05:21:28 localhost podman[253751]: 2025-12-06 10:21:28.138774951 +0000 UTC m=+0.118875868 container remove 9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 6 05:21:28 localhost systemd[1]: var-lib-containers-storage-overlay-0625d8a3bff2264a76d03acbf572f9592c519739a4a02c653a608a1c6a29adef-merged.mount: Deactivated successfully.
Dec 6 05:21:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c59239c3675191bf5e9dab48bd54a1063f03ef980b645d5c5321fe0d476b4a2-userdata-shm.mount: Deactivated successfully.
Dec 6 05:21:28 localhost nova_compute[237281]: 2025-12-06 10:21:28.576 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:21:28 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:28.751 2 INFO neutron.agent.securitygroups_rpc [None req-6497afad-46c7-44d6-b9e1-9db3b154e13c 0fdc93b4fbdc4a0489b2d32163fd23bd c2d28aee19a34a88ba22a95c1e6f9ff4 - - default default] Security group member updated ['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f']#033[00m
Dec 6 05:21:29 localhost podman[253834]:
Dec 6 05:21:29 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 1 addresses
Dec 6 05:21:29 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host
Dec 6 05:21:29 localhost podman[253847]: 2025-12-06 10:21:29.077418739 +0000 UTC m=+0.062545480 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:21:29 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts
Dec 6 05:21:29 localhost podman[253834]: 2025-12-06 10:21:29.115913617 +0000 UTC m=+0.148836163 container create ee5218a1c13dfaad2c708c38487798ce1c74ad8d9094f3a24a0877691583f98a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 6 05:21:29 localhost podman[253834]: 2025-12-06 10:21:29.017397978 +0000 UTC m=+0.050320554 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 6 05:21:29 localhost systemd[1]: Started libpod-conmon-ee5218a1c13dfaad2c708c38487798ce1c74ad8d9094f3a24a0877691583f98a.scope.
Dec 6 05:21:29 localhost systemd[1]: Started libcrun container.
Dec 6 05:21:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baaadb64ec7be5909e5fb728927961efe89ed81635e183ce8847583131d9e1d0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 6 05:21:29 localhost podman[253834]: 2025-12-06 10:21:29.189239169 +0000 UTC m=+0.222161735 container init ee5218a1c13dfaad2c708c38487798ce1c74ad8d9094f3a24a0877691583f98a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 6 05:21:29 localhost podman[253834]: 2025-12-06 10:21:29.199122224 +0000 UTC m=+0.232044780 container start ee5218a1c13dfaad2c708c38487798ce1c74ad8d9094f3a24a0877691583f98a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 6 05:21:29 localhost dnsmasq[253872]: started, version 2.85 cachesize 150
Dec 6 05:21:29 localhost dnsmasq[253872]: DNS service limited to local subnets
Dec 6 05:21:29 localhost dnsmasq[253872]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 6 05:21:29 localhost dnsmasq[253872]: warning: no upstream servers configured
Dec 6 05:21:29 localhost dnsmasq-dhcp[253872]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 6 05:21:29 localhost dnsmasq[253872]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses
Dec 6 05:21:29 localhost dnsmasq-dhcp[253872]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host
Dec 6 05:21:29 localhost dnsmasq-dhcp[253872]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts
Dec 6 05:21:29 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:29.505 219384 INFO neutron.agent.dhcp.agent [None req-5337a1f5-54f4-4267-ae08-d7eaedcd31aa - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', '5fb81146-9b50-4e89-8376-e79195601f2f'} is completed#033[00m
Dec 6 05:21:29 localhost dnsmasq[253872]: exiting on receipt of SIGTERM
Dec 6 05:21:29 localhost podman[253890]: 2025-12-06 10:21:29.544647784 +0000 UTC m=+0.059243999 container kill ee5218a1c13dfaad2c708c38487798ce1c74ad8d9094f3a24a0877691583f98a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:21:29 localhost systemd[1]: libpod-ee5218a1c13dfaad2c708c38487798ce1c74ad8d9094f3a24a0877691583f98a.scope: Deactivated successfully.
Dec 6 05:21:29 localhost podman[253905]: 2025-12-06 10:21:29.615460638 +0000 UTC m=+0.053836142 container died ee5218a1c13dfaad2c708c38487798ce1c74ad8d9094f3a24a0877691583f98a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 6 05:21:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee5218a1c13dfaad2c708c38487798ce1c74ad8d9094f3a24a0877691583f98a-userdata-shm.mount: Deactivated successfully.
Dec 6 05:21:29 localhost podman[253905]: 2025-12-06 10:21:29.649278561 +0000 UTC m=+0.087654015 container cleanup ee5218a1c13dfaad2c708c38487798ce1c74ad8d9094f3a24a0877691583f98a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 6 05:21:29 localhost systemd[1]: libpod-conmon-ee5218a1c13dfaad2c708c38487798ce1c74ad8d9094f3a24a0877691583f98a.scope: Deactivated successfully.
Dec 6 05:21:29 localhost podman[253906]: 2025-12-06 10:21:29.688940616 +0000 UTC m=+0.123418109 container remove ee5218a1c13dfaad2c708c38487798ce1c74ad8d9094f3a24a0877691583f98a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 6 05:21:29 localhost ovn_controller[131684]: 2025-12-06T10:21:29Z|00239|binding|INFO|Releasing lport 5fb81146-9b50-4e89-8376-e79195601f2f from this chassis (sb_readonly=0)
Dec 6 05:21:29 localhost kernel: device tap5fb81146-9b left promiscuous mode
Dec 6 05:21:29 localhost ovn_controller[131684]: 2025-12-06T10:21:29Z|00240|binding|INFO|Setting lport 5fb81146-9b50-4e89-8376-e79195601f2f down in Southbound
Dec 6 05:21:29 localhost nova_compute[237281]: 2025-12-06 10:21:29.745 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:21:29 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:29.758 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:feda:30c0/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '8', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5fb81146-9b50-4e89-8376-e79195601f2f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 6 05:21:29 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:29.761 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 5fb81146-9b50-4e89-8376-e79195601f2f in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m
Dec 6 05:21:29 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:29.763 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 6 05:21:29 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:29.764 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[9dddfa03-a2d6-4506-850c-7cf0ac5b8d08]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 6 05:21:29 localhost nova_compute[237281]: 2025-12-06 10:21:29.765 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:21:29 localhost nova_compute[237281]: 2025-12-06 10:21:29.767 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:21:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:30.357 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:91:81 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3ba22961-ba53-4fab-b867-7a59008889f5) old=Port_Binding(mac=['fa:16:3e:4e:91:81 10.100.0.2 2001:db8::f816:3eff:fe4e:9181'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 6 05:21:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:30.359 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3ba22961-ba53-4fab-b867-7a59008889f5 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a updated#033[00m
Dec 6 05:21:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:30.361 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 6 05:21:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:30.362 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[546966df-a211-42b9-9eab-6d8e8e8fff4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 6 05:21:30 localhost systemd[1]: var-lib-containers-storage-overlay-baaadb64ec7be5909e5fb728927961efe89ed81635e183ce8847583131d9e1d0-merged.mount: Deactivated successfully.
Dec 6 05:21:30 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully.
Dec 6 05:21:30 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:30.671 2 INFO neutron.agent.securitygroups_rpc [None req-7291b312-aa36-4480-8b10-390dcf27ea9c 0fdc93b4fbdc4a0489b2d32163fd23bd c2d28aee19a34a88ba22a95c1e6f9ff4 - - default default] Security group member updated ['9e3d2d2c-cfba-4299-841d-6aeb55cb0f7f']#033[00m
Dec 6 05:21:30 localhost nova_compute[237281]: 2025-12-06 10:21:30.854 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:21:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.
Dec 6 05:21:30 localhost podman[253952]: 2025-12-06 10:21:30.939822245 +0000 UTC m=+0.060458106 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 6 05:21:30 localhost dnsmasq[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/addn_hosts - 0 addresses
Dec 6 05:21:30 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/host
Dec 6 05:21:30 localhost dnsmasq-dhcp[252620]: read /var/lib/neutron/dhcp/c61da4f7-8a8f-4ebf-ae35-82f9bd37360a/opts
Dec 6 05:21:30 localhost podman[253963]: 2025-12-06 10:21:30.977749185 +0000 UTC m=+0.075151489 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller)
Dec 6 05:21:30 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:30.992 219384 INFO neutron.agent.linux.ip_lib [None req-8ffcf1c7-20bc-49e9-b7da-fba75799d673 - - - - - -] Device tapcd51097f-9d cannot be used as it has no MAC address#033[00m
Dec 6 05:21:31 localhost nova_compute[237281]: 2025-12-06 10:21:31.028 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:21:31 localhost kernel: device tapcd51097f-9d entered promiscuous mode
Dec 6 05:21:31 localhost NetworkManager[5965]: [1765016491.0370] manager: (tapcd51097f-9d): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Dec 6 05:21:31 localhost nova_compute[237281]: 2025-12-06 10:21:31.037 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:21:31 localhost ovn_controller[131684]: 2025-12-06T10:21:31Z|00241|binding|INFO|Claiming lport cd51097f-9d22-4c16-a737-d03b2eb59af8 for this chassis.
Dec 6 05:21:31 localhost ovn_controller[131684]: 2025-12-06T10:21:31Z|00242|binding|INFO|cd51097f-9d22-4c16-a737-d03b2eb59af8: Claiming unknown
Dec 6 05:21:31 localhost systemd-udevd[254005]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 05:21:31 localhost podman[253963]: 2025-12-06 10:21:31.048822188 +0000 UTC m=+0.146224542 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible)
Dec 6 05:21:31 localhost ovn_controller[131684]: 2025-12-06T10:21:31Z|00243|binding|INFO|Setting lport cd51097f-9d22-4c16-a737-d03b2eb59af8 ovn-installed in OVS
Dec 6 05:21:31 localhost ovn_controller[131684]: 2025-12-06T10:21:31Z|00244|binding|INFO|Setting lport cd51097f-9d22-4c16-a737-d03b2eb59af8 up in Southbound
Dec 6 05:21:31 localhost nova_compute[237281]: 2025-12-06 10:21:31.055 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:21:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:31.056 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cd51097f-9d22-4c16-a737-d03b2eb59af8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 6 05:21:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:31.058 137259 INFO neutron.agent.ovn.metadata.agent [-] Port cd51097f-9d22-4c16-a737-d03b2eb59af8 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m
Dec 6 05:21:31 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully.
Dec 6 05:21:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:31.063 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port ee9eb450-445f-4d87-980a-c355173d6a3c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:21:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:31.064 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:31 localhost journal[186952]: ethtool ioctl error on tapcd51097f-9d: No such device Dec 6 05:21:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:31.065 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[3bd2a919-359c-4dee-870d-dcdd0b570338]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:31 localhost nova_compute[237281]: 2025-12-06 10:21:31.069 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:31 localhost journal[186952]: ethtool ioctl error on tapcd51097f-9d: No such device Dec 6 05:21:31 localhost nova_compute[237281]: 2025-12-06 10:21:31.075 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:31 localhost journal[186952]: ethtool ioctl error on tapcd51097f-9d: No such device Dec 6 05:21:31 localhost journal[186952]: ethtool ioctl error on tapcd51097f-9d: No such device Dec 6 05:21:31 localhost journal[186952]: ethtool ioctl error on tapcd51097f-9d: No such device Dec 6 05:21:31 localhost journal[186952]: ethtool ioctl error on tapcd51097f-9d: No such device Dec 6 05:21:31 localhost journal[186952]: ethtool ioctl 
error on tapcd51097f-9d: No such device Dec 6 05:21:31 localhost journal[186952]: ethtool ioctl error on tapcd51097f-9d: No such device Dec 6 05:21:31 localhost nova_compute[237281]: 2025-12-06 10:21:31.116 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31709 DF PROTO=TCP SPT=50314 DPT=9102 SEQ=3962457516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDC94870000000001030307) Dec 6 05:21:31 localhost nova_compute[237281]: 2025-12-06 10:21:31.141 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:31 localhost dnsmasq[252620]: exiting on receipt of SIGTERM Dec 6 05:21:31 localhost systemd[1]: libpod-9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560.scope: Deactivated successfully. 
Dec 6 05:21:31 localhost podman[254073]: 2025-12-06 10:21:31.867483953 +0000 UTC m=+0.065245393 container kill 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:31 localhost podman[254092]: 2025-12-06 10:21:31.93770831 +0000 UTC m=+0.058536847 container died 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:21:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:21:31 localhost podman[254092]: 2025-12-06 10:21:31.970181132 +0000 UTC m=+0.091009589 container cleanup 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 05:21:31 localhost systemd[1]: libpod-conmon-9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560.scope: Deactivated successfully. Dec 6 05:21:32 localhost podman[254094]: 2025-12-06 10:21:32.012181788 +0000 UTC m=+0.122977945 container remove 9824178c7f54961e323ec5d1e26852a4029062d564f5a421b1ef14d314c33560 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 05:21:32 localhost ovn_controller[131684]: 2025-12-06T10:21:32Z|00245|binding|INFO|Releasing lport d644ff8b-ad4c-42f7-9705-c4a493964de5 from this chassis (sb_readonly=0) Dec 6 05:21:32 localhost kernel: device tapd644ff8b-ad left promiscuous mode Dec 6 05:21:32 localhost ovn_controller[131684]: 2025-12-06T10:21:32Z|00246|binding|INFO|Setting lport d644ff8b-ad4c-42f7-9705-c4a493964de5 down in Southbound Dec 6 05:21:32 localhost nova_compute[237281]: 2025-12-06 10:21:32.068 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:32.077 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c61da4f7-8a8f-4ebf-ae35-82f9bd37360a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c2d28aee19a34a88ba22a95c1e6f9ff4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ebf0c687-30ca-4f18-95f7-e1d37c56c3db, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d644ff8b-ad4c-42f7-9705-c4a493964de5) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:32.079 137259 INFO neutron.agent.ovn.metadata.agent [-] Port d644ff8b-ad4c-42f7-9705-c4a493964de5 in datapath c61da4f7-8a8f-4ebf-ae35-82f9bd37360a unbound from our chassis#033[00m Dec 6 05:21:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:32.082 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 
c61da4f7-8a8f-4ebf-ae35-82f9bd37360a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:32.082 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2771cb-8138-4494-b812-c6ed9b9e28fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:32 localhost nova_compute[237281]: 2025-12-06 10:21:32.085 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:32 localhost podman[254141]: Dec 6 05:21:32 localhost podman[254141]: 2025-12-06 10:21:32.191425078 +0000 UTC m=+0.086374956 container create eaa972f60c7d8a72cc1c66135f5a5b117e9e962279ac078511e199f6973d8de4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:21:32 localhost systemd[1]: Started libpod-conmon-eaa972f60c7d8a72cc1c66135f5a5b117e9e962279ac078511e199f6973d8de4.scope. Dec 6 05:21:32 localhost systemd[1]: Started libcrun container. 
Dec 6 05:21:32 localhost podman[254141]: 2025-12-06 10:21:32.149243786 +0000 UTC m=+0.044193684 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:21:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf90ee638249b6d8e4e10fc40717958febee6fd7208b413f83ae163691cc9956/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:21:32 localhost podman[254141]: 2025-12-06 10:21:32.263949335 +0000 UTC m=+0.158899223 container init eaa972f60c7d8a72cc1c66135f5a5b117e9e962279ac078511e199f6973d8de4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:32 localhost podman[254141]: 2025-12-06 10:21:32.272663804 +0000 UTC m=+0.167613692 container start eaa972f60c7d8a72cc1c66135f5a5b117e9e962279ac078511e199f6973d8de4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:32 localhost dnsmasq[254160]: started, version 2.85 cachesize 150 Dec 6 05:21:32 localhost dnsmasq[254160]: DNS service limited to local subnets Dec 6 05:21:32 localhost dnsmasq[254160]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:21:32 localhost dnsmasq[254160]: warning: no upstream servers configured Dec 6 05:21:32 localhost dnsmasq-dhcp[254160]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:21:32 localhost dnsmasq[254160]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:32 localhost dnsmasq-dhcp[254160]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:32 localhost dnsmasq-dhcp[254160]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:32 localhost ovn_controller[131684]: 2025-12-06T10:21:32Z|00247|binding|INFO|Releasing lport cd51097f-9d22-4c16-a737-d03b2eb59af8 from this chassis (sb_readonly=0) Dec 6 05:21:32 localhost nova_compute[237281]: 2025-12-06 10:21:32.362 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:32 localhost ovn_controller[131684]: 2025-12-06T10:21:32Z|00248|binding|INFO|Setting lport cd51097f-9d22-4c16-a737-d03b2eb59af8 down in Southbound Dec 6 05:21:32 localhost kernel: device tapcd51097f-9d left promiscuous mode Dec 6 05:21:32 localhost nova_compute[237281]: 2025-12-06 10:21:32.381 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:32.451 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cd51097f-9d22-4c16-a737-d03b2eb59af8) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:32.453 137259 INFO neutron.agent.ovn.metadata.agent [-] Port cd51097f-9d22-4c16-a737-d03b2eb59af8 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:21:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:32.455 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:32.456 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[2c207dba-4b12-46ef-88e8-2ca2a113a40a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:32 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:32.579 219384 INFO neutron.agent.dhcp.agent [None 
req-aa610107-3bda-4f50-a45a-51ded43eb98b - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:21:32 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:32.681 219384 INFO neutron.agent.dhcp.agent [None req-2ea9ab56-6b02-4d69-aeb3-90edc4e8b21d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:21:32 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:32.684 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:21:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:32.789 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:91:81 10.100.0.2 2001:db8::f816:3eff:fe4e:9181'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3ba22961-ba53-4fab-b867-7a59008889f5) old=Port_Binding(mac=['fa:16:3e:4e:91:81 10.100.0.2'], 
external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:32.791 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3ba22961-ba53-4fab-b867-7a59008889f5 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a updated#033[00m Dec 6 05:21:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:32.793 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:32.794 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[9c86d0f3-79ce-43fb-9f34-05a49f51a284]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:32 localhost systemd[1]: var-lib-containers-storage-overlay-b43e4bde78f23e8f354a607cb74de1564f2096a4acc8668c3207e7ad83cbf59e-merged.mount: Deactivated successfully. Dec 6 05:21:32 localhost systemd[1]: run-netns-qdhcp\x2dc61da4f7\x2d8a8f\x2d4ebf\x2dae35\x2d82f9bd37360a.mount: Deactivated successfully. 
Dec 6 05:21:32 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:32.985 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:21:33 localhost ovn_controller[131684]: 2025-12-06T10:21:33Z|00249|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:21:33 localhost nova_compute[237281]: 2025-12-06 10:21:33.353 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:33 localhost nova_compute[237281]: 2025-12-06 10:21:33.578 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:33 localhost dnsmasq[254160]: exiting on receipt of SIGTERM Dec 6 05:21:33 localhost podman[254178]: 2025-12-06 10:21:33.613261442 +0000 UTC m=+0.046209457 container kill eaa972f60c7d8a72cc1c66135f5a5b117e9e962279ac078511e199f6973d8de4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:21:33 localhost systemd[1]: libpod-eaa972f60c7d8a72cc1c66135f5a5b117e9e962279ac078511e199f6973d8de4.scope: Deactivated successfully. 
Dec 6 05:21:33 localhost podman[254192]: 2025-12-06 10:21:33.685715647 +0000 UTC m=+0.057227996 container died eaa972f60c7d8a72cc1c66135f5a5b117e9e962279ac078511e199f6973d8de4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eaa972f60c7d8a72cc1c66135f5a5b117e9e962279ac078511e199f6973d8de4-userdata-shm.mount: Deactivated successfully. Dec 6 05:21:33 localhost podman[254192]: 2025-12-06 10:21:33.716193687 +0000 UTC m=+0.087705976 container cleanup eaa972f60c7d8a72cc1c66135f5a5b117e9e962279ac078511e199f6973d8de4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:21:33 localhost systemd[1]: libpod-conmon-eaa972f60c7d8a72cc1c66135f5a5b117e9e962279ac078511e199f6973d8de4.scope: Deactivated successfully. 
Dec 6 05:21:33 localhost podman[254193]: 2025-12-06 10:21:33.757548443 +0000 UTC m=+0.124271734 container remove eaa972f60c7d8a72cc1c66135f5a5b117e9e962279ac078511e199f6973d8de4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:33 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:33.810 219384 INFO neutron.agent.linux.ip_lib [None req-478bc055-0f47-4cb4-9eb3-5a8089b89216 - - - - - -] Device tapcd51097f-9d cannot be used as it has no MAC address#033[00m Dec 6 05:21:33 localhost nova_compute[237281]: 2025-12-06 10:21:33.834 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:33 localhost kernel: device tapcd51097f-9d entered promiscuous mode Dec 6 05:21:33 localhost ovn_controller[131684]: 2025-12-06T10:21:33Z|00250|binding|INFO|Claiming lport cd51097f-9d22-4c16-a737-d03b2eb59af8 for this chassis. Dec 6 05:21:33 localhost systemd-udevd[254007]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:21:33 localhost ovn_controller[131684]: 2025-12-06T10:21:33Z|00251|binding|INFO|cd51097f-9d22-4c16-a737-d03b2eb59af8: Claiming unknown Dec 6 05:21:33 localhost NetworkManager[5965]: [1765016493.8421] manager: (tapcd51097f-9d): new Generic device (/org/freedesktop/NetworkManager/Devices/44) Dec 6 05:21:33 localhost nova_compute[237281]: 2025-12-06 10:21:33.842 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:33 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:33.858 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fea9:ed21/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cd51097f-9d22-4c16-a737-d03b2eb59af8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:33 localhost 
ovn_metadata_agent[137254]: 2025-12-06 10:21:33.861 137259 INFO neutron.agent.ovn.metadata.agent [-] Port cd51097f-9d22-4c16-a737-d03b2eb59af8 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:21:33 localhost systemd[1]: var-lib-containers-storage-overlay-cf90ee638249b6d8e4e10fc40717958febee6fd7208b413f83ae163691cc9956-merged.mount: Deactivated successfully. Dec 6 05:21:33 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:33.865 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port ee9eb450-445f-4d87-980a-c355173d6a3c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:21:33 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:33.866 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:33 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:33.868 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9feb92-0f11-49e6-a097-676225f3560d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:33 localhost ovn_controller[131684]: 2025-12-06T10:21:33Z|00252|binding|INFO|Setting lport cd51097f-9d22-4c16-a737-d03b2eb59af8 ovn-installed in OVS Dec 6 05:21:33 localhost ovn_controller[131684]: 2025-12-06T10:21:33Z|00253|binding|INFO|Setting lport cd51097f-9d22-4c16-a737-d03b2eb59af8 up in Southbound Dec 6 05:21:33 localhost nova_compute[237281]: 2025-12-06 10:21:33.889 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:33 localhost nova_compute[237281]: 2025-12-06 10:21:33.892 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:33 localhost nova_compute[237281]: 2025-12-06 10:21:33.917 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:33 localhost nova_compute[237281]: 2025-12-06 10:21:33.942 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:34 localhost podman[254281]: Dec 6 05:21:34 localhost podman[254281]: 2025-12-06 10:21:34.781687048 +0000 UTC m=+0.092585867 container create 200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:21:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:21:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:21:34 localhost systemd[1]: Started libpod-conmon-200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638.scope. Dec 6 05:21:34 localhost podman[254281]: 2025-12-06 10:21:34.736074021 +0000 UTC m=+0.046972870 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:21:34 localhost systemd[1]: Started libcrun container. 
Dec 6 05:21:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f215a94c37ce3795442036b6b342696bd08819a44aa40e6d6a7d3fd10b3287d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:21:34 localhost podman[254281]: 2025-12-06 10:21:34.85789215 +0000 UTC m=+0.168790969 container init 200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:34 localhost systemd[1]: tmp-crun.kZK0Ij.mount: Deactivated successfully. Dec 6 05:21:34 localhost podman[254281]: 2025-12-06 10:21:34.87122636 +0000 UTC m=+0.182125229 container start 200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 05:21:34 localhost dnsmasq[254319]: started, version 2.85 cachesize 150 Dec 6 05:21:34 localhost dnsmasq[254319]: DNS service limited to local subnets Dec 6 05:21:34 localhost dnsmasq[254319]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile 
Dec 6 05:21:34 localhost dnsmasq[254319]: warning: no upstream servers configured Dec 6 05:21:34 localhost dnsmasq-dhcp[254319]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:21:34 localhost dnsmasq-dhcp[254319]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:21:34 localhost dnsmasq[254319]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:34 localhost dnsmasq-dhcp[254319]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:34 localhost dnsmasq-dhcp[254319]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:34 localhost podman[254295]: 2025-12-06 10:21:34.937864976 +0000 UTC m=+0.115591686 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:21:34 localhost 
neutron_sriov_agent[212548]: 2025-12-06 10:21:34.968 2 INFO neutron.agent.securitygroups_rpc [None req-9196b127-83d0-4fdf-a52e-aecb6bb28160 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:21:34 localhost podman[254295]: 2025-12-06 10:21:34.973169126 +0000 UTC m=+0.150895816 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:21:34 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:21:34 localhost podman[254296]: 2025-12-06 10:21:34.988264021 +0000 UTC m=+0.163509765 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:34 localhost podman[254296]: 2025-12-06 10:21:34.997128545 +0000 UTC m=+0.172374329 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 6 05:21:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:35.019 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, 
binding:vnic_type=normal, created_at=2025-12-06T10:21:34Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=8bf53c21-2b2b-440f-aaab-992490ddca51, ip_allocation=immediate, mac_address=fa:16:3e:47:9c:87, name=tempest-NetworksTestDHCPv6-1590789258, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=35, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['5da4261e-c845-4c3b-9781-6e2d4ae692a5', 'c5aaf315-ae4f-4cbd-a5e0-8e92ff246916'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:21:31Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1388, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:21:34Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:21:35 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:21:35 localhost dnsmasq[254319]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 2 addresses Dec 6 05:21:35 localhost podman[254358]: 2025-12-06 10:21:35.287438871 +0000 UTC m=+0.054725330 container kill 200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 05:21:35 localhost dnsmasq-dhcp[254319]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:35 localhost dnsmasq-dhcp[254319]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:35.341 219384 INFO neutron.agent.dhcp.agent [None req-6d941111-8956-49d8-83ce-39cbd098db21 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', 'cd51097f-9d22-4c16-a737-d03b2eb59af8'} is completed#033[00m Dec 6 05:21:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:35.622 219384 INFO neutron.agent.dhcp.agent [None req-5dfb395e-405e-40b6-b2da-554bd35ddee6 - - - - - -] DHCP configuration for ports {'8bf53c21-2b2b-440f-aaab-992490ddca51'} is completed#033[00m Dec 6 05:21:35 localhost nova_compute[237281]: 2025-12-06 10:21:35.895 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:36 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:36.640 2 INFO neutron.agent.securitygroups_rpc [None req-5011901f-f181-4231-8d45-a0246cc2fc7e 
a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:21:37 localhost dnsmasq[254319]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:37 localhost dnsmasq-dhcp[254319]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:37 localhost podman[254398]: 2025-12-06 10:21:37.185979801 +0000 UTC m=+0.061838408 container kill 200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:21:37 localhost dnsmasq-dhcp[254319]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:38 localhost nova_compute[237281]: 2025-12-06 10:21:38.580 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:39 localhost dnsmasq[254319]: exiting on receipt of SIGTERM Dec 6 05:21:39 localhost podman[254439]: 2025-12-06 10:21:39.075900545 +0000 UTC m=+0.045119293 container kill 200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Dec 6 05:21:39 localhost systemd[1]: libpod-200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638.scope: Deactivated successfully. Dec 6 05:21:39 localhost podman[254452]: 2025-12-06 10:21:39.142425018 +0000 UTC m=+0.055831514 container died 200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638-userdata-shm.mount: Deactivated successfully. Dec 6 05:21:39 localhost podman[254452]: 2025-12-06 10:21:39.169950467 +0000 UTC m=+0.083356913 container cleanup 200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:39 localhost systemd[1]: libpod-conmon-200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638.scope: Deactivated successfully. 
Dec 6 05:21:39 localhost podman[254459]: 2025-12-06 10:21:39.232646421 +0000 UTC m=+0.133010064 container remove 200bba5fbb11e348c4155d76d08a98a2d47ccaeaa85970c72c7fd41634f07638 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31710 DF PROTO=TCP SPT=50314 DPT=9102 SEQ=3962457516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDCB5880000000001030307) Dec 6 05:21:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:21:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:21:40 localhost podman[254528]: Dec 6 05:21:40 localhost podman[254540]: 2025-12-06 10:21:40.058068256 +0000 UTC m=+0.085007433 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:21:40 localhost systemd[1]: var-lib-containers-storage-overlay-0f215a94c37ce3795442036b6b342696bd08819a44aa40e6d6a7d3fd10b3287d-merged.mount: Deactivated successfully. Dec 6 05:21:40 localhost podman[254528]: 2025-12-06 10:21:39.982271097 +0000 UTC m=+0.038701275 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:21:40 localhost podman[254539]: 2025-12-06 10:21:40.100043491 +0000 UTC m=+0.128882037 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent) Dec 6 05:21:40 localhost podman[254539]: 2025-12-06 10:21:40.106324184 +0000 UTC m=+0.135162750 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent) Dec 6 05:21:40 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:21:40 localhost podman[254528]: 2025-12-06 10:21:40.1249675 +0000 UTC m=+0.181397618 container create 526549c188341d69d224baef82d5a7af8dc280b5e0f51a3f740a0e386545f40d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:21:40 localhost systemd[1]: Started libpod-conmon-526549c188341d69d224baef82d5a7af8dc280b5e0f51a3f740a0e386545f40d.scope. 
Dec 6 05:21:40 localhost podman[254540]: 2025-12-06 10:21:40.150748605 +0000 UTC m=+0.177687822 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:21:40 localhost systemd[1]: Started libcrun container. Dec 6 05:21:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a782c0543aa7c97aac4aa1be40607d40cd2578d80f24916cb78394d0475321b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:21:40 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:21:40 localhost podman[254528]: 2025-12-06 10:21:40.169803513 +0000 UTC m=+0.226233631 container init 526549c188341d69d224baef82d5a7af8dc280b5e0f51a3f740a0e386545f40d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Dec 6 05:21:40 localhost podman[254528]: 2025-12-06 10:21:40.178322366 +0000 UTC m=+0.234752504 container start 526549c188341d69d224baef82d5a7af8dc280b5e0f51a3f740a0e386545f40d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 05:21:40 localhost dnsmasq[254581]: started, version 2.85 cachesize 150 Dec 6 05:21:40 localhost dnsmasq[254581]: DNS service limited to local subnets Dec 6 05:21:40 localhost 
dnsmasq[254581]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:21:40 localhost dnsmasq[254581]: warning: no upstream servers configured Dec 6 05:21:40 localhost dnsmasq-dhcp[254581]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:21:40 localhost dnsmasq[254581]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:40 localhost dnsmasq-dhcp[254581]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:21:40 localhost dnsmasq-dhcp[254581]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:21:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:40.384 219384 INFO neutron.agent.dhcp.agent [None req-9cf01985-2d9f-480c-9ae6-56c0937ac5a9 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', 'cd51097f-9d22-4c16-a737-d03b2eb59af8'} is completed#033[00m Dec 6 05:21:40 localhost dnsmasq[254581]: exiting on receipt of SIGTERM Dec 6 05:21:40 localhost podman[254600]: 2025-12-06 10:21:40.501205187 +0000 UTC m=+0.055678149 container kill 526549c188341d69d224baef82d5a7af8dc280b5e0f51a3f740a0e386545f40d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:21:40 localhost systemd[1]: libpod-526549c188341d69d224baef82d5a7af8dc280b5e0f51a3f740a0e386545f40d.scope: Deactivated successfully. 
Dec 6 05:21:40 localhost podman[254614]: 2025-12-06 10:21:40.578345207 +0000 UTC m=+0.061553600 container died 526549c188341d69d224baef82d5a7af8dc280b5e0f51a3f740a0e386545f40d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:21:40 localhost podman[254614]: 2025-12-06 10:21:40.608186067 +0000 UTC m=+0.091394400 container cleanup 526549c188341d69d224baef82d5a7af8dc280b5e0f51a3f740a0e386545f40d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Dec 6 05:21:40 localhost systemd[1]: libpod-conmon-526549c188341d69d224baef82d5a7af8dc280b5e0f51a3f740a0e386545f40d.scope: Deactivated successfully. 
Dec 6 05:21:40 localhost podman[254615]: 2025-12-06 10:21:40.654224408 +0000 UTC m=+0.130894230 container remove 526549c188341d69d224baef82d5a7af8dc280b5e0f51a3f740a0e386545f40d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Dec 6 05:21:40 localhost ovn_controller[131684]: 2025-12-06T10:21:40Z|00254|binding|INFO|Releasing lport cd51097f-9d22-4c16-a737-d03b2eb59af8 from this chassis (sb_readonly=0) Dec 6 05:21:40 localhost ovn_controller[131684]: 2025-12-06T10:21:40Z|00255|binding|INFO|Setting lport cd51097f-9d22-4c16-a737-d03b2eb59af8 down in Southbound Dec 6 05:21:40 localhost kernel: device tapcd51097f-9d left promiscuous mode Dec 6 05:21:40 localhost nova_compute[237281]: 2025-12-06 10:21:40.712 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:40 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:40.721 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fea9:ed21/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cd51097f-9d22-4c16-a737-d03b2eb59af8) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:40 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:40.723 137259 INFO neutron.agent.ovn.metadata.agent [-] Port cd51097f-9d22-4c16-a737-d03b2eb59af8 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:21:40 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:40.725 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:40 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:40.727 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[5305e54c-68d5-46b0-be9a-5ecd719505c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:40 localhost nova_compute[237281]: 2025-12-06 10:21:40.740 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:40 localhost nova_compute[237281]: 2025-12-06 10:21:40.897 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:40.984 219384 INFO neutron.agent.dhcp.agent [None req-7e851aa4-4c55-4f1a-b44b-674b3fa8336f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:21:41 localhost systemd[1]: var-lib-containers-storage-overlay-4a782c0543aa7c97aac4aa1be40607d40cd2578d80f24916cb78394d0475321b-merged.mount: Deactivated successfully. Dec 6 05:21:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-526549c188341d69d224baef82d5a7af8dc280b5e0f51a3f740a0e386545f40d-userdata-shm.mount: Deactivated successfully. Dec 6 05:21:41 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. Dec 6 05:21:43 localhost nova_compute[237281]: 2025-12-06 10:21:43.582 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:43 localhost nova_compute[237281]: 2025-12-06 10:21:43.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:21:43 localhost nova_compute[237281]: 2025-12-06 10:21:43.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:21:45 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:45.444 2 INFO neutron.agent.securitygroups_rpc [None req-98d9481f-b454-4c2e-8eaf-13dab4d914bd 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member 
updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:21:45 localhost nova_compute[237281]: 2025-12-06 10:21:45.900 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:46 localhost openstack_network_exporter[199751]: ERROR 10:21:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:21:46 localhost openstack_network_exporter[199751]: ERROR 10:21:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:21:46 localhost openstack_network_exporter[199751]: ERROR 10:21:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:21:46 localhost openstack_network_exporter[199751]: ERROR 10:21:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:21:46 localhost openstack_network_exporter[199751]: Dec 6 05:21:46 localhost openstack_network_exporter[199751]: ERROR 10:21:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:21:46 localhost openstack_network_exporter[199751]: Dec 6 05:21:46 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:46.925 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:91:81 2001:db8::f816:3eff:fe4e:9181'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3ba22961-ba53-4fab-b867-7a59008889f5) old=Port_Binding(mac=['fa:16:3e:4e:91:81 10.100.0.2 2001:db8::f816:3eff:fe4e:9181'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:46 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:46.927 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3ba22961-ba53-4fab-b867-7a59008889f5 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a updated#033[00m Dec 6 05:21:46 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:46.930 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:46 localhost ovn_metadata_agent[137254]: 2025-12-06 
10:21:46.931 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[474e2f51-afa7-48d7-b7f1-7cee4848c9ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:47 localhost nova_compute[237281]: 2025-12-06 10:21:47.901 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:21:47 localhost nova_compute[237281]: 2025-12-06 10:21:47.902 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:21:48 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:48.156 219384 INFO neutron.agent.linux.ip_lib [None req-c8efc806-f67f-4d34-8e9e-45d06d5fc138 - - - - - -] Device tap671fc3a7-1b cannot be used as it has no MAC address#033[00m Dec 6 05:21:48 localhost nova_compute[237281]: 2025-12-06 10:21:48.181 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:48 localhost kernel: device tap671fc3a7-1b entered promiscuous mode Dec 6 05:21:48 localhost NetworkManager[5965]: [1765016508.1901] manager: (tap671fc3a7-1b): new Generic device (/org/freedesktop/NetworkManager/Devices/45) Dec 6 05:21:48 localhost nova_compute[237281]: 2025-12-06 10:21:48.190 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:48 localhost ovn_controller[131684]: 2025-12-06T10:21:48Z|00256|binding|INFO|Claiming lport 671fc3a7-1b79-440e-9a34-ca7067db8d29 for this chassis. 
Dec 6 05:21:48 localhost ovn_controller[131684]: 2025-12-06T10:21:48Z|00257|binding|INFO|671fc3a7-1b79-440e-9a34-ca7067db8d29: Claiming unknown Dec 6 05:21:48 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:48.193 2 INFO neutron.agent.securitygroups_rpc [None req-0a7def39-83e5-4e5a-9277-4115b0fb949e a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:21:48 localhost systemd-udevd[254652]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:21:48 localhost ovn_controller[131684]: 2025-12-06T10:21:48Z|00258|binding|INFO|Setting lport 671fc3a7-1b79-440e-9a34-ca7067db8d29 ovn-installed in OVS Dec 6 05:21:48 localhost ovn_controller[131684]: 2025-12-06T10:21:48Z|00259|binding|INFO|Setting lport 671fc3a7-1b79-440e-9a34-ca7067db8d29 up in Southbound Dec 6 05:21:48 localhost nova_compute[237281]: 2025-12-06 10:21:48.203 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:48.207 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe77:b0db/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 
'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=671fc3a7-1b79-440e-9a34-ca7067db8d29) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:48 localhost nova_compute[237281]: 2025-12-06 10:21:48.208 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:48.210 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 671fc3a7-1b79-440e-9a34-ca7067db8d29 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:21:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:48.215 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port cf7e8ae4-a7de-4d0a-9af7-1936d0489c13 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:21:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:48.215 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:48.216 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[74d59a71-2a3e-44c6-be10-1f8401c18e73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:48 localhost 
journal[186952]: ethtool ioctl error on tap671fc3a7-1b: No such device Dec 6 05:21:48 localhost nova_compute[237281]: 2025-12-06 10:21:48.231 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:48 localhost journal[186952]: ethtool ioctl error on tap671fc3a7-1b: No such device Dec 6 05:21:48 localhost journal[186952]: ethtool ioctl error on tap671fc3a7-1b: No such device Dec 6 05:21:48 localhost journal[186952]: ethtool ioctl error on tap671fc3a7-1b: No such device Dec 6 05:21:48 localhost journal[186952]: ethtool ioctl error on tap671fc3a7-1b: No such device Dec 6 05:21:48 localhost journal[186952]: ethtool ioctl error on tap671fc3a7-1b: No such device Dec 6 05:21:48 localhost journal[186952]: ethtool ioctl error on tap671fc3a7-1b: No such device Dec 6 05:21:48 localhost journal[186952]: ethtool ioctl error on tap671fc3a7-1b: No such device Dec 6 05:21:48 localhost nova_compute[237281]: 2025-12-06 10:21:48.270 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:48 localhost nova_compute[237281]: 2025-12-06 10:21:48.298 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:48 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:48.329 2 INFO neutron.agent.securitygroups_rpc [None req-a803d118-df79-425d-a706-b120e8173bc3 f618f374afde4343ba53286161ba5ec6 7445feb682a34a189b4a8ce856532376 - - default default] Security group member updated ['db77da59-7505-46d0-bbe6-666c35195446']#033[00m Dec 6 05:21:48 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:48.551 2 INFO neutron.agent.securitygroups_rpc [None req-8b4928a6-90ac-4808-a1e8-9f15d41418c4 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated 
['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:21:48 localhost nova_compute[237281]: 2025-12-06 10:21:48.618 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:48 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:48.860 2 INFO neutron.agent.securitygroups_rpc [None req-8b4928a6-90ac-4808-a1e8-9f15d41418c4 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:21:49 localhost podman[254723]: Dec 6 05:21:49 localhost podman[254723]: 2025-12-06 10:21:49.124993005 +0000 UTC m=+0.094286720 container create 2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:21:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:21:49 localhost systemd[1]: Started libpod-conmon-2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1.scope. Dec 6 05:21:49 localhost podman[254723]: 2025-12-06 10:21:49.079248443 +0000 UTC m=+0.048542218 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:21:49 localhost systemd[1]: Started libcrun container. 
Dec 6 05:21:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aebdff84c85a0142f53918b68a7451a3228e91086a81a7d32274be3f20aa8db7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:21:49 localhost podman[254723]: 2025-12-06 10:21:49.207685696 +0000 UTC m=+0.176979411 container init 2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 6 05:21:49 localhost podman[254723]: 2025-12-06 10:21:49.21689093 +0000 UTC m=+0.186184645 container start 2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0) Dec 6 05:21:49 localhost dnsmasq[254753]: started, version 2.85 cachesize 150 Dec 6 05:21:49 localhost dnsmasq[254753]: DNS service limited to local subnets Dec 6 05:21:49 localhost dnsmasq[254753]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:21:49 localhost dnsmasq[254753]: warning: no upstream servers configured Dec 
6 05:21:49 localhost dnsmasq[254753]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:49 localhost podman[254738]: 2025-12-06 10:21:49.257174833 +0000 UTC m=+0.085082507 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.) Dec 6 05:21:49 localhost podman[254738]: 2025-12-06 10:21:49.272468274 +0000 UTC m=+0.100375968 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7) Dec 6 05:21:49 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:49.289 219384 INFO neutron.agent.dhcp.agent [None req-c8efc806-f67f-4d34-8e9e-45d06d5fc138 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:47Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=47652a42-1eb4-4b4f-a49e-a13c8332773d, ip_allocation=immediate, mac_address=fa:16:3e:62:08:6b, name=tempest-NetworksTestDHCPv6-1040202451, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=38, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['ff5c7605-06d6-40da-b419-b103a8302745'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:21:46Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1462, status=DOWN, 
tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:21:47Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:21:49 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:21:49 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:49.379 219384 INFO neutron.agent.dhcp.agent [None req-d5013c98-5890-4d93-8417-e3cdb9aa28a1 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:21:49 localhost podman[254779]: 2025-12-06 10:21:49.46419092 +0000 UTC m=+0.050281833 container kill 2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:21:49 localhost dnsmasq[254753]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:21:50 localhost systemd[1]: tmp-crun.ZRGetl.mount: Deactivated successfully. 
Dec 6 05:21:50 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:50.196 219384 INFO neutron.agent.dhcp.agent [None req-b99e0a8d-ee62-4fe4-a085-7422c064c614 - - - - - -] DHCP configuration for ports {'47652a42-1eb4-4b4f-a49e-a13c8332773d'} is completed#033[00m Dec 6 05:21:50 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:50.274 2 INFO neutron.agent.securitygroups_rpc [None req-535b527a-ca01-49b0-940d-b206fc12c8af f618f374afde4343ba53286161ba5ec6 7445feb682a34a189b4a8ce856532376 - - default default] Security group member updated ['db77da59-7505-46d0-bbe6-666c35195446']#033[00m Dec 6 05:21:50 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:50.873 2 INFO neutron.agent.securitygroups_rpc [None req-bb22eb1c-e01b-407e-a790-38b902d0f663 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:21:50 localhost nova_compute[237281]: 2025-12-06 10:21:50.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:21:50 localhost nova_compute[237281]: 2025-12-06 10:21:50.902 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:50 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:50.924 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:21:50 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:50.994 2 INFO neutron.agent.securitygroups_rpc [None req-f721a4f9-2ef2-44d7-ad2c-3e31245dcdb7 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:21:51 
localhost dnsmasq[254753]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:21:51 localhost podman[254816]: 2025-12-06 10:21:51.216833599 +0000 UTC m=+0.065487251 container kill 2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:21:51 localhost nova_compute[237281]: 2025-12-06 10:21:51.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:21:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:21:52 localhost podman[254838]: 2025-12-06 10:21:52.542741115 +0000 UTC m=+0.079909667 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:21:52 localhost podman[254838]: 2025-12-06 10:21:52.551112143 +0000 UTC m=+0.088280735 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:21:52 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:21:52 localhost nova_compute[237281]: 2025-12-06 10:21:52.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:21:52 localhost nova_compute[237281]: 2025-12-06 10:21:52.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:21:53 localhost neutron_sriov_agent[212548]: 2025-12-06 10:21:53.256 2 INFO neutron.agent.securitygroups_rpc [None req-a347ff39-d1c1-4f08-b7d2-6492ea85a6ed 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:21:53 localhost podman[197801]: time="2025-12-06T10:21:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:21:53 localhost podman[197801]: @ - - [06/Dec/2025:10:21:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 145869 "" "Go-http-client/1.1" Dec 6 05:21:53 localhost podman[197801]: @ - - [06/Dec/2025:10:21:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16428 "" "Go-http-client/1.1" Dec 6 05:21:53 localhost nova_compute[237281]: 2025-12-06 10:21:53.622 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:53 localhost nova_compute[237281]: 2025-12-06 10:21:53.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:21:53 localhost nova_compute[237281]: 2025-12-06 10:21:53.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 6 05:21:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14213 DF PROTO=TCP SPT=34126 DPT=9102 SEQ=880347751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDCEF160000000001030307) Dec 6 05:21:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14214 DF PROTO=TCP SPT=34126 DPT=9102 SEQ=880347751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDCF3080000000001030307) Dec 6 05:21:55 localhost sshd[254860]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:21:55 localhost podman[254876]: 2025-12-06 10:21:55.907347974 +0000 UTC m=+0.067170084 container kill 2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true) Dec 6 05:21:55 localhost dnsmasq[254753]: exiting on receipt of SIGTERM Dec 6 05:21:55 localhost systemd[1]: tmp-crun.f2qTmJ.mount: Deactivated successfully. 
Dec 6 05:21:55 localhost nova_compute[237281]: 2025-12-06 10:21:55.909 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:55 localhost systemd[1]: libpod-2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1.scope: Deactivated successfully. Dec 6 05:21:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31711 DF PROTO=TCP SPT=50314 DPT=9102 SEQ=3962457516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDCF5870000000001030307) Dec 6 05:21:55 localhost podman[254889]: 2025-12-06 10:21:55.988047052 +0000 UTC m=+0.064916213 container died 2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Dec 6 05:21:56 localhost podman[254889]: 2025-12-06 10:21:56.018587125 +0000 UTC m=+0.095456216 container cleanup 2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3) Dec 6 05:21:56 localhost systemd[1]: libpod-conmon-2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1.scope: Deactivated successfully. Dec 6 05:21:56 localhost podman[254891]: 2025-12-06 10:21:56.064244944 +0000 UTC m=+0.134995796 container remove 2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:21:56 localhost ovn_controller[131684]: 2025-12-06T10:21:56Z|00260|binding|INFO|Releasing lport 671fc3a7-1b79-440e-9a34-ca7067db8d29 from this chassis (sb_readonly=0) Dec 6 05:21:56 localhost ovn_controller[131684]: 2025-12-06T10:21:56Z|00261|binding|INFO|Setting lport 671fc3a7-1b79-440e-9a34-ca7067db8d29 down in Southbound Dec 6 05:21:56 localhost nova_compute[237281]: 2025-12-06 10:21:56.080 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:56 localhost kernel: device tap671fc3a7-1b left promiscuous mode Dec 6 05:21:56 localhost nova_compute[237281]: 2025-12-06 10:21:56.100 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:56.108 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, 
nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=671fc3a7-1b79-440e-9a34-ca7067db8d29) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:21:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:56.110 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 671fc3a7-1b79-440e-9a34-ca7067db8d29 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:21:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:56.112 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:21:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:21:56.113 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[482d0398-f405-4173-8866-9b2322bdfe3d]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:21:56 localhost nova_compute[237281]: 2025-12-06 10:21:56.488 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:21:56 localhost nova_compute[237281]: 2025-12-06 10:21:56.489 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:21:56 localhost nova_compute[237281]: 2025-12-06 10:21:56.489 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:21:56 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:56.795 219384 INFO neutron.agent.dhcp.agent [None req-51d2a860-1cf2-459b-be42-21dba7ec667a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:21:56 localhost systemd[1]: var-lib-containers-storage-overlay-aebdff84c85a0142f53918b68a7451a3228e91086a81a7d32274be3f20aa8db7-merged.mount: Deactivated successfully. Dec 6 05:21:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2eb389d06976e3189a0edd990717d5b13cdac2d79052719db19eb2dedea54be1-userdata-shm.mount: Deactivated successfully. Dec 6 05:21:56 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. 
Dec 6 05:21:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14215 DF PROTO=TCP SPT=34126 DPT=9102 SEQ=880347751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDCFB3E0000000001030307) Dec 6 05:21:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4889 DF PROTO=TCP SPT=36284 DPT=9102 SEQ=1379431737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDCFD880000000001030307) Dec 6 05:21:58 localhost nova_compute[237281]: 2025-12-06 10:21:58.623 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:21:58 localhost nova_compute[237281]: 2025-12-06 10:21:58.650 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:21:58 localhost nova_compute[237281]: 2025-12-06 10:21:58.651 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:21:58 localhost nova_compute[237281]: 2025-12-06 10:21:58.651 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:21:58 localhost nova_compute[237281]: 2025-12-06 10:21:58.651 237285 DEBUG nova.objects.instance [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:21:59 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:21:59.195 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:00 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:00.627 2 INFO neutron.agent.securitygroups_rpc [None req-8cea618e-7896-4502-bac1-db2939f319d4 846cb84ebf0c4340865d754a154550aa 606d49ab85ab4a8096bd0ae42ab3aae9 - - default default] Security group member updated ['0cb0d4f7-ca05-4a96-8a43-bcaf19afc97d']#033[00m Dec 6 05:22:00 localhost nova_compute[237281]: 2025-12-06 10:22:00.907 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:22:01 localhost podman[254918]: 2025-12-06 10:22:01.55060283 +0000 UTC m=+0.083119766 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible) Dec 6 05:22:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14216 DF PROTO=TCP SPT=34126 DPT=9102 SEQ=880347751 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A1DDD0B880000000001030307) Dec 6 05:22:01 localhost podman[254918]: 2025-12-06 10:22:01.622343553 +0000 UTC m=+0.154860479 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125) Dec 6 05:22:01 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:22:02 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:02.270 219384 INFO neutron.agent.linux.ip_lib [None req-129d0460-e1ab-41d6-8289-f83aac349de8 - - - - - -] Device tap44ac8284-a9 cannot be used as it has no MAC address#033[00m Dec 6 05:22:02 localhost nova_compute[237281]: 2025-12-06 10:22:02.295 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:02 localhost kernel: device tap44ac8284-a9 entered promiscuous mode Dec 6 05:22:02 localhost nova_compute[237281]: 2025-12-06 10:22:02.302 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:02 localhost ovn_controller[131684]: 2025-12-06T10:22:02Z|00262|binding|INFO|Claiming lport 44ac8284-a9aa-4cf4-a200-1a17d49f236d for this chassis. Dec 6 05:22:02 localhost ovn_controller[131684]: 2025-12-06T10:22:02Z|00263|binding|INFO|44ac8284-a9aa-4cf4-a200-1a17d49f236d: Claiming unknown Dec 6 05:22:02 localhost NetworkManager[5965]: [1765016522.3039] manager: (tap44ac8284-a9): new Generic device (/org/freedesktop/NetworkManager/Devices/46) Dec 6 05:22:02 localhost systemd-udevd[254955]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:22:02 localhost ovn_controller[131684]: 2025-12-06T10:22:02Z|00264|binding|INFO|Setting lport 44ac8284-a9aa-4cf4-a200-1a17d49f236d ovn-installed in OVS Dec 6 05:22:02 localhost ovn_controller[131684]: 2025-12-06T10:22:02Z|00265|binding|INFO|Setting lport 44ac8284-a9aa-4cf4-a200-1a17d49f236d up in Southbound Dec 6 05:22:02 localhost nova_compute[237281]: 2025-12-06 10:22:02.312 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:02 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:02.312 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe85:cd14/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=44ac8284-a9aa-4cf4-a200-1a17d49f236d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:02 localhost 
nova_compute[237281]: 2025-12-06 10:22:02.314 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:02 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:02.315 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 44ac8284-a9aa-4cf4-a200-1a17d49f236d in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:22:02 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:02.318 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7a036edf-ea0b-433c-a693-20e516b8c6e8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:22:02 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:02.318 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:22:02 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:02.319 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[78afe19b-a967-4469-aaa2-af29c9291167]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:02 localhost journal[186952]: ethtool ioctl error on tap44ac8284-a9: No such device Dec 6 05:22:02 localhost journal[186952]: ethtool ioctl error on tap44ac8284-a9: No such device Dec 6 05:22:02 localhost journal[186952]: ethtool ioctl error on tap44ac8284-a9: No such device Dec 6 05:22:02 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:02.423 2 INFO neutron.agent.securitygroups_rpc [None req-d8bcab41-aa9b-4eba-96b4-1c38a954a203 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 
6 05:22:02 localhost nova_compute[237281]: 2025-12-06 10:22:02.422 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:02 localhost journal[186952]: ethtool ioctl error on tap44ac8284-a9: No such device Dec 6 05:22:02 localhost journal[186952]: ethtool ioctl error on tap44ac8284-a9: No such device Dec 6 05:22:02 localhost journal[186952]: ethtool ioctl error on tap44ac8284-a9: No such device Dec 6 05:22:02 localhost journal[186952]: ethtool ioctl error on tap44ac8284-a9: No such device Dec 6 05:22:02 localhost journal[186952]: ethtool ioctl error on tap44ac8284-a9: No such device Dec 6 05:22:02 localhost nova_compute[237281]: 2025-12-06 10:22:02.447 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:02 localhost nova_compute[237281]: 2025-12-06 10:22:02.463 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:02 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:02.910 2 INFO neutron.agent.securitygroups_rpc [None req-f4fcfc2d-ed5d-4fb1-9e90-5b064d0f664b 846cb84ebf0c4340865d754a154550aa 606d49ab85ab4a8096bd0ae42ab3aae9 - - default default] Security group member updated ['0cb0d4f7-ca05-4a96-8a43-bcaf19afc97d']#033[00m Dec 6 05:22:03 localhost podman[255026]: Dec 6 05:22:03 localhost podman[255026]: 2025-12-06 10:22:03.224634755 +0000 UTC m=+0.089213534 container create 4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base 
Image, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:22:03 localhost systemd[1]: Started libpod-conmon-4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4.scope. Dec 6 05:22:03 localhost systemd[1]: Started libcrun container. Dec 6 05:22:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75eb76d70806674c41da1a0bad7f6e4f69b19201d7a7e42adde5fef4842fca0a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:22:03 localhost podman[255026]: 2025-12-06 10:22:03.283059866 +0000 UTC m=+0.147638675 container init 4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 05:22:03 localhost podman[255026]: 2025-12-06 10:22:03.189969625 +0000 UTC m=+0.054548464 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:22:03 localhost podman[255026]: 2025-12-06 10:22:03.290097084 +0000 UTC m=+0.154675893 container start 4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 05:22:03 localhost dnsmasq[255044]: started, version 2.85 cachesize 150 Dec 6 05:22:03 localhost dnsmasq[255044]: DNS service limited to local subnets Dec 6 05:22:03 localhost dnsmasq[255044]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:22:03 localhost dnsmasq[255044]: warning: no upstream servers configured Dec 6 05:22:03 localhost dnsmasq-dhcp[255044]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:22:03 localhost dnsmasq[255044]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:22:03 localhost dnsmasq-dhcp[255044]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:03 localhost dnsmasq-dhcp[255044]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:03 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:03.347 219384 INFO neutron.agent.dhcp.agent [None req-129d0460-e1ab-41d6-8289-f83aac349de8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:01Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d7a5f6f7-6009-45a4-8f9d-deb0a0749d97, ip_allocation=immediate, mac_address=fa:16:3e:b5:87:6e, name=tempest-NetworksTestDHCPv6-746061690, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, 
project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=40, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['6363ff7e-81e6-454d-bc29-ab16a60a4f90'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:21:58Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1515, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:22:02Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:22:03 localhost dnsmasq[255044]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:22:03 localhost dnsmasq-dhcp[255044]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:03 localhost dnsmasq-dhcp[255044]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:03 localhost podman[255063]: 2025-12-06 10:22:03.538937001 +0000 UTC m=+0.062426787 container kill 4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:22:03 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:03.557 219384 INFO neutron.agent.dhcp.agent 
[None req-a6344e7c-83d0-443c-a540-a080a762edf3 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:22:03 localhost nova_compute[237281]: 2025-12-06 10:22:03.625 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:03 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:03.842 219384 INFO neutron.agent.dhcp.agent [None req-8d8c9b59-f356-4921-afaf-0b20f6035fe6 - - - - - -] DHCP configuration for ports {'d7a5f6f7-6009-45a4-8f9d-deb0a0749d97'} is completed#033[00m Dec 6 05:22:05 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:05.034 2 INFO neutron.agent.securitygroups_rpc [None req-108519db-c721-42de-87e4-26fab89b5730 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.059 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": 
"l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.119 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.120 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.121 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.122 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.153 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.153 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.154 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.154 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.256 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:22:05 localhost dnsmasq[255044]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:22:05 localhost dnsmasq-dhcp[255044]: read 
/var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:05 localhost dnsmasq-dhcp[255044]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:05 localhost podman[255101]: 2025-12-06 10:22:05.296200932 +0000 UTC m=+0.071798477 container kill 4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:22:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:22:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.341 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.343 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:22:05 localhost podman[255114]: 2025-12-06 10:22:05.410038884 +0000 UTC m=+0.085715436 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.412 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.414 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:22:05 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:05.418 2 INFO neutron.agent.securitygroups_rpc [None req-f28c8ea8-1a5b-4fd9-b4b9-f0c1abe7d465 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:22:05 localhost podman[255114]: 2025-12-06 10:22:05.420270209 +0000 UTC m=+0.095946751 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, 
maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:22:05 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:22:05 localhost podman[255115]: 2025-12-06 10:22:05.4698955 +0000 UTC m=+0.141864287 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.486 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 
- - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.487 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:22:05 localhost podman[255115]: 2025-12-06 10:22:05.51038154 +0000 UTC m=+0.182350357 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 6 05:22:05 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.554 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:22:05 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:05.589 2 INFO neutron.agent.securitygroups_rpc [None req-0f3f0f61-c93f-4fe8-ab5b-7d9c9ca19e71 846cb84ebf0c4340865d754a154550aa 606d49ab85ab4a8096bd0ae42ab3aae9 - - default default] Security group member updated ['0cb0d4f7-ca05-4a96-8a43-bcaf19afc97d']#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.794 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.796 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12297MB free_disk=387.2664031982422GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.796 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.797 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:22:05 localhost nova_compute[237281]: 2025-12-06 10:22:05.943 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:06 localhost nova_compute[237281]: 2025-12-06 10:22:06.273 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:22:06 localhost nova_compute[237281]: 2025-12-06 10:22:06.275 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:22:06 localhost nova_compute[237281]: 2025-12-06 10:22:06.275 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:22:06 localhost nova_compute[237281]: 2025-12-06 10:22:06.670 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing inventories for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 6 05:22:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:06.706 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:22:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:06.707 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:22:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:06.708 137259 DEBUG 
oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.035 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating ProviderTree inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.036 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.100 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing aggregate associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, aggregates: None 
_refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.158 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing trait associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, traits: COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_ABM,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.261 237285 DEBUG 
nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:22:07 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:07.261 219384 INFO neutron.agent.linux.ip_lib [None req-3693cd67-2c64-4ba9-8cb2-3ee540326b69 - - - - - -] Device tap2ad1625b-08 cannot be used as it has no MAC address#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.332 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.335 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.335 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:22:07 
localhost nova_compute[237281]: 2025-12-06 10:22:07.336 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:07 localhost kernel: device tap2ad1625b-08 entered promiscuous mode Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.338 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:07 localhost NetworkManager[5965]: [1765016527.3393] manager: (tap2ad1625b-08): new Generic device (/org/freedesktop/NetworkManager/Devices/47) Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.339 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:07 localhost ovn_controller[131684]: 2025-12-06T10:22:07Z|00266|binding|INFO|Claiming lport 2ad1625b-0871-4fee-9bb7-395a7ec28d34 for this chassis. Dec 6 05:22:07 localhost ovn_controller[131684]: 2025-12-06T10:22:07Z|00267|binding|INFO|2ad1625b-0871-4fee-9bb7-395a7ec28d34: Claiming unknown Dec 6 05:22:07 localhost systemd-udevd[255192]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:22:07 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:07.357 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-b4d36725-9df5-4789-a6cf-703223d9fb93', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b4d36725-9df5-4789-a6cf-703223d9fb93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0106a81ef74b47c7a82388dd897a85a4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=91d1dfae-3b5e-40e2-8cac-b30bcb57dbe4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2ad1625b-0871-4fee-9bb7-395a7ec28d34) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:07 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:07.360 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 2ad1625b-0871-4fee-9bb7-395a7ec28d34 in datapath b4d36725-9df5-4789-a6cf-703223d9fb93 bound to our chassis#033[00m Dec 6 05:22:07 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:07.362 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b4d36725-9df5-4789-a6cf-703223d9fb93 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:22:07 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:07.366 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[1684e111-cea8-443f-a063-365995027dcd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.376 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.376 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 6 05:22:07 localhost ovn_controller[131684]: 2025-12-06T10:22:07Z|00268|binding|INFO|Setting lport 2ad1625b-0871-4fee-9bb7-395a7ec28d34 ovn-installed in OVS Dec 6 05:22:07 localhost ovn_controller[131684]: 2025-12-06T10:22:07Z|00269|binding|INFO|Setting lport 2ad1625b-0871-4fee-9bb7-395a7ec28d34 up in Southbound Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.385 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.405 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.434 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:07 localhost 
nova_compute[237281]: 2025-12-06 10:22:07.464 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:07 localhost podman[255207]: 2025-12-06 10:22:07.526052681 +0000 UTC m=+0.067358155 container kill 4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:22:07 localhost dnsmasq[255044]: exiting on receipt of SIGTERM Dec 6 05:22:07 localhost systemd[1]: libpod-4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4.scope: Deactivated successfully. Dec 6 05:22:07 localhost podman[255224]: 2025-12-06 10:22:07.606931722 +0000 UTC m=+0.064161777 container died 4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:22:07 localhost systemd[1]: tmp-crun.BzhybL.mount: Deactivated successfully. 
Dec 6 05:22:07 localhost podman[255224]: 2025-12-06 10:22:07.658124649 +0000 UTC m=+0.115354704 container cleanup 4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 05:22:07 localhost systemd[1]: libpod-conmon-4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4.scope: Deactivated successfully. Dec 6 05:22:07 localhost podman[255227]: 2025-12-06 10:22:07.712204915 +0000 UTC m=+0.149129265 container remove 4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 6 05:22:07 localhost kernel: device tap44ac8284-a9 left promiscuous mode Dec 6 05:22:07 localhost ovn_controller[131684]: 2025-12-06T10:22:07Z|00270|binding|INFO|Releasing lport 44ac8284-a9aa-4cf4-a200-1a17d49f236d from this chassis (sb_readonly=0) Dec 6 05:22:07 localhost ovn_controller[131684]: 2025-12-06T10:22:07Z|00271|binding|INFO|Setting lport 44ac8284-a9aa-4cf4-a200-1a17d49f236d down in Southbound Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.728 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:07 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:07.739 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe85:cd14/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=44ac8284-a9aa-4cf4-a200-1a17d49f236d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:07 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:07.742 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 44ac8284-a9aa-4cf4-a200-1a17d49f236d in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:22:07 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:07.746 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were 
found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:22:07 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:07.747 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[5527706b-68cf-4d19-ab12-a32ebcddd336]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:07 localhost nova_compute[237281]: 2025-12-06 10:22:07.750 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:07 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:07.984 2 INFO neutron.agent.securitygroups_rpc [None req-fc66e946-96a6-4aa2-98af-1af48b887e05 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:22:07 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:07.989 2 INFO neutron.agent.securitygroups_rpc [None req-bfe96b99-4c57-4196-ac1e-ce1658849555 52b198c8ea4742deba2a478ece85b72b 0106a81ef74b47c7a82388dd897a85a4 - - default default] Security group member updated ['790ccf2a-2599-418b-be2d-e766c465dbc3']#033[00m Dec 6 05:22:08 localhost systemd[1]: var-lib-containers-storage-overlay-75eb76d70806674c41da1a0bad7f6e4f69b19201d7a7e42adde5fef4842fca0a-merged.mount: Deactivated successfully. Dec 6 05:22:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4d27fd969f8c8a1d7fc2180f7e423d992e5f21e5a5ec1d2a1e9e1c9be6c41db4-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:22:08 localhost podman[255298]: Dec 6 05:22:08 localhost podman[255298]: 2025-12-06 10:22:08.397830161 +0000 UTC m=+0.102025053 container create 5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b4d36725-9df5-4789-a6cf-703223d9fb93, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:22:08 localhost systemd[1]: Started libpod-conmon-5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9.scope. Dec 6 05:22:08 localhost podman[255298]: 2025-12-06 10:22:08.343580751 +0000 UTC m=+0.047775673 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:22:08 localhost systemd[1]: Started libcrun container. 
Dec 6 05:22:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/610cb072ba532162b9af2660eb4c54f2c3bbf9ef5f27f5ca41ccf286e833d8c1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:22:08 localhost podman[255298]: 2025-12-06 10:22:08.464250597 +0000 UTC m=+0.168445489 container init 5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b4d36725-9df5-4789-a6cf-703223d9fb93, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:22:08 localhost podman[255298]: 2025-12-06 10:22:08.478291389 +0000 UTC m=+0.182486301 container start 5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b4d36725-9df5-4789-a6cf-703223d9fb93, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:22:08 localhost dnsmasq[255317]: started, version 2.85 cachesize 150 Dec 6 05:22:08 localhost dnsmasq[255317]: DNS service limited to local subnets Dec 6 05:22:08 localhost dnsmasq[255317]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:22:08 localhost dnsmasq[255317]: warning: no upstream servers configured Dec 
6 05:22:08 localhost dnsmasq-dhcp[255317]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:22:08 localhost dnsmasq[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/addn_hosts - 0 addresses Dec 6 05:22:08 localhost dnsmasq-dhcp[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/host Dec 6 05:22:08 localhost dnsmasq-dhcp[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/opts Dec 6 05:22:08 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:08.541 219384 INFO neutron.agent.dhcp.agent [None req-28385ede-a159-47c1-9f39-f3da6fbdb609 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:08 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:08.544 219384 INFO neutron.agent.dhcp.agent [None req-3693cd67-2c64-4ba9-8cb2-3ee540326b69 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:07Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b229f75d-23b9-4cf6-883c-8c72a1fda733, ip_allocation=immediate, mac_address=fa:16:3e:3e:83:ab, name=tempest-ExtraDHCPOptionsIpV6TestJSON-1294717467, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:22:02Z, description=, dns_domain=, id=b4d36725-9df5-4789-a6cf-703223d9fb93, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-748050176, port_security_enabled=True, project_id=0106a81ef74b47c7a82388dd897a85a4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25715, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1519, status=ACTIVE, 
subnets=['aaa6dca9-4ebd-43e6-99c2-41dbe31cc138'], tags=[], tenant_id=0106a81ef74b47c7a82388dd897a85a4, updated_at=2025-12-06T10:22:05Z, vlan_transparent=None, network_id=b4d36725-9df5-4789-a6cf-703223d9fb93, port_security_enabled=True, project_id=0106a81ef74b47c7a82388dd897a85a4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['790ccf2a-2599-418b-be2d-e766c465dbc3'], standard_attr_id=1537, status=DOWN, tags=[], tenant_id=0106a81ef74b47c7a82388dd897a85a4, updated_at=2025-12-06T10:22:07Z on network b4d36725-9df5-4789-a6cf-703223d9fb93#033[00m Dec 6 05:22:08 localhost nova_compute[237281]: 2025-12-06 10:22:08.659 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:08 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:08.718 219384 INFO neutron.agent.dhcp.agent [None req-77626dbe-4db1-4efa-a228-d0c30016bf0a - - - - - -] DHCP configuration for ports {'cea53a23-b49d-4dae-afba-ee25c179ebfb'} is completed#033[00m Dec 6 05:22:08 localhost dnsmasq[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/addn_hosts - 1 addresses Dec 6 05:22:08 localhost podman[255336]: 2025-12-06 10:22:08.771669494 +0000 UTC m=+0.063764615 container kill 5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b4d36725-9df5-4789-a6cf-703223d9fb93, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3) Dec 6 05:22:08 localhost dnsmasq-dhcp[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/host Dec 6 
05:22:08 localhost dnsmasq-dhcp[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/opts Dec 6 05:22:09 localhost systemd[1]: tmp-crun.Soi7EA.mount: Deactivated successfully. Dec 6 05:22:09 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. Dec 6 05:22:09 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:09.314 219384 INFO neutron.agent.dhcp.agent [None req-b1dc6f58-8a3a-4d3b-a86f-28f56e22a48c - - - - - -] DHCP configuration for ports {'b229f75d-23b9-4cf6-883c-8c72a1fda733'} is completed#033[00m Dec 6 05:22:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14217 DF PROTO=TCP SPT=34126 DPT=9102 SEQ=880347751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDD2B870000000001030307) Dec 6 05:22:10 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:10.041 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:10 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:10.043 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:22:10 localhost nova_compute[237281]: 2025-12-06 10:22:10.088 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:10 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:10.090 2 INFO neutron.agent.securitygroups_rpc [None req-c61335ed-e4f7-4ee8-9827-57d57963bc8e 52b198c8ea4742deba2a478ece85b72b 0106a81ef74b47c7a82388dd897a85a4 - - default default] Security group member updated ['790ccf2a-2599-418b-be2d-e766c465dbc3']#033[00m Dec 6 05:22:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:10.320 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:09Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=c741747c-587d-4f09-84bb-60a0f91f5840, ip_allocation=immediate, mac_address=fa:16:3e:f9:a9:b7, name=tempest-ExtraDHCPOptionsIpV6TestJSON-1618021763, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:22:02Z, description=, dns_domain=, id=b4d36725-9df5-4789-a6cf-703223d9fb93, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-748050176, port_security_enabled=True, project_id=0106a81ef74b47c7a82388dd897a85a4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25715, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1519, status=ACTIVE, subnets=['aaa6dca9-4ebd-43e6-99c2-41dbe31cc138'], tags=[], tenant_id=0106a81ef74b47c7a82388dd897a85a4, updated_at=2025-12-06T10:22:05Z, vlan_transparent=None, network_id=b4d36725-9df5-4789-a6cf-703223d9fb93, port_security_enabled=True, project_id=0106a81ef74b47c7a82388dd897a85a4, qos_network_policy_id=None, 
qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['790ccf2a-2599-418b-be2d-e766c465dbc3'], standard_attr_id=1543, status=DOWN, tags=[], tenant_id=0106a81ef74b47c7a82388dd897a85a4, updated_at=2025-12-06T10:22:09Z on network b4d36725-9df5-4789-a6cf-703223d9fb93#033[00m Dec 6 05:22:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:10.337 219384 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions#033[00m Dec 6 05:22:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:10.338 219384 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions#033[00m Dec 6 05:22:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:10.338 219384 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions#033[00m Dec 6 05:22:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:22:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:22:10 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:10.473 2 INFO neutron.agent.securitygroups_rpc [None req-fca92ec1-aaf9-4d25-917d-173abbf6a48f 846cb84ebf0c4340865d754a154550aa 606d49ab85ab4a8096bd0ae42ab3aae9 - - default default] Security group member updated ['0cb0d4f7-ca05-4a96-8a43-bcaf19afc97d']#033[00m Dec 6 05:22:10 localhost podman[255376]: 2025-12-06 10:22:10.512875512 +0000 UTC m=+0.067112438 container kill 5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b4d36725-9df5-4789-a6cf-703223d9fb93, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 6 05:22:10 localhost dnsmasq[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/addn_hosts - 2 addresses Dec 6 05:22:10 localhost dnsmasq-dhcp[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/host Dec 6 05:22:10 localhost dnsmasq-dhcp[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/opts Dec 6 05:22:10 localhost podman[255382]: 2025-12-06 10:22:10.558062144 +0000 UTC m=+0.088363612 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:22:10 localhost podman[255382]: 2025-12-06 10:22:10.593392842 +0000 UTC m=+0.123694340 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Dec 6 05:22:10 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:22:10 localhost podman[255386]: 2025-12-06 10:22:10.669185927 +0000 UTC m=+0.196442301 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 05:22:10 localhost podman[255386]: 2025-12-06 10:22:10.707293711 +0000 UTC m=+0.234550175 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 6 05:22:10 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:22:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:10.817 219384 INFO neutron.agent.dhcp.agent [None req-a8938582-caa7-4a36-8711-158e42c609ce - - - - - -] DHCP configuration for ports {'c741747c-587d-4f09-84bb-60a0f91f5840'} is completed#033[00m Dec 6 05:22:10 localhost nova_compute[237281]: 2025-12-06 10:22:10.946 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:12 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:12.263 2 INFO neutron.agent.securitygroups_rpc [None req-2fe0e904-9183-4779-a7c9-3c98d40537cd 52b198c8ea4742deba2a478ece85b72b 0106a81ef74b47c7a82388dd897a85a4 - - default default] Security group member updated ['790ccf2a-2599-418b-be2d-e766c465dbc3']#033[00m Dec 6 05:22:12 localhost dnsmasq[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/addn_hosts - 1 addresses Dec 6 05:22:12 localhost dnsmasq-dhcp[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/host Dec 6 05:22:12 localhost dnsmasq-dhcp[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/opts Dec 6 05:22:12 localhost podman[255452]: 2025-12-06 10:22:12.500775788 +0000 UTC m=+0.061699892 container kill 5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b4d36725-9df5-4789-a6cf-703223d9fb93, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 05:22:13 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:13.231 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:07Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=b229f75d-23b9-4cf6-883c-8c72a1fda733, ip_allocation=immediate, mac_address=fa:16:3e:3e:83:ab, name=tempest-new-port-name-148287439, network_id=b4d36725-9df5-4789-a6cf-703223d9fb93, port_security_enabled=True, project_id=0106a81ef74b47c7a82388dd897a85a4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['790ccf2a-2599-418b-be2d-e766c465dbc3'], standard_attr_id=1537, status=DOWN, tags=[], tenant_id=0106a81ef74b47c7a82388dd897a85a4, updated_at=2025-12-06T10:22:13Z on network b4d36725-9df5-4789-a6cf-703223d9fb93#033[00m Dec 6 05:22:13 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:13.249 219384 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions#033[00m Dec 6 05:22:13 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:13.250 219384 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions#033[00m Dec 6 05:22:13 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:13.250 219384 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions#033[00m Dec 6 05:22:13 localhost dnsmasq[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/addn_hosts - 1 addresses Dec 6 05:22:13 localhost dnsmasq-dhcp[255317]: read 
/var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/host Dec 6 05:22:13 localhost dnsmasq-dhcp[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/opts Dec 6 05:22:13 localhost podman[255490]: 2025-12-06 10:22:13.408323449 +0000 UTC m=+0.050343171 container kill 5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b4d36725-9df5-4789-a6cf-703223d9fb93, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 6 05:22:13 localhost nova_compute[237281]: 2025-12-06 10:22:13.662 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:13 localhost nova_compute[237281]: 2025-12-06 10:22:13.797 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:13 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:13.973 219384 INFO neutron.agent.dhcp.agent [None req-4f305ba5-a180-4f68-a140-26a638b8fee6 - - - - - -] DHCP configuration for ports {'b229f75d-23b9-4cf6-883c-8c72a1fda733'} is completed#033[00m Dec 6 05:22:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:14.045 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) 
do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:22:14 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:14.081 219384 INFO neutron.agent.linux.ip_lib [None req-d44bcce9-ffdf-4a25-9b29-4682ea03bf19 - - - - - -] Device tap1c67e6c6-c3 cannot be used as it has no MAC address#033[00m Dec 6 05:22:14 localhost nova_compute[237281]: 2025-12-06 10:22:14.098 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:14 localhost kernel: device tap1c67e6c6-c3 entered promiscuous mode Dec 6 05:22:14 localhost NetworkManager[5965]: [1765016534.1048] manager: (tap1c67e6c6-c3): new Generic device (/org/freedesktop/NetworkManager/Devices/48) Dec 6 05:22:14 localhost nova_compute[237281]: 2025-12-06 10:22:14.105 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:14 localhost ovn_controller[131684]: 2025-12-06T10:22:14Z|00272|binding|INFO|Claiming lport 1c67e6c6-c3eb-4d27-ae32-270ccbf6baf2 for this chassis. Dec 6 05:22:14 localhost ovn_controller[131684]: 2025-12-06T10:22:14Z|00273|binding|INFO|1c67e6c6-c3eb-4d27-ae32-270ccbf6baf2: Claiming unknown Dec 6 05:22:14 localhost systemd-udevd[255521]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:22:14 localhost ovn_controller[131684]: 2025-12-06T10:22:14Z|00274|binding|INFO|Setting lport 1c67e6c6-c3eb-4d27-ae32-270ccbf6baf2 ovn-installed in OVS Dec 6 05:22:14 localhost nova_compute[237281]: 2025-12-06 10:22:14.117 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:14 localhost ovn_controller[131684]: 2025-12-06T10:22:14Z|00275|binding|INFO|Setting lport 1c67e6c6-c3eb-4d27-ae32-270ccbf6baf2 up in Southbound Dec 6 05:22:14 localhost nova_compute[237281]: 2025-12-06 10:22:14.123 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:14.123 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec0:6b4d/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], 
requested_chassis=[], logical_port=1c67e6c6-c3eb-4d27-ae32-270ccbf6baf2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:14.126 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 1c67e6c6-c3eb-4d27-ae32-270ccbf6baf2 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:22:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:14.128 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port f759d271-a474-4eae-9b3f-688f9b32e380 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:22:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:14.129 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:22:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:14.129 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[d2330ccd-2107-4a27-ac68-7cf67d9e9f92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:14 localhost nova_compute[237281]: 2025-12-06 10:22:14.148 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:14 localhost nova_compute[237281]: 2025-12-06 10:22:14.176 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:14 localhost nova_compute[237281]: 2025-12-06 10:22:14.206 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:14 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:14.737 2 INFO neutron.agent.securitygroups_rpc [None req-8d1b2582-3d5a-4297-824d-4ec83b23e1a0 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:22:15 localhost podman[255576]: Dec 6 05:22:15 localhost podman[255576]: 2025-12-06 10:22:15.023933049 +0000 UTC m=+0.081938275 container create 3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:22:15 localhost systemd[1]: Started libpod-conmon-3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2.scope. Dec 6 05:22:15 localhost podman[255576]: 2025-12-06 10:22:14.977066815 +0000 UTC m=+0.035071991 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:22:15 localhost systemd[1]: Started libcrun container. 
Dec 6 05:22:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60627a329cf53fddbcca048bf324a2d652fdbf5c1b3c83fc17324bf1c3fc914e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:22:15 localhost podman[255576]: 2025-12-06 10:22:15.108308227 +0000 UTC m=+0.166313383 container init 3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:22:15 localhost podman[255576]: 2025-12-06 10:22:15.119038648 +0000 UTC m=+0.177043854 container start 3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:22:15 localhost dnsmasq[255594]: started, version 2.85 cachesize 150 Dec 6 05:22:15 localhost dnsmasq[255594]: DNS service limited to local subnets Dec 6 05:22:15 localhost dnsmasq[255594]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:22:15 localhost dnsmasq[255594]: warning: no upstream servers configured Dec 
6 05:22:15 localhost dnsmasq[255594]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:22:15 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:15.182 219384 INFO neutron.agent.dhcp.agent [None req-d44bcce9-ffdf-4a25-9b29-4682ea03bf19 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:13Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=84242d63-2e00-4083-8279-0798e5f9e9f8, ip_allocation=immediate, mac_address=fa:16:3e:49:cb:42, name=tempest-NetworksTestDHCPv6-778431530, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=42, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['fea8544c-e532-4fb4-8ac2-df40e3071bcd'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:22:10Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1569, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:22:13Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 
05:22:15 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:15.354 2 INFO neutron.agent.securitygroups_rpc [None req-fb35731e-8499-4f9f-8cb7-45955e551bcf 52b198c8ea4742deba2a478ece85b72b 0106a81ef74b47c7a82388dd897a85a4 - - default default] Security group member updated ['790ccf2a-2599-418b-be2d-e766c465dbc3']#033[00m Dec 6 05:22:15 localhost dnsmasq[255594]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:22:15 localhost podman[255612]: 2025-12-06 10:22:15.373703761 +0000 UTC m=+0.059098081 container kill 3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2) Dec 6 05:22:15 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:15.397 219384 INFO neutron.agent.dhcp.agent [None req-01960f9d-32b0-4033-91dd-56518019d951 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:22:15 localhost podman[255652]: 2025-12-06 10:22:15.577644812 +0000 UTC m=+0.058187093 container kill 5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b4d36725-9df5-4789-a6cf-703223d9fb93, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:22:15 localhost dnsmasq[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/addn_hosts - 0 addresses Dec 6 05:22:15 localhost dnsmasq-dhcp[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/host Dec 6 05:22:15 localhost dnsmasq-dhcp[255317]: read /var/lib/neutron/dhcp/b4d36725-9df5-4789-a6cf-703223d9fb93/opts Dec 6 05:22:15 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:15.579 219384 INFO neutron.agent.linux.ip_lib [None req-4c66c59e-0d99-473a-abef-8d79cd0ef5db - - - - - -] Device tapfd3bc1cb-d1 cannot be used as it has no MAC address#033[00m Dec 6 05:22:15 localhost nova_compute[237281]: 2025-12-06 10:22:15.644 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:15 localhost kernel: device tapfd3bc1cb-d1 entered promiscuous mode Dec 6 05:22:15 localhost ovn_controller[131684]: 2025-12-06T10:22:15Z|00276|binding|INFO|Claiming lport fd3bc1cb-d1d9-4949-8206-e60804b5fbba for this chassis. Dec 6 05:22:15 localhost ovn_controller[131684]: 2025-12-06T10:22:15Z|00277|binding|INFO|fd3bc1cb-d1d9-4949-8206-e60804b5fbba: Claiming unknown Dec 6 05:22:15 localhost systemd-udevd[255523]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:22:15 localhost NetworkManager[5965]: [1765016535.6508] manager: (tapfd3bc1cb-d1): new Generic device (/org/freedesktop/NetworkManager/Devices/49) Dec 6 05:22:15 localhost ovn_controller[131684]: 2025-12-06T10:22:15Z|00278|binding|INFO|Setting lport fd3bc1cb-d1d9-4949-8206-e60804b5fbba ovn-installed in OVS Dec 6 05:22:15 localhost ovn_controller[131684]: 2025-12-06T10:22:15Z|00279|binding|INFO|Setting lport fd3bc1cb-d1d9-4949-8206-e60804b5fbba up in Southbound Dec 6 05:22:15 localhost nova_compute[237281]: 2025-12-06 10:22:15.670 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:15 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:15.670 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f3835a8c-8fc4-4df4-8e9e-177c5351209d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3835a8c-8fc4-4df4-8e9e-177c5351209d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0901295e4a3a44e89ad3e6a450608d11', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a412acf0-146c-4d63-8540-29f9f589a7a8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fd3bc1cb-d1d9-4949-8206-e60804b5fbba) 
old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:15 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:15.674 137259 INFO neutron.agent.ovn.metadata.agent [-] Port fd3bc1cb-d1d9-4949-8206-e60804b5fbba in datapath f3835a8c-8fc4-4df4-8e9e-177c5351209d bound to our chassis#033[00m Dec 6 05:22:15 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:15.675 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f3835a8c-8fc4-4df4-8e9e-177c5351209d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:22:15 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:15.676 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[893becea-9529-421e-af31-79495a07107f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:15 localhost nova_compute[237281]: 2025-12-06 10:22:15.691 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:15 localhost nova_compute[237281]: 2025-12-06 10:22:15.738 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:15 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:15.744 219384 INFO neutron.agent.dhcp.agent [None req-cc901be4-a2eb-441c-b8ac-f0d1dac96298 - - - - - -] DHCP configuration for ports {'84242d63-2e00-4083-8279-0798e5f9e9f8'} is completed#033[00m Dec 6 05:22:15 localhost nova_compute[237281]: 2025-12-06 10:22:15.771 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:15 localhost nova_compute[237281]: 2025-12-06 10:22:15.949 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:16 localhost openstack_network_exporter[199751]: ERROR 10:22:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:22:16 localhost openstack_network_exporter[199751]: ERROR 10:22:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:22:16 localhost openstack_network_exporter[199751]: ERROR 10:22:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:22:16 localhost openstack_network_exporter[199751]: ERROR 10:22:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:22:16 localhost openstack_network_exporter[199751]: Dec 6 05:22:16 localhost openstack_network_exporter[199751]: ERROR 10:22:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:22:16 localhost openstack_network_exporter[199751]: Dec 6 05:22:16 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:16.332 2 INFO neutron.agent.securitygroups_rpc [None req-bd22d045-d41e-4840-aa64-64a391991850 f618f374afde4343ba53286161ba5ec6 7445feb682a34a189b4a8ce856532376 - - default default] Security group member updated ['db77da59-7505-46d0-bbe6-666c35195446']#033[00m Dec 6 05:22:16 localhost podman[255733]: Dec 6 05:22:16 localhost podman[255733]: 2025-12-06 10:22:16.654678454 +0000 UTC m=+0.092092897 container create c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f3835a8c-8fc4-4df4-8e9e-177c5351209d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack 
Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:22:16 localhost systemd[1]: Started libpod-conmon-c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433.scope. Dec 6 05:22:16 localhost podman[255733]: 2025-12-06 10:22:16.60846275 +0000 UTC m=+0.045877223 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:22:16 localhost systemd[1]: Started libcrun container. Dec 6 05:22:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b97c666f75e0aed47a8cc1a8129fae1d45a0080331e814cc0d2694e80236248/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:22:16 localhost podman[255733]: 2025-12-06 10:22:16.727246069 +0000 UTC m=+0.164660522 container init c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f3835a8c-8fc4-4df4-8e9e-177c5351209d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:22:16 localhost podman[255733]: 2025-12-06 10:22:16.736178514 +0000 UTC m=+0.173592987 container start c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f3835a8c-8fc4-4df4-8e9e-177c5351209d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:22:16 localhost dnsmasq[255764]: started, version 2.85 cachesize 150 Dec 6 05:22:16 localhost dnsmasq[255764]: DNS service limited to local subnets Dec 6 05:22:16 localhost dnsmasq[255764]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:22:16 localhost dnsmasq[255764]: warning: no upstream servers configured Dec 6 05:22:16 localhost dnsmasq-dhcp[255764]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:22:16 localhost dnsmasq[255764]: read /var/lib/neutron/dhcp/f3835a8c-8fc4-4df4-8e9e-177c5351209d/addn_hosts - 0 addresses Dec 6 05:22:16 localhost dnsmasq-dhcp[255764]: read /var/lib/neutron/dhcp/f3835a8c-8fc4-4df4-8e9e-177c5351209d/host Dec 6 05:22:16 localhost dnsmasq-dhcp[255764]: read /var/lib/neutron/dhcp/f3835a8c-8fc4-4df4-8e9e-177c5351209d/opts Dec 6 05:22:16 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:16.802 219384 INFO neutron.agent.dhcp.agent [None req-4c66c59e-0d99-473a-abef-8d79cd0ef5db - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:15Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7b66b5f5-ca68-4f5b-b162-d5962234c6c6, ip_allocation=immediate, mac_address=fa:16:3e:c8:7d:73, name=tempest-PortsIpV6TestJSON-1153131399, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:22:11Z, description=, dns_domain=, id=f3835a8c-8fc4-4df4-8e9e-177c5351209d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-1279053292, port_security_enabled=True, 
project_id=0901295e4a3a44e89ad3e6a450608d11, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11963, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1558, status=ACTIVE, subnets=['8b685f21-e8ff-4c9d-af75-340a2ed3159f'], tags=[], tenant_id=0901295e4a3a44e89ad3e6a450608d11, updated_at=2025-12-06T10:22:13Z, vlan_transparent=None, network_id=f3835a8c-8fc4-4df4-8e9e-177c5351209d, port_security_enabled=True, project_id=0901295e4a3a44e89ad3e6a450608d11, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1573, status=DOWN, tags=[], tenant_id=0901295e4a3a44e89ad3e6a450608d11, updated_at=2025-12-06T10:22:15Z on network f3835a8c-8fc4-4df4-8e9e-177c5351209d#033[00m Dec 6 05:22:16 localhost dnsmasq[255317]: exiting on receipt of SIGTERM Dec 6 05:22:16 localhost podman[255769]: 2025-12-06 10:22:16.872934896 +0000 UTC m=+0.067608574 container kill 5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b4d36725-9df5-4789-a6cf-703223d9fb93, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 05:22:16 localhost systemd[1]: libpod-5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9.scope: Deactivated successfully. 
Dec 6 05:22:16 localhost podman[255797]: 2025-12-06 10:22:16.937860206 +0000 UTC m=+0.055205152 container died 5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b4d36725-9df5-4789-a6cf-703223d9fb93, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Dec 6 05:22:16 localhost dnsmasq[255764]: read /var/lib/neutron/dhcp/f3835a8c-8fc4-4df4-8e9e-177c5351209d/addn_hosts - 1 addresses Dec 6 05:22:16 localhost dnsmasq-dhcp[255764]: read /var/lib/neutron/dhcp/f3835a8c-8fc4-4df4-8e9e-177c5351209d/host Dec 6 05:22:16 localhost dnsmasq-dhcp[255764]: read /var/lib/neutron/dhcp/f3835a8c-8fc4-4df4-8e9e-177c5351209d/opts Dec 6 05:22:16 localhost podman[255819]: 2025-12-06 10:22:16.986425541 +0000 UTC m=+0.041160149 container kill c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f3835a8c-8fc4-4df4-8e9e-177c5351209d, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2) Dec 6 05:22:17 localhost podman[255797]: 2025-12-06 10:22:17.023301547 +0000 UTC m=+0.140646473 container cleanup 5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-b4d36725-9df5-4789-a6cf-703223d9fb93, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 6 05:22:17 localhost systemd[1]: var-lib-containers-storage-overlay-610cb072ba532162b9af2660eb4c54f2c3bbf9ef5f27f5ca41ccf286e833d8c1-merged.mount: Deactivated successfully. Dec 6 05:22:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9-userdata-shm.mount: Deactivated successfully. Dec 6 05:22:17 localhost systemd[1]: libpod-conmon-5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9.scope: Deactivated successfully. Dec 6 05:22:17 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:17.033 219384 INFO neutron.agent.dhcp.agent [None req-d35f3498-48fe-4118-89b9-110daf23c655 - - - - - -] DHCP configuration for ports {'1cbbed7e-f396-45d9-8df7-9400fefdab52'} is completed#033[00m Dec 6 05:22:17 localhost podman[255804]: 2025-12-06 10:22:17.047992318 +0000 UTC m=+0.152319522 container remove 5e0e152464fd8ec08f3374dac48df02febdd4946e59f4ba103d5d554111174b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b4d36725-9df5-4789-a6cf-703223d9fb93, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:22:17 localhost kernel: device tap2ad1625b-08 left promiscuous mode Dec 6 05:22:17 localhost nova_compute[237281]: 2025-12-06 10:22:17.060 
237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:17 localhost ovn_controller[131684]: 2025-12-06T10:22:17Z|00280|binding|INFO|Releasing lport 2ad1625b-0871-4fee-9bb7-395a7ec28d34 from this chassis (sb_readonly=0) Dec 6 05:22:17 localhost ovn_controller[131684]: 2025-12-06T10:22:17Z|00281|binding|INFO|Setting lport 2ad1625b-0871-4fee-9bb7-395a7ec28d34 down in Southbound Dec 6 05:22:17 localhost nova_compute[237281]: 2025-12-06 10:22:17.076 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:17.076 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-b4d36725-9df5-4789-a6cf-703223d9fb93', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b4d36725-9df5-4789-a6cf-703223d9fb93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0106a81ef74b47c7a82388dd897a85a4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=91d1dfae-3b5e-40e2-8cac-b30bcb57dbe4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=2ad1625b-0871-4fee-9bb7-395a7ec28d34) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:17.078 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 2ad1625b-0871-4fee-9bb7-395a7ec28d34 in datapath b4d36725-9df5-4789-a6cf-703223d9fb93 unbound from our chassis#033[00m Dec 6 05:22:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:17.078 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b4d36725-9df5-4789-a6cf-703223d9fb93 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:22:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:17.079 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f963df-807f-4832-948e-35528bf59e49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:18.216 219384 INFO neutron.agent.dhcp.agent [None req-6059a4c5-bef1-40cb-b043-7ac36a16add3 - - - - - -] DHCP configuration for ports {'7b66b5f5-ca68-4f5b-b162-d5962234c6c6'} is completed#033[00m Dec 6 05:22:18 localhost dnsmasq[255764]: read /var/lib/neutron/dhcp/f3835a8c-8fc4-4df4-8e9e-177c5351209d/addn_hosts - 0 addresses Dec 6 05:22:18 localhost dnsmasq-dhcp[255764]: read /var/lib/neutron/dhcp/f3835a8c-8fc4-4df4-8e9e-177c5351209d/host Dec 6 05:22:18 localhost dnsmasq-dhcp[255764]: read /var/lib/neutron/dhcp/f3835a8c-8fc4-4df4-8e9e-177c5351209d/opts Dec 6 05:22:18 localhost podman[255863]: 2025-12-06 10:22:18.280950492 +0000 UTC m=+0.064221169 container kill c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-f3835a8c-8fc4-4df4-8e9e-177c5351209d, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:22:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:18.532 219384 INFO neutron.agent.dhcp.agent [None req-0724cc59-17c8-41d8-a9a1-f33649c975de - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:18 localhost systemd[1]: run-netns-qdhcp\x2db4d36725\x2d9df5\x2d4789\x2da6cf\x2d703223d9fb93.mount: Deactivated successfully. Dec 6 05:22:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:18.533 219384 INFO neutron.agent.dhcp.agent [None req-0724cc59-17c8-41d8-a9a1-f33649c975de - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:18 localhost nova_compute[237281]: 2025-12-06 10:22:18.706 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:18 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:18.846 2 INFO neutron.agent.securitygroups_rpc [None req-4419401e-8576-448b-9bcb-c4f76fbc922c a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:22:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:18.877 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:19 localhost podman[255900]: 2025-12-06 10:22:19.052422863 +0000 UTC m=+0.036344830 container kill 3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:22:19 localhost dnsmasq[255594]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:22:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:22:19 localhost ovn_controller[131684]: 2025-12-06T10:22:19Z|00282|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:22:19 localhost nova_compute[237281]: 2025-12-06 10:22:19.519 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:19 localhost podman[255920]: 2025-12-06 10:22:19.555997202 +0000 UTC m=+0.086231166 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 
'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, config_id=edpm, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.) 
Dec 6 05:22:19 localhost podman[255920]: 2025-12-06 10:22:19.573289186 +0000 UTC m=+0.103523100 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Dec 6 05:22:19 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:22:20 localhost nova_compute[237281]: 2025-12-06 10:22:20.984 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:21 localhost dnsmasq[255764]: exiting on receipt of SIGTERM Dec 6 05:22:21 localhost podman[255956]: 2025-12-06 10:22:21.159503828 +0000 UTC m=+0.057237484 container kill c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f3835a8c-8fc4-4df4-8e9e-177c5351209d, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:22:21 localhost systemd[1]: libpod-c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433.scope: Deactivated successfully. Dec 6 05:22:21 localhost podman[255968]: 2025-12-06 10:22:21.226259675 +0000 UTC m=+0.055196792 container died c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f3835a8c-8fc4-4df4-8e9e-177c5351209d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:22:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:22:21 localhost podman[255968]: 2025-12-06 10:22:21.2618366 +0000 UTC m=+0.090773677 container cleanup c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f3835a8c-8fc4-4df4-8e9e-177c5351209d, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:22:21 localhost systemd[1]: libpod-conmon-c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433.scope: Deactivated successfully. Dec 6 05:22:21 localhost podman[255976]: 2025-12-06 10:22:21.284072714 +0000 UTC m=+0.100464965 container remove c3b913870cca22e3a98b7b8c83d5214d953f0db46132d3f781955d1b905e0433 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f3835a8c-8fc4-4df4-8e9e-177c5351209d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:22:21 localhost kernel: device tapfd3bc1cb-d1 left promiscuous mode Dec 6 05:22:21 localhost nova_compute[237281]: 2025-12-06 10:22:21.294 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:21 localhost ovn_controller[131684]: 2025-12-06T10:22:21Z|00283|binding|INFO|Releasing lport fd3bc1cb-d1d9-4949-8206-e60804b5fbba from this chassis (sb_readonly=0) Dec 6 05:22:21 localhost ovn_controller[131684]: 
2025-12-06T10:22:21Z|00284|binding|INFO|Setting lport fd3bc1cb-d1d9-4949-8206-e60804b5fbba down in Southbound Dec 6 05:22:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:21.307 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f3835a8c-8fc4-4df4-8e9e-177c5351209d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3835a8c-8fc4-4df4-8e9e-177c5351209d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0901295e4a3a44e89ad3e6a450608d11', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a412acf0-146c-4d63-8540-29f9f589a7a8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fd3bc1cb-d1d9-4949-8206-e60804b5fbba) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:21.308 137259 INFO neutron.agent.ovn.metadata.agent [-] Port fd3bc1cb-d1d9-4949-8206-e60804b5fbba in datapath f3835a8c-8fc4-4df4-8e9e-177c5351209d unbound from our chassis#033[00m Dec 6 05:22:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:21.308 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port 
for network f3835a8c-8fc4-4df4-8e9e-177c5351209d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:22:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:21.309 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[3a82a82e-e67d-44a6-8d3e-f36609f8e884]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:21 localhost nova_compute[237281]: 2025-12-06 10:22:21.313 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:21 localhost nova_compute[237281]: 2025-12-06 10:22:21.314 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:21 localhost dnsmasq[255594]: exiting on receipt of SIGTERM Dec 6 05:22:21 localhost podman[256018]: 2025-12-06 10:22:21.58944264 +0000 UTC m=+0.059437282 container kill 3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 6 05:22:21 localhost systemd[1]: libpod-3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2.scope: Deactivated successfully. 
Dec 6 05:22:21 localhost podman[256031]: 2025-12-06 10:22:21.659168707 +0000 UTC m=+0.056058367 container died 3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:22:21 localhost podman[256031]: 2025-12-06 10:22:21.69010284 +0000 UTC m=+0.086992480 container cleanup 3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:22:21 localhost systemd[1]: libpod-conmon-3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2.scope: Deactivated successfully. 
Dec 6 05:22:21 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:21.717 219384 INFO neutron.agent.dhcp.agent [None req-53a5e5fe-3193-4c2e-955f-d320b05ec0e8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:21 localhost podman[256033]: 2025-12-06 10:22:21.752154181 +0000 UTC m=+0.141060405 container remove 3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:22:21 localhost nova_compute[237281]: 2025-12-06 10:22:21.765 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:21 localhost kernel: device tap1c67e6c6-c3 left promiscuous mode Dec 6 05:22:21 localhost ovn_controller[131684]: 2025-12-06T10:22:21Z|00285|binding|INFO|Releasing lport 1c67e6c6-c3eb-4d27-ae32-270ccbf6baf2 from this chassis (sb_readonly=0) Dec 6 05:22:21 localhost ovn_controller[131684]: 2025-12-06T10:22:21Z|00286|binding|INFO|Setting lport 1c67e6c6-c3eb-4d27-ae32-270ccbf6baf2 down in Southbound Dec 6 05:22:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:21.789 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], 
requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec0:6b4d/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1c67e6c6-c3eb-4d27-ae32-270ccbf6baf2) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:21.791 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 1c67e6c6-c3eb-4d27-ae32-270ccbf6baf2 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:22:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:21.794 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:22:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:21.795 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[a87fec6a-97c2-407a-8f22-48abcabe16a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:21 localhost nova_compute[237281]: 2025-12-06 10:22:21.811 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:22 localhost systemd[1]: var-lib-containers-storage-overlay-8b97c666f75e0aed47a8cc1a8129fae1d45a0080331e814cc0d2694e80236248-merged.mount: Deactivated successfully. Dec 6 05:22:22 localhost systemd[1]: run-netns-qdhcp\x2df3835a8c\x2d8fc4\x2d4df4\x2d8e9e\x2d177c5351209d.mount: Deactivated successfully. Dec 6 05:22:22 localhost systemd[1]: var-lib-containers-storage-overlay-60627a329cf53fddbcca048bf324a2d652fdbf5c1b3c83fc17324bf1c3fc914e-merged.mount: Deactivated successfully. Dec 6 05:22:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3444f3b47eccca43d61e15a1b4855090312b35b88c51df3327d93224daa096a2-userdata-shm.mount: Deactivated successfully. Dec 6 05:22:22 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:22.207 219384 INFO neutron.agent.dhcp.agent [None req-1caa80fa-0931-4b7e-ad55-dfc734849a1e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:22 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. 
Dec 6 05:22:22 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:22.397 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:22.997 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:22:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:22.997 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.000 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9a3b72e5-1403-4b19-b445-6588c021267e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:22:22.998037', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '74300838-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.208115873, 'message_signature': '8b9eb86b240c02f90beeea8b4480e857007d1806f54a3c1beee4587a50b1319f'}]}, 'timestamp': '2025-12-06 10:22:23.001366', '_unique_id': '87becfe7d0a843599ca63f62a0040677'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.002 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.002 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.040 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.040 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a47ecbd-35dc-4166-b768-b59acbf37f95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:22:23.002963', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74360d14-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.213036085, 'message_signature': '6cd04135676bbcb84126d5be65fb06872caebb44ae8461bb941a8937f05739a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:22:23.002963', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '743621fa-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.213036085, 'message_signature': '6329e0f1d84ce940e8b09269b5c80e2849d10dbf62af5be4883d5be78e4b4e85'}]}, 'timestamp': '2025-12-06 10:22:23.041491', '_unique_id': '6e32f638342d4b739febd8cbe2d8f27b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:22:23.042 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.042 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.044 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.044 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.044 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a65075e9-1ada-49e6-a69d-c7ccdc4e03e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:22:23.044325', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7436a68e-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.213036085, 'message_signature': 'a457c54c69048f72a27c0822e966927897463b3bfc6aa488ca7da4342e381a12'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:22:23.044325', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7436b8fe-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.213036085, 'message_signature': 'e5bfa27aaa621d59f992160963643019c780d80d8c3dd98ae4d1568365d5f76a'}]}, 'timestamp': '2025-12-06 10:22:23.045257', '_unique_id': '4c8f789812db4cfea0a646d39c74777c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.046 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.047 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.048 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8df5ad94-2546-4c4a-b3a2-83b862ff4451', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:22:23.048161', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '74373e8c-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.208115873, 'message_signature': 'fe6c5f612dd2793c61608cdfb3b761e6dfe7ba6ad605b289c3bad59b02d831da'}]}, 'timestamp': '2025-12-06 10:22:23.048777', '_unique_id': '4238da84bfac4539b90e2e6f9c016720'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.049 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.051 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.069 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7e0fc2e-2495-4f14-8847-d4b3ddaa688f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:22:23.051215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '743a7e9e-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.279491402, 'message_signature': 
'f55384dc99f81429cc01f5e25e8cd878b7b854d5660c932bb00488c527220b62'}]}, 'timestamp': '2025-12-06 10:22:23.070032', '_unique_id': '61283dac4e884cebb8735464eee7c6e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR 
oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.071 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.072 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.072 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b997f747-6a6e-4e33-aaae-bd9f43a6c6fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:22:23.072387', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '743aeea6-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.208115873, 'message_signature': '337007ab6fb4de63fedce19f1d2d93bf8a18ceb0d45b42744b2e9c852ec1cd08'}]}, 'timestamp': '2025-12-06 10:22:23.072906', '_unique_id': 'ffe775ee944947bbbc355fc5aa22dd0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.073 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.075 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.075 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 19300000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5ef7907-6b25-497c-a510-cf8eaf438899', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19300000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:22:23.075436', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '743b6804-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.279491402, 'message_signature': 'f2ba296b92c67bfc8b1b087561eccdde1b8ddd2a8df221759cfaf8bf0193db03'}]}, 'timestamp': '2025-12-06 10:22:23.076015', '_unique_id': '20793560a9df4c8aaec2078e987289bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.077 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.078 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.078 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.079 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6edcd9f-0937-45d6-90b3-a30a0135515b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:22:23.078659', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '743be478-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.213036085, 'message_signature': 'fad0c898f5d850a9d55a92fb272ab6eaba8fb8f44af42b5b2d50f38d02ee4da8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:22:23.078659', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '743bf58a-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.213036085, 'message_signature': '277064a6c766efcf7f3cd5183348503304ffbb4a97c8b62292ef8f1678df7005'}]}, 'timestamp': '2025-12-06 10:22:23.079567', '_unique_id': 'ac2bdb10077746cbbfd90d72c7902a2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.080 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.082 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a734e8d4-03c2-4408-8fa4-8a50116b6c83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:22:23.082216', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '743c7384-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.208115873, 'message_signature': 'd608c49ec802d997b7ff1f228d40a5cf185d83b2e25ed148c44f5d02b2053dbd'}]}, 'timestamp': '2025-12-06 10:22:23.082827', '_unique_id': '599e5174f55a4c8298953a84880c4201'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.085 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1672b625-890c-4d59-be7d-9e46876180c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:22:23.085128', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': 
None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '743ce0b2-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.208115873, 'message_signature': '04f90afb3d3e898df56a5d77330bd69cc7d1ab6a71a57b6d4e92fc50c38251b4'}]}, 'timestamp': '2025-12-06 10:22:23.085629', '_unique_id': 'be849bdde49e46ccb20f0ddded74b008'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.086 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.087 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.104 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.104 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e88b4439-7dae-4ea0-827d-6e16d83e78c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:22:23.087865', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '743fc7d2-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.297981931, 'message_signature': '67e13117d528c18ca3b48f184fe2b70f03fb5197d859e3cbdac657363e0e38d0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:22:23.087865', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '743fddee-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.297981931, 'message_signature': 'ce7a3f925bf37506448fc72c22d4b85c0173346c0f9beaa1fc108a029965e78e'}]}, 'timestamp': '2025-12-06 10:22:23.105184', '_unique_id': '465c4c91b50846bf8826fc000df3999e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.106 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.107 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd819e69d-c9cd-4e3d-9d32-ccc909ac895f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:22:23.107560', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '74404cb6-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.208115873, 'message_signature': 'c6b2182d00f2a6c63e636fc87af8691f78f71eaa2f35036595042768ad80b5f9'}]}, 'timestamp': '2025-12-06 10:22:23.108071', '_unique_id': '5ab4ddefb1ee4bb3ac5f08f434ea08e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.109 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.110 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.110 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76782667-2e56-4272-8a53-2081bfb0b7b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:22:23.110354', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7440be30-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.297981931, 'message_signature': '1f5363d849da77f35f5021d4aaf7c8ee45ef3d068537407441e5cb5d7e205be8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:22:23.110354', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7440d276-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.297981931, 'message_signature': '86c7fc88ae4ab0f54e55c2c9c2285b67c54f99861967ff0ee87eceaced67128e'}]}, 'timestamp': '2025-12-06 10:22:23.111441', '_unique_id': '5fce2823ed3d4ee78471dd1bdd2d3b64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): 
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.112 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.114 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a158469-4b82-47b2-8f67-55e556082efb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:22:23.113970', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7441471a-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.208115873, 'message_signature': 'f8298b5cb0b8e2cec64b795bddd6fb9ff7397b2625876065abedb1c6c3d51d64'}]}, 'timestamp': '2025-12-06 10:22:23.114524', '_unique_id': '7a3b67cc550e47479f0d28e92a94862c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.115 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.116 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.116 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fac04295-5fed-4d32-a7fc-74b47320bd4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:22:23.116781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7441b5f6-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.213036085, 'message_signature': 'af67b1ca0fbfd80d5262a3d2787db2102a3801188f874a569ccb3cd2f567c130'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:22:23.116781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7441c690-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.213036085, 'message_signature': '6a7ad36b449efe3d920ae254e43ac86757f18b89d33f74c1f03a497962589af4'}]}, 'timestamp': '2025-12-06 10:22:23.117684', '_unique_id': '8bd595ac32ac4a829712b9417592d522'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.118 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.119 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.120 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b40d6aa-db24-4fd9-8ed4-611aab918ef2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:22:23.119998', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '74423238-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.208115873, 'message_signature': '62cb2bd6743c27d7a7c926044408e486fb200d95eed1ed9d3118262ded921a1c'}]}, 'timestamp': '2025-12-06 10:22:23.120473', '_unique_id': 'cc2a5116f175448881261bca7e183870'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.121 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.122 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.123 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '87e37b15-3cd4-46d4-8bec-a41209d82956', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:22:23.122899', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7442a4ca-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.213036085, 'message_signature': '9dc99c2cc5cdc42edf31654052beee21624929f2f98780c6e92112774ddb545b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:22:23.122899', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7442b53c-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.213036085, 'message_signature': '71886c7459109216e1bd89b62a7eda5a33a2d100c23c0e7f774103ee1ebbf5e6'}]}, 'timestamp': '2025-12-06 10:22:23.123792', '_unique_id': '072a0934fed94325b7d1debe7165714a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.124 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.126 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.126 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '57a56912-4465-4eea-99bc-0822f6b23243', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:22:23.126077', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74431f72-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.213036085, 'message_signature': 'ab7d6dad749021a782e4ad1e264dc56d6f892b1107c0c86144c6fa4e7fb0c705'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:22:23.126077', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74433142-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.213036085, 'message_signature': '796560f0d1e63e1c9e981e4fa48e235f72dcc4e27e0e8e74e230bab1f9d0aa80'}]}, 'timestamp': '2025-12-06 10:22:23.127000', '_unique_id': 'e7d889208d8e430f81154f67bdf7073a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.129 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.129 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7e0331c7-2a71-4241-ba7e-e188e883e699', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:22:23.129423', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7443a244-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.208115873, 'message_signature': 'd807c7dc02c9f69cbf52fe6cf7243ecd40855ee8e59202b2299a96e710a4bf67'}]}, 'timestamp': '2025-12-06 10:22:23.129929', '_unique_id': '0dfcfa801cb647eeaec4fc8e7e14369a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.131 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b34b97a3-8c0f-450b-ac7d-850740ed2251', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:22:23.131579', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': 
{'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '7443f334-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.208115873, 'message_signature': 'cd1e7d253763b0047507ece032cab7d3ae4fd88a7cf745e0dc6356414452f19b'}]}, 'timestamp': '2025-12-06 10:22:23.131897', '_unique_id': '2f195084508a4c6fa509ef67c3bea2c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:22:23.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.133 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.133 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.133 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1bc82384-efe6-467c-9447-18662b9a4749', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:22:23.133266', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '744434d4-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.297981931, 'message_signature': '8c5730373482b86927765e935c399f014fd670b797c33453c4ed9a712ef2a26d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:22:23.133266', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74443f2e-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 12930.297981931, 'message_signature': '83c5ca304e2913c24b9a859c007e36ba218c2013c5e077425c4e03a6082372d9'}]}, 'timestamp': '2025-12-06 10:22:23.133806', '_unique_id': 'bed5f90cb3eb4852a8c04867b3dadd57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:22:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:22:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:22:23.134 12 ERROR oslo_messaging.notify.messaging Dec 6 05:22:23 localhost podman[197801]: time="2025-12-06T10:22:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:22:23 localhost podman[197801]: @ - - [06/Dec/2025:10:22:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:22:23 localhost podman[197801]: @ - - [06/Dec/2025:10:22:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15974 "" "Go-http-client/1.1" Dec 6 05:22:23 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:23.367 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:22:23 localhost systemd[1]: tmp-crun.n06JLy.mount: Deactivated successfully. Dec 6 05:22:23 localhost podman[256063]: 2025-12-06 10:22:23.553890032 +0000 UTC m=+0.087519966 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:22:23 localhost podman[256063]: 2025-12-06 10:22:23.586194588 +0000 UTC m=+0.119824522 container exec_died 
979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:22:23 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:22:23 localhost nova_compute[237281]: 2025-12-06 10:22:23.743 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15662 DF PROTO=TCP SPT=52728 DPT=9102 SEQ=3354546070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDD630A0000000001030307) Dec 6 05:22:24 localhost ovn_controller[131684]: 2025-12-06T10:22:24Z|00287|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:22:24 localhost nova_compute[237281]: 2025-12-06 10:22:24.363 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15663 DF PROTO=TCP SPT=52728 DPT=9102 SEQ=3354546070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDD67070000000001030307) Dec 6 05:22:25 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:25.444 2 INFO neutron.agent.securitygroups_rpc [None req-1f723aa3-792d-425a-9080-58b8751bb2be 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:22:26 localhost nova_compute[237281]: 2025-12-06 10:22:26.023 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=14218 DF PROTO=TCP SPT=34126 DPT=9102 SEQ=880347751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDD6B870000000001030307) Dec 6 05:22:26 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:26.326 219384 INFO neutron.agent.linux.ip_lib [None req-88f19c42-af58-4f46-93b1-28ff135c63f5 - - - - - -] Device tapbe54a594-52 cannot be used as it has no MAC address#033[00m Dec 6 05:22:26 localhost nova_compute[237281]: 2025-12-06 10:22:26.351 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:26 localhost kernel: device tapbe54a594-52 entered promiscuous mode Dec 6 05:22:26 localhost nova_compute[237281]: 2025-12-06 10:22:26.359 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:26 localhost NetworkManager[5965]: [1765016546.3608] manager: (tapbe54a594-52): new Generic device (/org/freedesktop/NetworkManager/Devices/50) Dec 6 05:22:26 localhost systemd-udevd[256096]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:22:26 localhost ovn_controller[131684]: 2025-12-06T10:22:26Z|00288|binding|INFO|Claiming lport be54a594-52a9-4aff-a977-293c57d56a9e for this chassis. 
Dec 6 05:22:26 localhost ovn_controller[131684]: 2025-12-06T10:22:26Z|00289|binding|INFO|be54a594-52a9-4aff-a977-293c57d56a9e: Claiming unknown Dec 6 05:22:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:26.377 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe41:7487/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=be54a594-52a9-4aff-a977-293c57d56a9e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:26.381 137259 INFO neutron.agent.ovn.metadata.agent [-] Port be54a594-52a9-4aff-a977-293c57d56a9e in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:22:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:26.384 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 
b0b9b410-316b-4bfa-bb67-ebcd6f8f989c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:22:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:26.384 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:22:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:26.386 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[25482088-01b9-42ba-ba76-b696db5f09c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:26 localhost journal[186952]: ethtool ioctl error on tapbe54a594-52: No such device Dec 6 05:22:26 localhost ovn_controller[131684]: 2025-12-06T10:22:26Z|00290|binding|INFO|Setting lport be54a594-52a9-4aff-a977-293c57d56a9e ovn-installed in OVS Dec 6 05:22:26 localhost ovn_controller[131684]: 2025-12-06T10:22:26Z|00291|binding|INFO|Setting lport be54a594-52a9-4aff-a977-293c57d56a9e up in Southbound Dec 6 05:22:26 localhost nova_compute[237281]: 2025-12-06 10:22:26.414 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:26 localhost journal[186952]: ethtool ioctl error on tapbe54a594-52: No such device Dec 6 05:22:26 localhost journal[186952]: ethtool ioctl error on tapbe54a594-52: No such device Dec 6 05:22:26 localhost journal[186952]: ethtool ioctl error on tapbe54a594-52: No such device Dec 6 05:22:26 localhost journal[186952]: ethtool ioctl error on tapbe54a594-52: No such device Dec 6 05:22:26 localhost journal[186952]: ethtool ioctl error on tapbe54a594-52: No such device Dec 6 05:22:26 localhost journal[186952]: ethtool ioctl error on tapbe54a594-52: 
No such device Dec 6 05:22:26 localhost journal[186952]: ethtool ioctl error on tapbe54a594-52: No such device Dec 6 05:22:26 localhost nova_compute[237281]: 2025-12-06 10:22:26.457 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:26 localhost nova_compute[237281]: 2025-12-06 10:22:26.488 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:26 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:26.540 2 INFO neutron.agent.securitygroups_rpc [None req-b48de235-a44a-4858-a760-8e6dfd39e12b d41bb60070b14b9c80a14ee7cc324525 a267350e653e4d5abdac53423497b0e5 - - default default] Security group member updated ['d49af03e-7127-4747-bdc2-f2f5102a52fa']#033[00m Dec 6 05:22:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15664 DF PROTO=TCP SPT=52728 DPT=9102 SEQ=3354546070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDD6F070000000001030307) Dec 6 05:22:27 localhost podman[256165]: Dec 6 05:22:27 localhost podman[256165]: 2025-12-06 10:22:27.344978805 +0000 UTC m=+0.106903043 container create c55f8342c729987dcb0902fcb84a15a37f1d6e915d13f12f37d49c6e8ee5cd9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:22:27 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:27.360 2 INFO 
neutron.agent.securitygroups_rpc [None req-d95a9741-d0e1-464e-a11d-e2d0f52f4e95 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:22:27 localhost systemd[1]: Started libpod-conmon-c55f8342c729987dcb0902fcb84a15a37f1d6e915d13f12f37d49c6e8ee5cd9c.scope. Dec 6 05:22:27 localhost podman[256165]: 2025-12-06 10:22:27.285203654 +0000 UTC m=+0.047127912 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:22:27 localhost systemd[1]: Started libcrun container. Dec 6 05:22:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83f5f22da8d5c844f0f8bc63e217d1f21224926dc01ab085e88c2c41625b8d28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:22:27 localhost podman[256165]: 2025-12-06 10:22:27.41397513 +0000 UTC m=+0.175899358 container init c55f8342c729987dcb0902fcb84a15a37f1d6e915d13f12f37d49c6e8ee5cd9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:22:27 localhost podman[256165]: 2025-12-06 10:22:27.422968747 +0000 UTC m=+0.184892975 container start c55f8342c729987dcb0902fcb84a15a37f1d6e915d13f12f37d49c6e8ee5cd9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 6 05:22:27 localhost dnsmasq[256183]: started, version 2.85 cachesize 150 Dec 6 05:22:27 localhost dnsmasq[256183]: DNS service limited to local subnets Dec 6 05:22:27 localhost dnsmasq[256183]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:22:27 localhost dnsmasq[256183]: warning: no upstream servers configured Dec 6 05:22:27 localhost dnsmasq-dhcp[256183]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:22:27 localhost dnsmasq[256183]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:22:27 localhost dnsmasq-dhcp[256183]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:27 localhost dnsmasq-dhcp[256183]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:27.757 219384 INFO neutron.agent.dhcp.agent [None req-43659783-dbac-4b17-9915-ca9054f8d290 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:22:27 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:27.772 2 INFO neutron.agent.securitygroups_rpc [None req-233821a2-d98a-4a77-b9c8-bac6a672f0a1 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:22:27 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:27.870 2 INFO neutron.agent.securitygroups_rpc [None req-cce42fb9-5f11-44f9-b04d-9768a90c90ea f618f374afde4343ba53286161ba5ec6 7445feb682a34a189b4a8ce856532376 - - default default] Security group member updated 
['db77da59-7505-46d0-bbe6-666c35195446']#033[00m Dec 6 05:22:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:27.870 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:26Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ae3f8203-fe95-4aed-a32c-7778bed9d69d, ip_allocation=immediate, mac_address=fa:16:3e:92:50:46, name=tempest-NetworksTestDHCPv6-140121895, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=44, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['4610dc97-3de0-4b7e-abc9-ef40e0f7cc16'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:22:22Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1620, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:22:27Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:22:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:27.927 219384 INFO neutron.agent.dhcp.agent [-] Network not present, 
action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:28 localhost dnsmasq[256183]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:22:28 localhost podman[256202]: 2025-12-06 10:22:28.065322561 +0000 UTC m=+0.062237358 container kill c55f8342c729987dcb0902fcb84a15a37f1d6e915d13f12f37d49c6e8ee5cd9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3) Dec 6 05:22:28 localhost dnsmasq-dhcp[256183]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:28 localhost dnsmasq-dhcp[256183]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31712 DF PROTO=TCP SPT=50314 DPT=9102 SEQ=3962457516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDD73880000000001030307) Dec 6 05:22:28 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:28.335 219384 INFO neutron.agent.dhcp.agent [None req-c8e4a7b8-e2db-4640-a015-046cff5f9fff - - - - - -] DHCP configuration for ports {'ae3f8203-fe95-4aed-a32c-7778bed9d69d'} is completed#033[00m Dec 6 05:22:28 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:28.435 2 INFO neutron.agent.securitygroups_rpc [None req-0eecfaba-0bd5-41d1-a07a-0011d2c617b9 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated 
['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:22:28 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:28.742 2 INFO neutron.agent.securitygroups_rpc [None req-4ae972ed-20c1-47d7-8f93-50fc5e0a42a5 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:22:28 localhost nova_compute[237281]: 2025-12-06 10:22:28.789 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:28 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:28.789 2 INFO neutron.agent.securitygroups_rpc [None req-e2097887-ad35-4221-bebf-4e6fad4d647d d41bb60070b14b9c80a14ee7cc324525 a267350e653e4d5abdac53423497b0e5 - - default default] Security group member updated ['d49af03e-7127-4747-bdc2-f2f5102a52fa']#033[00m Dec 6 05:22:29 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:29.388 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:30 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:30.310 2 INFO neutron.agent.securitygroups_rpc [None req-522001ee-0279-4771-8d27-0641c0f37bfa a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:22:30 localhost podman[256240]: 2025-12-06 10:22:30.564883435 +0000 UTC m=+0.062872977 container kill c55f8342c729987dcb0902fcb84a15a37f1d6e915d13f12f37d49c6e8ee5cd9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2) Dec 6 05:22:30 localhost dnsmasq[256183]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:22:30 localhost dnsmasq-dhcp[256183]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:30 localhost dnsmasq-dhcp[256183]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:30 localhost systemd[1]: tmp-crun.45w7nB.mount: Deactivated successfully. Dec 6 05:22:31 localhost nova_compute[237281]: 2025-12-06 10:22:31.062 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15665 DF PROTO=TCP SPT=52728 DPT=9102 SEQ=3354546070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDD7EC80000000001030307) Dec 6 05:22:31 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:31.144 2 INFO neutron.agent.securitygroups_rpc [None req-1f41419c-cc19-4ad0-a8e8-ca562a125f6b 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:22:32 localhost nova_compute[237281]: 2025-12-06 10:22:32.121 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:32 localhost nova_compute[237281]: 2025-12-06 10:22:32.147 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Triggering sync for uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 
_sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Dec 6 05:22:32 localhost nova_compute[237281]: 2025-12-06 10:22:32.147 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:22:32 localhost nova_compute[237281]: 2025-12-06 10:22:32.148 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:22:32 localhost nova_compute[237281]: 2025-12-06 10:22:32.181 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "a5070ada-6b60-4992-a1bf-9e83aaccac93" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:22:32 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:32.249 2 INFO neutron.agent.securitygroups_rpc [None req-ba9de83c-1f23-4aff-a298-498a377026f8 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:22:32 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:32.428 2 INFO neutron.agent.securitygroups_rpc [None req-ba9de83c-1f23-4aff-a298-498a377026f8 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated 
['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:22:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:22:32 localhost podman[256260]: 2025-12-06 10:22:32.550902153 +0000 UTC m=+0.080769469 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible) Dec 6 05:22:32 localhost podman[256300]: 2025-12-06 10:22:32.646131596 +0000 UTC 
m=+0.057773241 container kill c55f8342c729987dcb0902fcb84a15a37f1d6e915d13f12f37d49c6e8ee5cd9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 6 05:22:32 localhost dnsmasq[256183]: exiting on receipt of SIGTERM Dec 6 05:22:32 localhost systemd[1]: libpod-c55f8342c729987dcb0902fcb84a15a37f1d6e915d13f12f37d49c6e8ee5cd9c.scope: Deactivated successfully. Dec 6 05:22:32 localhost podman[256260]: 2025-12-06 10:22:32.656786494 +0000 UTC m=+0.186653800 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller) Dec 6 05:22:32 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:22:32 localhost podman[256317]: 2025-12-06 10:22:32.719913158 +0000 UTC m=+0.050372222 container died c55f8342c729987dcb0902fcb84a15a37f1d6e915d13f12f37d49c6e8ee5cd9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:22:32 localhost systemd[1]: tmp-crun.vQK0vV.mount: Deactivated successfully. 
Dec 6 05:22:32 localhost podman[256317]: 2025-12-06 10:22:32.818716132 +0000 UTC m=+0.149175156 container remove c55f8342c729987dcb0902fcb84a15a37f1d6e915d13f12f37d49c6e8ee5cd9c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:22:32 localhost systemd[1]: libpod-conmon-c55f8342c729987dcb0902fcb84a15a37f1d6e915d13f12f37d49c6e8ee5cd9c.scope: Deactivated successfully. Dec 6 05:22:32 localhost nova_compute[237281]: 2025-12-06 10:22:32.831 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:32 localhost ovn_controller[131684]: 2025-12-06T10:22:32Z|00292|binding|INFO|Releasing lport be54a594-52a9-4aff-a977-293c57d56a9e from this chassis (sb_readonly=0) Dec 6 05:22:32 localhost ovn_controller[131684]: 2025-12-06T10:22:32Z|00293|binding|INFO|Setting lport be54a594-52a9-4aff-a977-293c57d56a9e down in Southbound Dec 6 05:22:32 localhost kernel: device tapbe54a594-52 left promiscuous mode Dec 6 05:22:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:32.842 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe41:7487/64', 
'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=be54a594-52a9-4aff-a977-293c57d56a9e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:32.846 137259 INFO neutron.agent.ovn.metadata.agent [-] Port be54a594-52a9-4aff-a977-293c57d56a9e in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:22:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:32.849 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:22:32 localhost nova_compute[237281]: 2025-12-06 10:22:32.853 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:32 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:32.853 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0d142e-06b4-4955-823e-f58a9de6177a]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:33 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:33.223 2 INFO neutron.agent.securitygroups_rpc [None req-1dbf7728-aa42-497d-b635-bc449175d210 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:22:33 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:33.239 219384 INFO neutron.agent.dhcp.agent [None req-c22c33ef-fd8c-4b42-a12d-9f8db5e2ea31 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:33 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:33.388 2 INFO neutron.agent.securitygroups_rpc [None req-ed83946e-a7f1-4298-bc84-29a3b0637d4e 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:22:33 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:33.397 2 INFO neutron.agent.securitygroups_rpc [None req-0adcf7bc-4759-4f88-bfb3-4109b376d368 db2a3efec06741ba90593f5e4f69699c 550837a56e0743b2a37205ba11b8980a - - default default] Security group member updated ['e62a34f0-cf09-468f-aee3-4ce561a364e6']#033[00m Dec 6 05:22:33 localhost systemd[1]: var-lib-containers-storage-overlay-83f5f22da8d5c844f0f8bc63e217d1f21224926dc01ab085e88c2c41625b8d28-merged.mount: Deactivated successfully. Dec 6 05:22:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c55f8342c729987dcb0902fcb84a15a37f1d6e915d13f12f37d49c6e8ee5cd9c-userdata-shm.mount: Deactivated successfully. Dec 6 05:22:33 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. 
Dec 6 05:22:33 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:33.773 2 INFO neutron.agent.securitygroups_rpc [None req-ff211a20-7964-43ae-b4b5-296104c6e6fc f618f374afde4343ba53286161ba5ec6 7445feb682a34a189b4a8ce856532376 - - default default] Security group member updated ['db77da59-7505-46d0-bbe6-666c35195446']#033[00m Dec 6 05:22:33 localhost nova_compute[237281]: 2025-12-06 10:22:33.822 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:22:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:22:35 localhost podman[256343]: 2025-12-06 10:22:35.577946214 +0000 UTC m=+0.107906645 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', 
'/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:22:35 localhost podman[256343]: 2025-12-06 10:22:35.609195626 +0000 UTC m=+0.139156097 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:22:35 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:22:35 localhost podman[256362]: 2025-12-06 10:22:35.672686541 +0000 UTC m=+0.084308698 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd) Dec 6 05:22:35 localhost podman[256362]: 2025-12-06 10:22:35.709485105 +0000 UTC m=+0.121107232 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:22:35 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:22:36 localhost nova_compute[237281]: 2025-12-06 10:22:36.100 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:37 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:37.102 2 INFO neutron.agent.securitygroups_rpc [None req-af0b22e8-e7f6-429d-b20d-8db1e51b364e a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:22:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:37.736 219384 INFO neutron.agent.linux.ip_lib [None req-9b18bcba-bc5f-4dff-a6b9-16a19b7e7b3d - - - - - -] Device tapf4d2d8f5-80 cannot be used as it has no MAC address#033[00m Dec 6 05:22:37 localhost nova_compute[237281]: 2025-12-06 10:22:37.756 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:37 localhost kernel: device tapf4d2d8f5-80 entered promiscuous mode Dec 6 05:22:37 localhost NetworkManager[5965]: [1765016557.7633] manager: (tapf4d2d8f5-80): new Generic device (/org/freedesktop/NetworkManager/Devices/51) Dec 6 05:22:37 localhost ovn_controller[131684]: 2025-12-06T10:22:37Z|00294|binding|INFO|Claiming lport f4d2d8f5-8065-4763-89db-3995bcf9ae84 for this chassis. Dec 6 05:22:37 localhost ovn_controller[131684]: 2025-12-06T10:22:37Z|00295|binding|INFO|f4d2d8f5-8065-4763-89db-3995bcf9ae84: Claiming unknown Dec 6 05:22:37 localhost nova_compute[237281]: 2025-12-06 10:22:37.766 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:37 localhost systemd-udevd[256396]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:22:37 localhost ovn_controller[131684]: 2025-12-06T10:22:37Z|00296|binding|INFO|Setting lport f4d2d8f5-8065-4763-89db-3995bcf9ae84 ovn-installed in OVS Dec 6 05:22:37 localhost nova_compute[237281]: 2025-12-06 10:22:37.771 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:37 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:37.773 2 INFO neutron.agent.securitygroups_rpc [None req-2d92f70b-092e-4c87-bb27-cbb12efb5f50 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:22:37 localhost nova_compute[237281]: 2025-12-06 10:22:37.777 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:37 localhost ovn_controller[131684]: 2025-12-06T10:22:37Z|00297|binding|INFO|Setting lport f4d2d8f5-8065-4763-89db-3995bcf9ae84 up in Southbound Dec 6 05:22:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:37.784 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f4d2d8f5-8065-4763-89db-3995bcf9ae84) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:37.785 137259 INFO neutron.agent.ovn.metadata.agent [-] Port f4d2d8f5-8065-4763-89db-3995bcf9ae84 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:22:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:37.787 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port ed5229ef-ae0a-4be1-822c-a0e1229e21ab IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:22:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:37.787 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:22:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:37.787 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[1a0d5bfb-8816-4203-be3d-a46ba2b61d23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:37 localhost nova_compute[237281]: 2025-12-06 10:22:37.808 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:37.825 219384 INFO 
neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:37 localhost nova_compute[237281]: 2025-12-06 10:22:37.852 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:37 localhost nova_compute[237281]: 2025-12-06 10:22:37.881 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:38 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:38.006 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:38 localhost podman[256451]: Dec 6 05:22:38 localhost podman[256451]: 2025-12-06 10:22:38.710891655 +0000 UTC m=+0.092198521 container create cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:22:38 localhost systemd[1]: Started libpod-conmon-cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8.scope. Dec 6 05:22:38 localhost podman[256451]: 2025-12-06 10:22:38.661321908 +0000 UTC m=+0.042628814 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:22:38 localhost systemd[1]: Started libcrun container. 
Dec 6 05:22:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b55b0f2e210a9c9dadc32e3c1c2a3cfbe58cc890c8cda638ee0dc2b9176a58b1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:22:38 localhost podman[256451]: 2025-12-06 10:22:38.786028519 +0000 UTC m=+0.167335365 container init cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true) Dec 6 05:22:38 localhost podman[256451]: 2025-12-06 10:22:38.793496488 +0000 UTC m=+0.174803354 container start cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125) Dec 6 05:22:38 localhost dnsmasq[256469]: started, version 2.85 cachesize 150 Dec 6 05:22:38 localhost dnsmasq[256469]: DNS service limited to local subnets Dec 6 05:22:38 localhost dnsmasq[256469]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:22:38 localhost dnsmasq[256469]: warning: no upstream servers configured Dec 
6 05:22:38 localhost dnsmasq-dhcp[256469]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:22:38 localhost dnsmasq[256469]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:22:38 localhost dnsmasq-dhcp[256469]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:38 localhost dnsmasq-dhcp[256469]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:38 localhost nova_compute[237281]: 2025-12-06 10:22:38.864 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:38 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:38.864 219384 INFO neutron.agent.dhcp.agent [None req-9b18bcba-bc5f-4dff-a6b9-16a19b7e7b3d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:34Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=94383170-b700-4db8-8910-e3e61d3fe921, ip_allocation=immediate, mac_address=fa:16:3e:3e:41:11, name=tempest-NetworksTestDHCPv6-1692804069, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=46, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['20326ef1-3640-4fb7-8d1b-1c1feb36dccb'], tags=[], 
tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:22:33Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1648, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:22:34Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:22:38 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:38.868 2 INFO neutron.agent.securitygroups_rpc [None req-f09cb90a-1d51-470f-9ee7-42db71fdde53 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:22:38 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:38.983 219384 INFO neutron.agent.dhcp.agent [None req-a02e0162-5886-4a3f-89b0-4316c4fb73e1 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:22:39 localhost podman[256486]: 2025-12-06 10:22:39.058990695 +0000 UTC m=+0.062031991 container kill cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:22:39 localhost dnsmasq[256469]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 1 addresses Dec 6 05:22:39 localhost dnsmasq-dhcp[256469]: read 
/var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:39 localhost dnsmasq-dhcp[256469]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:39 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:39.390 219384 INFO neutron.agent.dhcp.agent [None req-1c494028-baee-4478-a585-70e92da22171 - - - - - -] DHCP configuration for ports {'94383170-b700-4db8-8910-e3e61d3fe921'} is completed#033[00m Dec 6 05:22:39 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:39.398 2 INFO neutron.agent.securitygroups_rpc [None req-abedaad8-211a-46d6-9364-030f959d4420 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:22:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15666 DF PROTO=TCP SPT=52728 DPT=9102 SEQ=3354546070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDD9F870000000001030307) Dec 6 05:22:40 localhost systemd[1]: tmp-crun.AqkNBb.mount: Deactivated successfully. 
Dec 6 05:22:40 localhost dnsmasq[256469]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:22:40 localhost dnsmasq-dhcp[256469]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:40 localhost podman[256523]: 2025-12-06 10:22:40.13970242 +0000 UTC m=+0.063404344 container kill cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 05:22:40 localhost dnsmasq-dhcp[256469]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:41 localhost nova_compute[237281]: 2025-12-06 10:22:41.137 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:22:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:22:41 localhost podman[256559]: 2025-12-06 10:22:41.537747858 +0000 UTC m=+0.070699149 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:22:41 localhost podman[256559]: 2025-12-06 10:22:41.543605928 +0000 UTC m=+0.076557219 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 05:22:41 localhost podman[256581]: 2025-12-06 10:22:41.561429728 +0000 UTC m=+0.057192263 container kill cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:22:41 localhost dnsmasq[256469]: exiting on receipt of SIGTERM Dec 6 05:22:41 localhost systemd[1]: libpod-cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8.scope: Deactivated successfully. 
Dec 6 05:22:41 localhost podman[256560]: 2025-12-06 10:22:41.607694842 +0000 UTC m=+0.136360241 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team) Dec 6 05:22:41 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:22:41 localhost podman[256599]: 2025-12-06 10:22:41.646597011 +0000 UTC m=+0.076007472 container died cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true) Dec 6 05:22:41 localhost podman[256599]: 2025-12-06 10:22:41.681065662 +0000 UTC m=+0.110476053 container cleanup cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:22:41 localhost systemd[1]: libpod-conmon-cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8.scope: Deactivated successfully. 
Dec 6 05:22:41 localhost podman[256560]: 2025-12-06 10:22:41.692087241 +0000 UTC m=+0.220752640 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, config_id=edpm) Dec 6 05:22:41 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:22:41 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:41.719 2 INFO neutron.agent.securitygroups_rpc [None req-4be2d517-236e-4e4a-80bd-454567384887 f618f374afde4343ba53286161ba5ec6 7445feb682a34a189b4a8ce856532376 - - default default] Security group member updated ['db77da59-7505-46d0-bbe6-666c35195446']#033[00m Dec 6 05:22:41 localhost podman[256606]: 2025-12-06 10:22:41.774331055 +0000 UTC m=+0.186374522 container remove cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3) Dec 6 05:22:41 localhost nova_compute[237281]: 2025-12-06 10:22:41.787 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:41 localhost kernel: device tapf4d2d8f5-80 left promiscuous mode Dec 6 05:22:41 localhost ovn_controller[131684]: 2025-12-06T10:22:41Z|00298|binding|INFO|Releasing lport f4d2d8f5-8065-4763-89db-3995bcf9ae84 from this chassis (sb_readonly=0) Dec 6 05:22:41 localhost ovn_controller[131684]: 2025-12-06T10:22:41Z|00299|binding|INFO|Setting lport f4d2d8f5-8065-4763-89db-3995bcf9ae84 down in Southbound Dec 6 05:22:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:41.798 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', 
conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f4d2d8f5-8065-4763-89db-3995bcf9ae84) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:41.799 137259 INFO neutron.agent.ovn.metadata.agent [-] Port f4d2d8f5-8065-4763-89db-3995bcf9ae84 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:22:41 localhost nova_compute[237281]: 2025-12-06 10:22:41.800 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:41.800 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed 
_get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:22:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:41.801 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[193917b3-595b-454c-91dd-7472383b4f71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:42.113 219384 INFO neutron.agent.dhcp.agent [None req-2a4e6977-59aa-4a48-ac5a-d7c4574c17eb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:42 localhost systemd[1]: var-lib-containers-storage-overlay-b55b0f2e210a9c9dadc32e3c1c2a3cfbe58cc890c8cda638ee0dc2b9176a58b1-merged.mount: Deactivated successfully. Dec 6 05:22:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cd82a1e3ffd930303a3154f6a9d77955ad12a1c8fd939e7fb85255d94b5760d8-userdata-shm.mount: Deactivated successfully. Dec 6 05:22:42 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. 
Dec 6 05:22:43 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:43.019 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:43 localhost nova_compute[237281]: 2025-12-06 10:22:43.867 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:44 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:44.849 2 INFO neutron.agent.securitygroups_rpc [None req-d07d3ff0-29fb-48b8-b9e1-6912ee30fa65 db2a3efec06741ba90593f5e4f69699c 550837a56e0743b2a37205ba11b8980a - - default default] Security group member updated ['e62a34f0-cf09-468f-aee3-4ce561a364e6']#033[00m Dec 6 05:22:45 localhost nova_compute[237281]: 2025-12-06 10:22:45.913 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:46 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:46.116 2 INFO neutron.agent.securitygroups_rpc [None req-ec9d810b-3aaf-4451-83c4-61aabdec1f05 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:22:46 localhost nova_compute[237281]: 2025-12-06 10:22:46.169 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:46 localhost openstack_network_exporter[199751]: ERROR 10:22:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:22:46 localhost openstack_network_exporter[199751]: ERROR 10:22:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:22:46 localhost 
openstack_network_exporter[199751]: ERROR 10:22:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:22:46 localhost openstack_network_exporter[199751]: ERROR 10:22:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:22:46 localhost openstack_network_exporter[199751]: Dec 6 05:22:46 localhost openstack_network_exporter[199751]: ERROR 10:22:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:22:46 localhost openstack_network_exporter[199751]: Dec 6 05:22:46 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:46.480 219384 INFO neutron.agent.linux.ip_lib [None req-0a6a5e7a-6073-4ebe-874c-faddfb1a95d7 - - - - - -] Device tap62faf66b-a2 cannot be used as it has no MAC address#033[00m Dec 6 05:22:46 localhost nova_compute[237281]: 2025-12-06 10:22:46.507 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:46 localhost kernel: device tap62faf66b-a2 entered promiscuous mode Dec 6 05:22:46 localhost NetworkManager[5965]: [1765016566.5176] manager: (tap62faf66b-a2): new Generic device (/org/freedesktop/NetworkManager/Devices/52) Dec 6 05:22:46 localhost ovn_controller[131684]: 2025-12-06T10:22:46Z|00300|binding|INFO|Claiming lport 62faf66b-a260-4866-8ac7-54258193e729 for this chassis. Dec 6 05:22:46 localhost ovn_controller[131684]: 2025-12-06T10:22:46Z|00301|binding|INFO|62faf66b-a260-4866-8ac7-54258193e729: Claiming unknown Dec 6 05:22:46 localhost nova_compute[237281]: 2025-12-06 10:22:46.518 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:46 localhost systemd-udevd[256645]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:22:46 localhost ovn_controller[131684]: 2025-12-06T10:22:46Z|00302|binding|INFO|Setting lport 62faf66b-a260-4866-8ac7-54258193e729 ovn-installed in OVS Dec 6 05:22:46 localhost ovn_controller[131684]: 2025-12-06T10:22:46Z|00303|binding|INFO|Setting lport 62faf66b-a260-4866-8ac7-54258193e729 up in Southbound Dec 6 05:22:46 localhost nova_compute[237281]: 2025-12-06 10:22:46.528 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:46 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:46.533 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fee0:6e7d/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=62faf66b-a260-4866-8ac7-54258193e729) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:46 localhost 
ovn_metadata_agent[137254]: 2025-12-06 10:22:46.535 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 62faf66b-a260-4866-8ac7-54258193e729 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:22:46 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:46.539 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 29fe70ba-6400-4bb7-bdf6-8dc14c8a800d IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:22:46 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:46.539 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:22:46 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:46.541 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[54d440d5-c4da-4ee1-bfb6-15cf33a5acbf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:46 localhost nova_compute[237281]: 2025-12-06 10:22:46.550 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:46 localhost journal[186952]: ethtool ioctl error on tap62faf66b-a2: No such device Dec 6 05:22:46 localhost nova_compute[237281]: 2025-12-06 10:22:46.557 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:46 localhost journal[186952]: ethtool ioctl error on tap62faf66b-a2: No such device Dec 6 05:22:46 localhost journal[186952]: ethtool ioctl error on tap62faf66b-a2: No such device Dec 6 05:22:46 localhost journal[186952]: ethtool ioctl error on tap62faf66b-a2: No such device Dec 6 05:22:46 localhost 
journal[186952]: ethtool ioctl error on tap62faf66b-a2: No such device Dec 6 05:22:46 localhost journal[186952]: ethtool ioctl error on tap62faf66b-a2: No such device Dec 6 05:22:46 localhost journal[186952]: ethtool ioctl error on tap62faf66b-a2: No such device Dec 6 05:22:46 localhost journal[186952]: ethtool ioctl error on tap62faf66b-a2: No such device Dec 6 05:22:46 localhost nova_compute[237281]: 2025-12-06 10:22:46.602 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:46 localhost nova_compute[237281]: 2025-12-06 10:22:46.637 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:47 localhost podman[256716]: Dec 6 05:22:47 localhost podman[256716]: 2025-12-06 10:22:47.585330747 +0000 UTC m=+0.097977019 container create f4765ed5594c33801817b01bdeb723deebd3a4d6f0aa971c2717d3cc998bc40c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:22:47 localhost systemd[1]: Started libpod-conmon-f4765ed5594c33801817b01bdeb723deebd3a4d6f0aa971c2717d3cc998bc40c.scope. Dec 6 05:22:47 localhost podman[256716]: 2025-12-06 10:22:47.539774824 +0000 UTC m=+0.052421126 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:22:47 localhost systemd[1]: Started libcrun container. 
Dec 6 05:22:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/199ae326cb35bd2a4d3c596278f4bf3518142d3383a82f88c8dd34e575e7e521/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:22:47 localhost podman[256716]: 2025-12-06 10:22:47.669586592 +0000 UTC m=+0.182232864 container init f4765ed5594c33801817b01bdeb723deebd3a4d6f0aa971c2717d3cc998bc40c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:22:47 localhost podman[256716]: 2025-12-06 10:22:47.681138928 +0000 UTC m=+0.193785240 container start f4765ed5594c33801817b01bdeb723deebd3a4d6f0aa971c2717d3cc998bc40c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 05:22:47 localhost dnsmasq[256734]: started, version 2.85 cachesize 150 Dec 6 05:22:47 localhost dnsmasq[256734]: DNS service limited to local subnets Dec 6 05:22:47 localhost dnsmasq[256734]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:22:47 localhost dnsmasq[256734]: warning: no upstream servers configured Dec 
6 05:22:47 localhost dnsmasq[256734]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:22:47 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:47.837 219384 INFO neutron.agent.dhcp.agent [None req-b992d904-24ca-4daa-a7ea-540ea42c808e - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:22:47 localhost nova_compute[237281]: 2025-12-06 10:22:47.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:48 localhost dnsmasq[256734]: exiting on receipt of SIGTERM Dec 6 05:22:48 localhost podman[256752]: 2025-12-06 10:22:48.100261077 +0000 UTC m=+0.065827209 container kill f4765ed5594c33801817b01bdeb723deebd3a4d6f0aa971c2717d3cc998bc40c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:22:48 localhost systemd[1]: libpod-f4765ed5594c33801817b01bdeb723deebd3a4d6f0aa971c2717d3cc998bc40c.scope: Deactivated successfully. 
Dec 6 05:22:48 localhost podman[256765]: 2025-12-06 10:22:48.173168232 +0000 UTC m=+0.059668409 container died f4765ed5594c33801817b01bdeb723deebd3a4d6f0aa971c2717d3cc998bc40c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:22:48 localhost podman[256765]: 2025-12-06 10:22:48.205010293 +0000 UTC m=+0.091510450 container cleanup f4765ed5594c33801817b01bdeb723deebd3a4d6f0aa971c2717d3cc998bc40c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:22:48 localhost systemd[1]: libpod-conmon-f4765ed5594c33801817b01bdeb723deebd3a4d6f0aa971c2717d3cc998bc40c.scope: Deactivated successfully. 
Dec 6 05:22:48 localhost podman[256772]: 2025-12-06 10:22:48.258268043 +0000 UTC m=+0.129754297 container remove f4765ed5594c33801817b01bdeb723deebd3a4d6f0aa971c2717d3cc998bc40c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:22:48 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:48.299 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:48 localhost systemd[1]: var-lib-containers-storage-overlay-199ae326cb35bd2a4d3c596278f4bf3518142d3383a82f88c8dd34e575e7e521-merged.mount: Deactivated successfully. Dec 6 05:22:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f4765ed5594c33801817b01bdeb723deebd3a4d6f0aa971c2717d3cc998bc40c-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:22:48 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:48.656 2 INFO neutron.agent.securitygroups_rpc [None req-243ac228-9b80-49ee-ab61-5889dfbcca80 8d428a9af02040d7874f5c8e8cb180a0 26cfc74da117438ba5bccd2d017524bb - - default default] Security group member updated ['fbf6a84a-34d6-4181-8b4b-01ed3419b79d']#033[00m Dec 6 05:22:48 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:48.664 2 INFO neutron.agent.securitygroups_rpc [None req-c9493728-bd3f-4ea1-ba02-b12f4d3470f1 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:22:48 localhost nova_compute[237281]: 2025-12-06 10:22:48.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:48 localhost nova_compute[237281]: 2025-12-06 10:22:48.915 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:49 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:49.262 2 INFO neutron.agent.securitygroups_rpc [None req-e86b0c8c-ede5-4a6c-a894-2f9d36fb21dc 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:22:49 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:49.461 2 INFO neutron.agent.securitygroups_rpc [None req-240e0e9f-8ea5-4d37-af5b-57f9c406c85e a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:22:50 localhost podman[256840]: Dec 6 05:22:50 localhost podman[256840]: 2025-12-06 10:22:50.242746292 +0000 UTC m=+0.093679066 
container create ce3dda3349ae4f29cc568e149e23f5a30ed272cf7a0ee1cbec2fb6faa3366937 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:22:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:22:50 localhost systemd[1]: Started libpod-conmon-ce3dda3349ae4f29cc568e149e23f5a30ed272cf7a0ee1cbec2fb6faa3366937.scope. Dec 6 05:22:50 localhost systemd[1]: tmp-crun.RzLor5.mount: Deactivated successfully. Dec 6 05:22:50 localhost podman[256840]: 2025-12-06 10:22:50.196190008 +0000 UTC m=+0.047122832 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:22:50 localhost systemd[1]: Started libcrun container. 
Dec 6 05:22:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33e809d161cc24657624dbc4a2785df59f4dce5d25b050a321a1c424a1d54c8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:22:50 localhost podman[256840]: 2025-12-06 10:22:50.327359478 +0000 UTC m=+0.178292252 container init ce3dda3349ae4f29cc568e149e23f5a30ed272cf7a0ee1cbec2fb6faa3366937 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Dec 6 05:22:50 localhost podman[256840]: 2025-12-06 10:22:50.336391666 +0000 UTC m=+0.187324430 container start ce3dda3349ae4f29cc568e149e23f5a30ed272cf7a0ee1cbec2fb6faa3366937 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 05:22:50 localhost dnsmasq[256870]: started, version 2.85 cachesize 150 Dec 6 05:22:50 localhost dnsmasq[256870]: DNS service limited to local subnets Dec 6 05:22:50 localhost dnsmasq[256870]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:22:50 localhost dnsmasq[256870]: warning: no upstream servers configured Dec 
6 05:22:50 localhost dnsmasq-dhcp[256870]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Dec 6 05:22:50 localhost dnsmasq[256870]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:22:50 localhost dnsmasq-dhcp[256870]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:50 localhost dnsmasq-dhcp[256870]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:50 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:50.399 219384 INFO neutron.agent.dhcp.agent [None req-7a617513-7fb8-411d-b5e2-db58e15ff898 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:48Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=08f900b5-edb3-47e5-9b1f-f3c541588c0e, ip_allocation=immediate, mac_address=fa:16:3e:28:49:3c, name=tempest-NetworksTestDHCPv6-509153623, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=49, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['d941643c-be7f-4b4c-911c-c6f646349401', 'ec860ba2-4952-4384-a4d2-1dcf5091526d'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:22:46Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, 
port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1682, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:22:48Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:22:50 localhost podman[256855]: 2025-12-06 10:22:50.41248199 +0000 UTC m=+0.133315768 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public) Dec 6 05:22:50 localhost podman[256855]: 2025-12-06 10:22:50.456486135 +0000 UTC m=+0.177319913 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 
'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git) Dec 6 05:22:50 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:22:50 localhost podman[256896]: 2025-12-06 10:22:50.608418794 +0000 UTC m=+0.063218178 container kill ce3dda3349ae4f29cc568e149e23f5a30ed272cf7a0ee1cbec2fb6faa3366937 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:22:50 localhost dnsmasq[256870]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 2 addresses Dec 6 05:22:50 localhost dnsmasq-dhcp[256870]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:50 localhost dnsmasq-dhcp[256870]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:50 localhost nova_compute[237281]: 2025-12-06 10:22:50.883 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:51 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:51.141 2 INFO neutron.agent.securitygroups_rpc [None req-f5f5fc81-a185-4788-bdab-4bf0cca8df55 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated 
['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:22:51 localhost nova_compute[237281]: 2025-12-06 10:22:51.217 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:51 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:51.284 219384 INFO neutron.agent.dhcp.agent [None req-63a4d3c6-fbaa-41ea-b6fb-74349b915399 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', '62faf66b-a260-4866-8ac7-54258193e729'} is completed#033[00m Dec 6 05:22:51 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:51.295 2 INFO neutron.agent.securitygroups_rpc [None req-eed4ff06-3ae4-43a4-8a2f-46c5ce65083f 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:22:51 localhost dnsmasq[256870]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:22:51 localhost dnsmasq-dhcp[256870]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:51 localhost dnsmasq-dhcp[256870]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:51 localhost podman[256935]: 2025-12-06 10:22:51.632722813 +0000 UTC m=+0.058620307 container kill ce3dda3349ae4f29cc568e149e23f5a30ed272cf7a0ee1cbec2fb6faa3366937 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:22:51 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:51.725 
219384 INFO neutron.agent.dhcp.agent [None req-340f7160-f447-4e06-b1be-a892597814dd - - - - - -] DHCP configuration for ports {'08f900b5-edb3-47e5-9b1f-f3c541588c0e'} is completed#033[00m Dec 6 05:22:52 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:52.699 2 INFO neutron.agent.securitygroups_rpc [None req-45e0804e-0a63-45b7-bb15-e942c88aae58 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:22:52 localhost nova_compute[237281]: 2025-12-06 10:22:52.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:52 localhost nova_compute[237281]: 2025-12-06 10:22:52.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:52 localhost nova_compute[237281]: 2025-12-06 10:22:52.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:22:53 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:53.169 2 INFO neutron.agent.securitygroups_rpc [None req-505fda17-705f-46b0-b635-3c31bb82483a 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:22:53 localhost podman[197801]: time="2025-12-06T10:22:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:22:53 localhost podman[197801]: @ - - [06/Dec/2025:10:22:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 145971 "" "Go-http-client/1.1" Dec 6 05:22:53 localhost podman[197801]: @ - - [06/Dec/2025:10:22:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16437 "" "Go-http-client/1.1" Dec 6 05:22:53 localhost nova_compute[237281]: 2025-12-06 10:22:53.953 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:53 localhost dnsmasq[256870]: exiting on receipt of SIGTERM Dec 6 05:22:53 localhost podman[256973]: 2025-12-06 10:22:53.98697054 +0000 UTC m=+0.060147873 container kill ce3dda3349ae4f29cc568e149e23f5a30ed272cf7a0ee1cbec2fb6faa3366937 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:22:54 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:22:54 localhost systemd[1]: libpod-ce3dda3349ae4f29cc568e149e23f5a30ed272cf7a0ee1cbec2fb6faa3366937.scope: Deactivated successfully. Dec 6 05:22:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6143 DF PROTO=TCP SPT=50680 DPT=9102 SEQ=3014205041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDDD8460000000001030307) Dec 6 05:22:54 localhost podman[256987]: 2025-12-06 10:22:54.061358452 +0000 UTC m=+0.055227012 container died ce3dda3349ae4f29cc568e149e23f5a30ed272cf7a0ee1cbec2fb6faa3366937 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:22:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce3dda3349ae4f29cc568e149e23f5a30ed272cf7a0ee1cbec2fb6faa3366937-userdata-shm.mount: Deactivated successfully. Dec 6 05:22:54 localhost systemd[1]: var-lib-containers-storage-overlay-33e809d161cc24657624dbc4a2785df59f4dce5d25b050a321a1c424a1d54c8d-merged.mount: Deactivated successfully. 
Dec 6 05:22:54 localhost podman[256987]: 2025-12-06 10:22:54.162873108 +0000 UTC m=+0.156741628 container remove ce3dda3349ae4f29cc568e149e23f5a30ed272cf7a0ee1cbec2fb6faa3366937 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:22:54 localhost systemd[1]: libpod-conmon-ce3dda3349ae4f29cc568e149e23f5a30ed272cf7a0ee1cbec2fb6faa3366937.scope: Deactivated successfully. Dec 6 05:22:54 localhost podman[256998]: 2025-12-06 10:22:54.17462004 +0000 UTC m=+0.153843199 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', 
'--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:22:54 localhost podman[256998]: 2025-12-06 10:22:54.211387902 +0000 UTC m=+0.190611091 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:22:54 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:22:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6144 DF PROTO=TCP SPT=50680 DPT=9102 SEQ=3014205041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDDDC480000000001030307) Dec 6 05:22:55 localhost podman[257088]: Dec 6 05:22:55 localhost podman[257088]: 2025-12-06 10:22:55.180035686 +0000 UTC m=+0.096300657 container create 1cf42da781b79d7975a4d92b73678399060b47768f84a0454515f1dac2fb5b4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:22:55 localhost systemd[1]: Started libpod-conmon-1cf42da781b79d7975a4d92b73678399060b47768f84a0454515f1dac2fb5b4b.scope. Dec 6 05:22:55 localhost systemd[1]: Started libcrun container. 
Dec 6 05:22:55 localhost podman[257088]: 2025-12-06 10:22:55.135273957 +0000 UTC m=+0.051538968 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:22:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8cc242a296090f3b951979ba7f9a6d7ef37883a9902e31ca8854eaf42805614/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:22:55 localhost podman[257088]: 2025-12-06 10:22:55.247811313 +0000 UTC m=+0.164076314 container init 1cf42da781b79d7975a4d92b73678399060b47768f84a0454515f1dac2fb5b4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 05:22:55 localhost podman[257088]: 2025-12-06 10:22:55.257638796 +0000 UTC m=+0.173903797 container start 1cf42da781b79d7975a4d92b73678399060b47768f84a0454515f1dac2fb5b4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:22:55 localhost dnsmasq[257106]: started, version 2.85 cachesize 150 Dec 6 05:22:55 localhost dnsmasq[257106]: DNS service limited to local subnets Dec 6 05:22:55 localhost dnsmasq[257106]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:22:55 localhost dnsmasq[257106]: warning: no upstream servers configured Dec 6 05:22:55 localhost dnsmasq-dhcp[257106]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Dec 6 05:22:55 localhost dnsmasq[257106]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:22:55 localhost dnsmasq-dhcp[257106]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:22:55 localhost dnsmasq-dhcp[257106]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:22:55 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:55.546 219384 INFO neutron.agent.dhcp.agent [None req-c7c13bd0-a466-4431-abd1-14e9a38db767 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', '62faf66b-a260-4866-8ac7-54258193e729'} is completed#033[00m Dec 6 05:22:55 localhost dnsmasq[257106]: exiting on receipt of SIGTERM Dec 6 05:22:55 localhost podman[257122]: 2025-12-06 10:22:55.623971478 +0000 UTC m=+0.064684753 container kill 1cf42da781b79d7975a4d92b73678399060b47768f84a0454515f1dac2fb5b4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:22:55 localhost systemd[1]: libpod-1cf42da781b79d7975a4d92b73678399060b47768f84a0454515f1dac2fb5b4b.scope: Deactivated successfully. 
Dec 6 05:22:55 localhost podman[257135]: 2025-12-06 10:22:55.701496716 +0000 UTC m=+0.065284671 container died 1cf42da781b79d7975a4d92b73678399060b47768f84a0454515f1dac2fb5b4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:22:55 localhost podman[257135]: 2025-12-06 10:22:55.74090811 +0000 UTC m=+0.104696005 container cleanup 1cf42da781b79d7975a4d92b73678399060b47768f84a0454515f1dac2fb5b4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:22:55 localhost systemd[1]: libpod-conmon-1cf42da781b79d7975a4d92b73678399060b47768f84a0454515f1dac2fb5b4b.scope: Deactivated successfully. 
Dec 6 05:22:55 localhost podman[257142]: 2025-12-06 10:22:55.793005525 +0000 UTC m=+0.139104666 container remove 1cf42da781b79d7975a4d92b73678399060b47768f84a0454515f1dac2fb5b4b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:22:55 localhost kernel: device tap62faf66b-a2 left promiscuous mode Dec 6 05:22:55 localhost ovn_controller[131684]: 2025-12-06T10:22:55Z|00304|binding|INFO|Releasing lport 62faf66b-a260-4866-8ac7-54258193e729 from this chassis (sb_readonly=0) Dec 6 05:22:55 localhost nova_compute[237281]: 2025-12-06 10:22:55.807 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:55 localhost ovn_controller[131684]: 2025-12-06T10:22:55Z|00305|binding|INFO|Setting lport 62faf66b-a260-4866-8ac7-54258193e729 down in Southbound Dec 6 05:22:55 localhost nova_compute[237281]: 2025-12-06 10:22:55.823 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15667 DF PROTO=TCP SPT=52728 DPT=9102 SEQ=3354546070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDDDF880000000001030307) Dec 6 05:22:55 localhost nova_compute[237281]: 2025-12-06 10:22:55.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] 
Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:22:55 localhost nova_compute[237281]: 2025-12-06 10:22:55.888 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:22:55 localhost nova_compute[237281]: 2025-12-06 10:22:55.889 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:22:55 localhost systemd[1]: var-lib-containers-storage-overlay-c8cc242a296090f3b951979ba7f9a6d7ef37883a9902e31ca8854eaf42805614-merged.mount: Deactivated successfully. Dec 6 05:22:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1cf42da781b79d7975a4d92b73678399060b47768f84a0454515f1dac2fb5b4b-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:22:56 localhost nova_compute[237281]: 2025-12-06 10:22:56.220 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:22:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:56.497 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=62faf66b-a260-4866-8ac7-54258193e729) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:22:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:56.498 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 62faf66b-a260-4866-8ac7-54258193e729 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:22:56 localhost ovn_metadata_agent[137254]: 2025-12-06 
10:22:56.500 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:22:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:22:56.500 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[7857633f-0021-4ea5-be7b-ab624a2a43e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:22:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6145 DF PROTO=TCP SPT=50680 DPT=9102 SEQ=3014205041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDDE4470000000001030307) Dec 6 05:22:57 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. 
Dec 6 05:22:57 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:22:57.336 219384 INFO neutron.agent.dhcp.agent [None req-c9330526-37e6-4e0f-9aee-56dfff5c74ff - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:22:57 localhost nova_compute[237281]: 2025-12-06 10:22:57.627 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:22:57 localhost nova_compute[237281]: 2025-12-06 10:22:57.628 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:22:57 localhost nova_compute[237281]: 2025-12-06 10:22:57.628 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:22:57 localhost nova_compute[237281]: 2025-12-06 10:22:57.628 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:22:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14219 DF PROTO=TCP SPT=34126 DPT=9102 SEQ=880347751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDDE9870000000001030307) Dec 6 05:22:58 localhost neutron_sriov_agent[212548]: 2025-12-06 10:22:58.919 2 INFO 
neutron.agent.securitygroups_rpc [None req-c328b0ce-535c-4aff-bf8c-7bab6e9d0c35 8d428a9af02040d7874f5c8e8cb180a0 26cfc74da117438ba5bccd2d017524bb - - default default] Security group member updated ['fbf6a84a-34d6-4181-8b4b-01ed3419b79d']#033[00m Dec 6 05:22:58 localhost nova_compute[237281]: 2025-12-06 10:22:58.994 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6146 DF PROTO=TCP SPT=50680 DPT=9102 SEQ=3014205041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDDF4080000000001030307) Dec 6 05:23:01 localhost nova_compute[237281]: 2025-12-06 10:23:01.225 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:01 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:01.782 219384 INFO neutron.agent.linux.ip_lib [None req-b093cd0c-42dd-4393-981d-216657281206 - - - - - -] Device tap65ec618c-f9 cannot be used as it has no MAC address#033[00m Dec 6 05:23:01 localhost nova_compute[237281]: 2025-12-06 10:23:01.813 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:01 localhost kernel: device tap65ec618c-f9 entered promiscuous mode Dec 6 05:23:01 localhost NetworkManager[5965]: [1765016581.8248] manager: (tap65ec618c-f9): new Generic device (/org/freedesktop/NetworkManager/Devices/53) Dec 6 05:23:01 localhost nova_compute[237281]: 2025-12-06 10:23:01.826 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:01 localhost ovn_controller[131684]: 2025-12-06T10:23:01Z|00306|binding|INFO|Claiming 
lport 65ec618c-f9ca-4c65-9c09-a1cb72b37ac7 for this chassis. Dec 6 05:23:01 localhost ovn_controller[131684]: 2025-12-06T10:23:01Z|00307|binding|INFO|65ec618c-f9ca-4c65-9c09-a1cb72b37ac7: Claiming unknown Dec 6 05:23:01 localhost systemd-udevd[257178]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:23:01 localhost ovn_controller[131684]: 2025-12-06T10:23:01Z|00308|binding|INFO|Setting lport 65ec618c-f9ca-4c65-9c09-a1cb72b37ac7 ovn-installed in OVS Dec 6 05:23:01 localhost nova_compute[237281]: 2025-12-06 10:23:01.835 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:01 localhost ovn_controller[131684]: 2025-12-06T10:23:01Z|00309|binding|INFO|Setting lport 65ec618c-f9ca-4c65-9c09-a1cb72b37ac7 up in Southbound Dec 6 05:23:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:01.844 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe34:1ab3/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], 
mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=65ec618c-f9ca-4c65-9c09-a1cb72b37ac7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:01.846 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 65ec618c-f9ca-4c65-9c09-a1cb72b37ac7 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:23:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:01.848 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port f794e092-e7d2-49f9-902c-904cc6205e49 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:23:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:01.849 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:01.850 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[ecdb2c66-a247-435e-bd25-7549cdce31f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:01 localhost journal[186952]: ethtool ioctl error on tap65ec618c-f9: No such device Dec 6 05:23:01 localhost nova_compute[237281]: 2025-12-06 10:23:01.863 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:01 localhost journal[186952]: ethtool ioctl error on tap65ec618c-f9: No such device Dec 6 05:23:01 localhost journal[186952]: ethtool ioctl error on tap65ec618c-f9: No such device Dec 6 
05:23:01 localhost journal[186952]: ethtool ioctl error on tap65ec618c-f9: No such device Dec 6 05:23:01 localhost journal[186952]: ethtool ioctl error on tap65ec618c-f9: No such device Dec 6 05:23:01 localhost journal[186952]: ethtool ioctl error on tap65ec618c-f9: No such device Dec 6 05:23:01 localhost journal[186952]: ethtool ioctl error on tap65ec618c-f9: No such device Dec 6 05:23:01 localhost journal[186952]: ethtool ioctl error on tap65ec618c-f9: No such device Dec 6 05:23:01 localhost nova_compute[237281]: 2025-12-06 10:23:01.907 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:01 localhost nova_compute[237281]: 2025-12-06 10:23:01.944 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:02 localhost nova_compute[237281]: 2025-12-06 10:23:02.244 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:02 localhost podman[257249]: Dec 6 05:23:02 localhost podman[257249]: 2025-12-06 10:23:02.920402361 +0000 UTC m=+0.097226786 container create 790719ef89bf2ffef7096c9d990c8f88a8796d24eea742b3bf9dd7730b1fd4c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true) Dec 6 05:23:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:23:02 localhost systemd[1]: Started libpod-conmon-790719ef89bf2ffef7096c9d990c8f88a8796d24eea742b3bf9dd7730b1fd4c2.scope. Dec 6 05:23:02 localhost podman[257249]: 2025-12-06 10:23:02.877251763 +0000 UTC m=+0.054076218 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:23:02 localhost systemd[1]: Started libcrun container. Dec 6 05:23:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/241df1fca1a04d575efa5b671b4f82913d40edebd2677c2064eaba3cb7390d1e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:23:03 localhost podman[257249]: 2025-12-06 10:23:03.005021397 +0000 UTC m=+0.181845832 container init 790719ef89bf2ffef7096c9d990c8f88a8796d24eea742b3bf9dd7730b1fd4c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2) Dec 6 05:23:03 localhost systemd[1]: tmp-crun.YYxaUd.mount: Deactivated successfully. 
Dec 6 05:23:03 localhost dnsmasq[257279]: started, version 2.85 cachesize 150 Dec 6 05:23:03 localhost dnsmasq[257279]: DNS service limited to local subnets Dec 6 05:23:03 localhost dnsmasq[257279]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:23:03 localhost dnsmasq[257279]: warning: no upstream servers configured Dec 6 05:23:03 localhost dnsmasq-dhcp[257279]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:23:03 localhost dnsmasq[257279]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:23:03 localhost dnsmasq-dhcp[257279]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:03 localhost dnsmasq-dhcp[257279]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:03 localhost podman[257263]: 2025-12-06 10:23:03.055214083 +0000 UTC m=+0.093875062 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ovn_controller) Dec 6 05:23:03 localhost podman[257249]: 2025-12-06 10:23:03.070978129 +0000 UTC m=+0.247802564 container start 790719ef89bf2ffef7096c9d990c8f88a8796d24eea742b3bf9dd7730b1fd4c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:23:03 localhost podman[257263]: 2025-12-06 10:23:03.104484701 +0000 UTC m=+0.143145680 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 
'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:23:03 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:23:03 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:03.337 219384 INFO neutron.agent.dhcp.agent [None req-fc25944f-80e6-4778-994f-bfb57bef9456 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:23:04 localhost nova_compute[237281]: 2025-12-06 10:23:04.036 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:05 localhost dnsmasq[257279]: exiting on receipt of SIGTERM Dec 6 05:23:05 localhost podman[257307]: 2025-12-06 10:23:05.462038461 +0000 UTC m=+0.063748145 container kill 790719ef89bf2ffef7096c9d990c8f88a8796d24eea742b3bf9dd7730b1fd4c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:23:05 localhost systemd[1]: libpod-790719ef89bf2ffef7096c9d990c8f88a8796d24eea742b3bf9dd7730b1fd4c2.scope: Deactivated successfully. 
Dec 6 05:23:05 localhost podman[257319]: 2025-12-06 10:23:05.528992733 +0000 UTC m=+0.055299705 container died 790719ef89bf2ffef7096c9d990c8f88a8796d24eea742b3bf9dd7730b1fd4c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:23:05 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-790719ef89bf2ffef7096c9d990c8f88a8796d24eea742b3bf9dd7730b1fd4c2-userdata-shm.mount: Deactivated successfully. Dec 6 05:23:05 localhost podman[257319]: 2025-12-06 10:23:05.562637799 +0000 UTC m=+0.088944741 container cleanup 790719ef89bf2ffef7096c9d990c8f88a8796d24eea742b3bf9dd7730b1fd4c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:23:05 localhost systemd[1]: libpod-conmon-790719ef89bf2ffef7096c9d990c8f88a8796d24eea742b3bf9dd7730b1fd4c2.scope: Deactivated successfully. 
Dec 6 05:23:05 localhost podman[257325]: 2025-12-06 10:23:05.637376751 +0000 UTC m=+0.148878186 container remove 790719ef89bf2ffef7096c9d990c8f88a8796d24eea742b3bf9dd7730b1fd4c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:23:06 localhost nova_compute[237281]: 2025-12-06 10:23:06.112 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:23:06 localhost nova_compute[237281]: 2025-12-06 10:23:06.229 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:23:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:23:06 localhost systemd[1]: var-lib-containers-storage-overlay-241df1fca1a04d575efa5b671b4f82913d40edebd2677c2064eaba3cb7390d1e-merged.mount: Deactivated successfully. Dec 6 05:23:06 localhost nova_compute[237281]: 2025-12-06 10:23:06.467 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:23:06 localhost nova_compute[237281]: 2025-12-06 10:23:06.468 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:23:06 localhost nova_compute[237281]: 2025-12-06 10:23:06.469 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:23:06 localhost nova_compute[237281]: 2025-12-06 10:23:06.470 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task 
ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:23:06 localhost podman[257348]: 2025-12-06 10:23:06.564370302 +0000 UTC m=+0.091495409 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:23:06 localhost systemd[1]: tmp-crun.GHHf0U.mount: Deactivated successfully. 
Dec 6 05:23:06 localhost podman[257349]: 2025-12-06 10:23:06.625887857 +0000 UTC m=+0.150110234 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3) Dec 6 05:23:06 localhost podman[257349]: 2025-12-06 10:23:06.64027652 +0000 UTC m=+0.164498897 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 6 05:23:06 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:23:06 localhost podman[257348]: 2025-12-06 10:23:06.678295081 +0000 UTC m=+0.205420228 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:23:06 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:23:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:06.707 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:23:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:06.708 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:23:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:06.708 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:23:07 localhost nova_compute[237281]: 2025-12-06 10:23:07.710 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:23:07 localhost nova_compute[237281]: 2025-12-06 10:23:07.710 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:23:07 localhost nova_compute[237281]: 2025-12-06 10:23:07.711 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:23:07 localhost nova_compute[237281]: 2025-12-06 10:23:07.711 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:23:08 localhost nova_compute[237281]: 2025-12-06 10:23:08.896 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:23:08 localhost nova_compute[237281]: 2025-12-06 10:23:08.975 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:23:08 localhost nova_compute[237281]: 2025-12-06 10:23:08.977 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.055 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.056 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.071 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.118 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.118 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.192 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:23:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6147 DF PROTO=TCP SPT=50680 DPT=9102 SEQ=3014205041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDE139B0000000001030307) Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.446 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.448 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12321MB free_disk=387.26658630371094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.449 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.449 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.706 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.707 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.707 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.820 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.837 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:23:09 localhost 
nova_compute[237281]: 2025-12-06 10:23:09.839 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:23:09 localhost nova_compute[237281]: 2025-12-06 10:23:09.839 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:23:10 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:10.066 2 INFO neutron.agent.securitygroups_rpc [None req-8fea6781-8662-4097-8f9e-196265809203 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:23:10 localhost nova_compute[237281]: 2025-12-06 10:23:10.674 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:10 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:10.677 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:10 localhost 
ovn_metadata_agent[137254]: 2025-12-06 10:23:10.679 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:23:11 localhost sshd[257432]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:23:11 localhost nova_compute[237281]: 2025-12-06 10:23:11.233 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:11 localhost podman[257454]: Dec 6 05:23:11 localhost podman[257454]: 2025-12-06 10:23:11.293213097 +0000 UTC m=+0.071037760 container create e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:23:11 localhost systemd[1]: Started libpod-conmon-e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb.scope. Dec 6 05:23:11 localhost podman[257454]: 2025-12-06 10:23:11.25309135 +0000 UTC m=+0.030916033 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:23:11 localhost systemd[1]: Started libcrun container. 
Dec 6 05:23:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7093a5167a63b0bcb668d4fc864c19fe86a753445fbb524915f4ab86a40881a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:23:11 localhost podman[257454]: 2025-12-06 10:23:11.371138877 +0000 UTC m=+0.148963570 container init e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:23:11 localhost podman[257454]: 2025-12-06 10:23:11.38489827 +0000 UTC m=+0.162722933 container start e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:23:11 localhost dnsmasq[257473]: started, version 2.85 cachesize 150 Dec 6 05:23:11 localhost dnsmasq[257473]: DNS service limited to local subnets Dec 6 05:23:11 localhost dnsmasq[257473]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:23:11 localhost dnsmasq[257473]: warning: no upstream servers configured Dec 
6 05:23:11 localhost dnsmasq-dhcp[257473]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:23:11 localhost dnsmasq-dhcp[257473]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Dec 6 05:23:11 localhost dnsmasq[257473]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:23:11 localhost dnsmasq-dhcp[257473]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:11 localhost dnsmasq-dhcp[257473]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:11 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:11.443 219384 INFO neutron.agent.dhcp.agent [None req-a0fc133e-aae4-4c62-9ff9-c6b00da60b27 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:23:09Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=a97a845e-a7f3-46d5-83f6-797b907985a8, ip_allocation=immediate, mac_address=fa:16:3e:fb:1b:8f, name=tempest-NetworksTestDHCPv6-1211550289, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=53, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['2c522552-f2c2-4686-afd9-64cafda0cb43', '99bb21d2-5471-405e-9a96-a0f0f3cf1984'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, 
updated_at=2025-12-06T10:23:04Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1718, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:23:09Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:23:11 localhost dnsmasq[257473]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 2 addresses Dec 6 05:23:11 localhost dnsmasq-dhcp[257473]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:11 localhost dnsmasq-dhcp[257473]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:11 localhost podman[257492]: 2025-12-06 10:23:11.646210589 +0000 UTC m=+0.067151870 container kill e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Dec 6 05:23:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:23:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:23:12 localhost podman[257513]: 2025-12-06 10:23:12.304261695 +0000 UTC m=+0.085480864 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent) Dec 6 05:23:12 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:12.360 219384 INFO neutron.agent.dhcp.agent [None req-ed2cc9cc-0b0a-4847-a3e8-811a20767373 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', '65ec618c-f9ca-4c65-9c09-a1cb72b37ac7'} is completed#033[00m Dec 6 05:23:12 localhost podman[257514]: 2025-12-06 10:23:12.364293384 +0000 UTC m=+0.140573441 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 05:23:12 localhost podman[257513]: 2025-12-06 10:23:12.388023215 +0000 UTC m=+0.169242364 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 05:23:12 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:23:12 localhost podman[257514]: 2025-12-06 10:23:12.408518346 +0000 UTC m=+0.184798413 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:23:12 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:23:12 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:12.591 219384 INFO neutron.agent.dhcp.agent [None req-b62c3ac5-ec2f-46c9-971d-ed7f42f390da - - - - - -] DHCP configuration for ports {'a97a845e-a7f3-46d5-83f6-797b907985a8'} is completed#033[00m Dec 6 05:23:12 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:12.600 2 INFO neutron.agent.securitygroups_rpc [None req-54768eeb-b7f5-491c-8afd-1bb3dc32852d 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:23:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:12.682 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:23:14 localhost nova_compute[237281]: 2025-12-06 10:23:14.103 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:14 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:14.135 2 INFO neutron.agent.securitygroups_rpc [None req-238d26d1-4e6f-4e2c-b578-d4fed57d9720 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:23:14 localhost dnsmasq[257473]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:23:14 localhost podman[257562]: 2025-12-06 10:23:14.392300065 +0000 UTC m=+0.065364484 container kill e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 6 05:23:14 localhost dnsmasq-dhcp[257473]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:14 localhost dnsmasq-dhcp[257473]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:14 localhost nova_compute[237281]: 2025-12-06 10:23:14.591 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:16 localhost openstack_network_exporter[199751]: ERROR 10:23:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:23:16 localhost openstack_network_exporter[199751]: ERROR 10:23:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:23:16 localhost openstack_network_exporter[199751]: ERROR 10:23:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:23:16 localhost openstack_network_exporter[199751]: ERROR 10:23:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:23:16 localhost openstack_network_exporter[199751]: Dec 6 05:23:16 localhost openstack_network_exporter[199751]: ERROR 10:23:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:23:16 localhost openstack_network_exporter[199751]: Dec 6 05:23:16 localhost nova_compute[237281]: 2025-12-06 10:23:16.235 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:16 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:16.830 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:23:17 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:17.250 2 INFO neutron.agent.securitygroups_rpc [None req-f7378788-8713-433a-9cea-c28343dfc00d 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:23:18 localhost dnsmasq[257473]: exiting on receipt of SIGTERM Dec 6 05:23:18 localhost podman[257599]: 2025-12-06 10:23:18.770839169 +0000 UTC m=+0.061625669 container kill e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:23:18 localhost systemd[1]: libpod-e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb.scope: Deactivated successfully. 
Dec 6 05:23:18 localhost podman[257612]: 2025-12-06 10:23:18.845585572 +0000 UTC m=+0.062016041 container died e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Dec 6 05:23:18 localhost systemd[1]: tmp-crun.8zv2zd.mount: Deactivated successfully. Dec 6 05:23:18 localhost podman[257612]: 2025-12-06 10:23:18.890067132 +0000 UTC m=+0.106497551 container cleanup e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:23:18 localhost systemd[1]: libpod-conmon-e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb.scope: Deactivated successfully. 
Dec 6 05:23:18 localhost podman[257614]: 2025-12-06 10:23:18.934750987 +0000 UTC m=+0.138141325 container remove e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 05:23:19 localhost nova_compute[237281]: 2025-12-06 10:23:19.141 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:19 localhost systemd[1]: var-lib-containers-storage-overlay-7093a5167a63b0bcb668d4fc864c19fe86a753445fbb524915f4ab86a40881a3-merged.mount: Deactivated successfully. Dec 6 05:23:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e7db343040331a22824ee32865fcad9f47560871f305378fe3753bb38a42c0cb-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:23:19 localhost podman[257692]: Dec 6 05:23:19 localhost podman[257692]: 2025-12-06 10:23:19.840984809 +0000 UTC m=+0.088596150 container create 597a8cf6de56991022013b5c73359e46a0d98bc853eb6da7761feddda82088d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 05:23:19 localhost systemd[1]: Started libpod-conmon-597a8cf6de56991022013b5c73359e46a0d98bc853eb6da7761feddda82088d8.scope. Dec 6 05:23:19 localhost systemd[1]: Started libcrun container. Dec 6 05:23:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95e27777b90d1d82c3a71fdcebb38c999daeb4a7dc064d5350665c794ad4a649/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:23:19 localhost podman[257692]: 2025-12-06 10:23:19.803885616 +0000 UTC m=+0.051496957 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:23:19 localhost podman[257692]: 2025-12-06 10:23:19.907490627 +0000 UTC m=+0.155101968 container init 597a8cf6de56991022013b5c73359e46a0d98bc853eb6da7761feddda82088d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:23:19 
localhost podman[257692]: 2025-12-06 10:23:19.91506695 +0000 UTC m=+0.162678291 container start 597a8cf6de56991022013b5c73359e46a0d98bc853eb6da7761feddda82088d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 6 05:23:19 localhost dnsmasq[257710]: started, version 2.85 cachesize 150 Dec 6 05:23:19 localhost dnsmasq[257710]: DNS service limited to local subnets Dec 6 05:23:19 localhost dnsmasq[257710]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:23:19 localhost dnsmasq[257710]: warning: no upstream servers configured Dec 6 05:23:19 localhost dnsmasq-dhcp[257710]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:23:19 localhost dnsmasq[257710]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:23:19 localhost dnsmasq-dhcp[257710]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:19 localhost dnsmasq-dhcp[257710]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:23:20 localhost sshd[257712]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:23:20 localhost podman[257711]: 2025-12-06 10:23:20.815126372 +0000 UTC m=+0.094065679 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 6 05:23:20 localhost podman[257711]: 2025-12-06 10:23:20.859270481 +0000 UTC m=+0.138209768 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 6 05:23:20 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:23:21 localhost nova_compute[237281]: 2025-12-06 10:23:21.238 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:21 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:21.722 219384 INFO neutron.agent.dhcp.agent [None req-09024546-985b-4521-b510-674b883263b2 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', '65ec618c-f9ca-4c65-9c09-a1cb72b37ac7'} is completed#033[00m Dec 6 05:23:21 localhost dnsmasq[257710]: exiting on receipt of SIGTERM Dec 6 05:23:21 localhost podman[257750]: 2025-12-06 10:23:21.900443158 +0000 UTC m=+0.062056802 container kill 597a8cf6de56991022013b5c73359e46a0d98bc853eb6da7761feddda82088d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:23:21 localhost systemd[1]: libpod-597a8cf6de56991022013b5c73359e46a0d98bc853eb6da7761feddda82088d8.scope: Deactivated successfully. 
Dec 6 05:23:21 localhost podman[257763]: 2025-12-06 10:23:21.974944843 +0000 UTC m=+0.059467133 container died 597a8cf6de56991022013b5c73359e46a0d98bc853eb6da7761feddda82088d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:23:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-597a8cf6de56991022013b5c73359e46a0d98bc853eb6da7761feddda82088d8-userdata-shm.mount: Deactivated successfully. Dec 6 05:23:22 localhost podman[257763]: 2025-12-06 10:23:22.009831717 +0000 UTC m=+0.094353957 container cleanup 597a8cf6de56991022013b5c73359e46a0d98bc853eb6da7761feddda82088d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Dec 6 05:23:22 localhost systemd[1]: libpod-conmon-597a8cf6de56991022013b5c73359e46a0d98bc853eb6da7761feddda82088d8.scope: Deactivated successfully. 
Dec 6 05:23:22 localhost ovn_controller[131684]: 2025-12-06T10:23:22Z|00310|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:23:22 localhost nova_compute[237281]: 2025-12-06 10:23:22.059 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:22 localhost podman[257765]: 2025-12-06 10:23:22.060153567 +0000 UTC m=+0.136364292 container remove 597a8cf6de56991022013b5c73359e46a0d98bc853eb6da7761feddda82088d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:23:22 localhost ovn_controller[131684]: 2025-12-06T10:23:22Z|00311|binding|INFO|Releasing lport 65ec618c-f9ca-4c65-9c09-a1cb72b37ac7 from this chassis (sb_readonly=0) Dec 6 05:23:22 localhost ovn_controller[131684]: 2025-12-06T10:23:22Z|00312|binding|INFO|Setting lport 65ec618c-f9ca-4c65-9c09-a1cb72b37ac7 down in Southbound Dec 6 05:23:22 localhost nova_compute[237281]: 2025-12-06 10:23:22.072 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:22 localhost kernel: device tap65ec618c-f9 left promiscuous mode Dec 6 05:23:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:22.096 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, 
nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::f816:3eff:fe34:1ab3/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=65ec618c-f9ca-4c65-9c09-a1cb72b37ac7) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:22 localhost nova_compute[237281]: 2025-12-06 10:23:22.098 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:22.098 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 65ec618c-f9ca-4c65-9c09-a1cb72b37ac7 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:23:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:22.102 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:22 localhost nova_compute[237281]: 2025-12-06 10:23:22.102 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:22.103 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[b821240a-57e9-410f-a6f7-fbfefa650e42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:22 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:22.191 2 INFO neutron.agent.securitygroups_rpc [None req-35b39913-9252-4627-be85-aa132616c5b1 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:23:22 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:22.825 2 INFO neutron.agent.securitygroups_rpc [None req-35ff2c02-b7f7-464e-8192-2b22b1931571 f618f374afde4343ba53286161ba5ec6 7445feb682a34a189b4a8ce856532376 - - default default] Security group member updated ['db77da59-7505-46d0-bbe6-666c35195446']#033[00m Dec 6 05:23:22 localhost systemd[1]: var-lib-containers-storage-overlay-95e27777b90d1d82c3a71fdcebb38c999daeb4a7dc064d5350665c794ad4a649-merged.mount: Deactivated successfully. Dec 6 05:23:22 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. 
Dec 6 05:23:23 localhost podman[197801]: time="2025-12-06T10:23:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:23:23 localhost podman[197801]: @ - - [06/Dec/2025:10:23:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:23:23 localhost podman[197801]: @ - - [06/Dec/2025:10:23:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15972 "" "Go-http-client/1.1" Dec 6 05:23:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9823 DF PROTO=TCP SPT=36902 DPT=9102 SEQ=3511509215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDE4D6A0000000001030307) Dec 6 05:23:24 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:24.066 2 INFO neutron.agent.securitygroups_rpc [None req-a693ee86-97ae-40bd-bb85-e859a18e2dfd 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:23:24 localhost nova_compute[237281]: 2025-12-06 10:23:24.176 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:23:24 localhost systemd[1]: tmp-crun.HnoJo5.mount: Deactivated successfully. 
Dec 6 05:23:24 localhost podman[257793]: 2025-12-06 10:23:24.571733542 +0000 UTC m=+0.099551788 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:23:24 localhost podman[257793]: 2025-12-06 10:23:24.578992255 +0000 UTC m=+0.106810521 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:23:24 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:23:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9824 DF PROTO=TCP SPT=36902 DPT=9102 SEQ=3511509215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDE51880000000001030307) Dec 6 05:23:25 localhost ovn_controller[131684]: 2025-12-06T10:23:25Z|00313|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:23:25 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:25.510 219384 INFO neutron.agent.linux.ip_lib [None req-3a26ec3f-5c74-40af-aeed-fd9fec72129b - - - - - -] Device tapfc5412c7-64 cannot be used as it has no MAC address#033[00m Dec 6 05:23:25 localhost nova_compute[237281]: 2025-12-06 10:23:25.548 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:25 localhost nova_compute[237281]: 2025-12-06 10:23:25.550 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:25 localhost kernel: device tapfc5412c7-64 entered promiscuous mode Dec 6 05:23:25 localhost NetworkManager[5965]: [1765016605.5589] manager: (tapfc5412c7-64): new Generic device (/org/freedesktop/NetworkManager/Devices/54) Dec 6 05:23:25 localhost ovn_controller[131684]: 2025-12-06T10:23:25Z|00314|binding|INFO|Claiming lport fc5412c7-640b-4047-9d2b-bdde60063dc8 for this chassis. 
Dec 6 05:23:25 localhost ovn_controller[131684]: 2025-12-06T10:23:25Z|00315|binding|INFO|fc5412c7-640b-4047-9d2b-bdde60063dc8: Claiming unknown Dec 6 05:23:25 localhost nova_compute[237281]: 2025-12-06 10:23:25.562 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6148 DF PROTO=TCP SPT=50680 DPT=9102 SEQ=3014205041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDE53870000000001030307) Dec 6 05:23:25 localhost systemd-udevd[257827]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:23:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:25.573 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, 
chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fc5412c7-640b-4047-9d2b-bdde60063dc8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:25.575 137259 INFO neutron.agent.ovn.metadata.agent [-] Port fc5412c7-640b-4047-9d2b-bdde60063dc8 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:23:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:25.578 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 589bef26-590b-4015-afed-760f0fc7ba80 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:23:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:25.578 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:25.579 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[4d6c1fa6-191b-440e-af2d-ac83a3993e7e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:25 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:25 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:25 localhost ovn_controller[131684]: 2025-12-06T10:23:25Z|00316|binding|INFO|Setting lport fc5412c7-640b-4047-9d2b-bdde60063dc8 ovn-installed in OVS Dec 6 05:23:25 localhost ovn_controller[131684]: 2025-12-06T10:23:25Z|00317|binding|INFO|Setting lport fc5412c7-640b-4047-9d2b-bdde60063dc8 up in Southbound Dec 6 05:23:25 localhost nova_compute[237281]: 2025-12-06 
10:23:25.600 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:25 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:25 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:25 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:25 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:25 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:25 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:25 localhost nova_compute[237281]: 2025-12-06 10:23:25.636 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:25 localhost nova_compute[237281]: 2025-12-06 10:23:25.661 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:26 localhost nova_compute[237281]: 2025-12-06 10:23:26.240 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:26 localhost podman[257898]: Dec 6 05:23:26 localhost podman[257898]: 2025-12-06 10:23:26.405563911 +0000 UTC m=+0.079109827 container create ee9b1847c1b6b52457ceaa3f23a35441b4f0e3783609b63ba17806d31c38fe7a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Dec 6 05:23:26 localhost systemd[1]: Started libpod-conmon-ee9b1847c1b6b52457ceaa3f23a35441b4f0e3783609b63ba17806d31c38fe7a.scope. Dec 6 05:23:26 localhost systemd[1]: Started libcrun container. Dec 6 05:23:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d631d766b521a09f6af6d1df8d7c3fe8556e866bd57517df51574ba5f7b6fcb1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:23:26 localhost podman[257898]: 2025-12-06 10:23:26.369919624 +0000 UTC m=+0.043465550 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:23:26 localhost podman[257898]: 2025-12-06 10:23:26.474591168 +0000 UTC m=+0.148137124 container init ee9b1847c1b6b52457ceaa3f23a35441b4f0e3783609b63ba17806d31c38fe7a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:23:26 localhost podman[257898]: 2025-12-06 10:23:26.483516102 +0000 UTC m=+0.157062108 container start ee9b1847c1b6b52457ceaa3f23a35441b4f0e3783609b63ba17806d31c38fe7a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, 
org.label-schema.license=GPLv2) Dec 6 05:23:26 localhost dnsmasq[257915]: started, version 2.85 cachesize 150 Dec 6 05:23:26 localhost dnsmasq[257915]: DNS service limited to local subnets Dec 6 05:23:26 localhost dnsmasq[257915]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:23:26 localhost dnsmasq[257915]: warning: no upstream servers configured Dec 6 05:23:26 localhost dnsmasq-dhcp[257915]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:23:26 localhost dnsmasq[257915]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:23:26 localhost dnsmasq-dhcp[257915]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:26 localhost dnsmasq-dhcp[257915]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:26 localhost ovn_controller[131684]: 2025-12-06T10:23:26Z|00318|binding|INFO|Releasing lport fc5412c7-640b-4047-9d2b-bdde60063dc8 from this chassis (sb_readonly=0) Dec 6 05:23:26 localhost kernel: device tapfc5412c7-64 left promiscuous mode Dec 6 05:23:26 localhost nova_compute[237281]: 2025-12-06 10:23:26.552 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:26 localhost ovn_controller[131684]: 2025-12-06T10:23:26Z|00319|binding|INFO|Setting lport fc5412c7-640b-4047-9d2b-bdde60063dc8 down in Southbound Dec 6 05:23:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:26.567 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, 
parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fc5412c7-640b-4047-9d2b-bdde60063dc8) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:26 localhost nova_compute[237281]: 2025-12-06 10:23:26.570 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:26.572 137259 INFO neutron.agent.ovn.metadata.agent [-] Port fc5412c7-640b-4047-9d2b-bdde60063dc8 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:23:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:26.575 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:26.576 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[7b5c2d7b-b603-4017-a332-2cb77d0e6cf7]: (4, 
False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:26 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:26.662 219384 INFO neutron.agent.dhcp.agent [None req-3058903a-4ccf-47e1-8e7d-7c0d0c3889aa - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:23:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9825 DF PROTO=TCP SPT=36902 DPT=9102 SEQ=3511509215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDE59870000000001030307) Dec 6 05:23:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15668 DF PROTO=TCP SPT=52728 DPT=9102 SEQ=3354546070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDE5D870000000001030307) Dec 6 05:23:28 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:28.400 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:91:81 2001:db8:0:1:f816:3eff:fe4e:9181'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe4e:9181/64', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '30', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3ba22961-ba53-4fab-b867-7a59008889f5) old=Port_Binding(mac=['fa:16:3e:4e:91:81 2001:db8::f816:3eff:fe4e:9181'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4e:9181/64', 'neutron:device_id': 'ovnmeta-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:28 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:28.402 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3ba22961-ba53-4fab-b867-7a59008889f5 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a updated#033[00m Dec 6 05:23:28 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:28.404 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:28 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:28.405 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[9e0418fa-ea81-4bd2-8b11-568ef1264ef7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:29 localhost 
nova_compute[237281]: 2025-12-06 10:23:29.179 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:30 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:30.182 2 INFO neutron.agent.securitygroups_rpc [None req-693cc745-73f2-446b-98f0-2893f625555e f618f374afde4343ba53286161ba5ec6 7445feb682a34a189b4a8ce856532376 - - default default] Security group member updated ['db77da59-7505-46d0-bbe6-666c35195446']#033[00m Dec 6 05:23:30 localhost dnsmasq[257915]: exiting on receipt of SIGTERM Dec 6 05:23:30 localhost podman[257935]: 2025-12-06 10:23:30.659938343 +0000 UTC m=+0.056640335 container kill ee9b1847c1b6b52457ceaa3f23a35441b4f0e3783609b63ba17806d31c38fe7a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:23:30 localhost systemd[1]: libpod-ee9b1847c1b6b52457ceaa3f23a35441b4f0e3783609b63ba17806d31c38fe7a.scope: Deactivated successfully. 
Dec 6 05:23:30 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:30.692 2 INFO neutron.agent.securitygroups_rpc [None req-d0cd8160-6079-4d2c-a7e8-53c916852e44 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:23:30 localhost podman[257948]: 2025-12-06 10:23:30.734484199 +0000 UTC m=+0.060647359 container died ee9b1847c1b6b52457ceaa3f23a35441b4f0e3783609b63ba17806d31c38fe7a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:23:30 localhost systemd[1]: tmp-crun.zvPWCP.mount: Deactivated successfully. Dec 6 05:23:30 localhost podman[257948]: 2025-12-06 10:23:30.775279385 +0000 UTC m=+0.101442475 container cleanup ee9b1847c1b6b52457ceaa3f23a35441b4f0e3783609b63ba17806d31c38fe7a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:23:30 localhost systemd[1]: libpod-conmon-ee9b1847c1b6b52457ceaa3f23a35441b4f0e3783609b63ba17806d31c38fe7a.scope: Deactivated successfully. 
Dec 6 05:23:30 localhost podman[257950]: 2025-12-06 10:23:30.801810773 +0000 UTC m=+0.120555274 container remove ee9b1847c1b6b52457ceaa3f23a35441b4f0e3783609b63ba17806d31c38fe7a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:23:30 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:30.851 219384 INFO neutron.agent.linux.ip_lib [None req-0ccdd223-961e-4134-b2aa-58c4604674b2 - - - - - -] Device tapfc5412c7-64 cannot be used as it has no MAC address#033[00m Dec 6 05:23:30 localhost nova_compute[237281]: 2025-12-06 10:23:30.910 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:30 localhost kernel: device tapfc5412c7-64 entered promiscuous mode Dec 6 05:23:30 localhost ovn_controller[131684]: 2025-12-06T10:23:30Z|00320|binding|INFO|Claiming lport fc5412c7-640b-4047-9d2b-bdde60063dc8 for this chassis. Dec 6 05:23:30 localhost NetworkManager[5965]: [1765016610.9172] manager: (tapfc5412c7-64): new Generic device (/org/freedesktop/NetworkManager/Devices/55) Dec 6 05:23:30 localhost ovn_controller[131684]: 2025-12-06T10:23:30Z|00321|binding|INFO|fc5412c7-640b-4047-9d2b-bdde60063dc8: Claiming unknown Dec 6 05:23:30 localhost nova_compute[237281]: 2025-12-06 10:23:30.918 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:30 localhost systemd-udevd[257983]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:23:30 localhost ovn_controller[131684]: 2025-12-06T10:23:30Z|00322|binding|INFO|Setting lport fc5412c7-640b-4047-9d2b-bdde60063dc8 ovn-installed in OVS Dec 6 05:23:30 localhost nova_compute[237281]: 2025-12-06 10:23:30.927 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:30 localhost nova_compute[237281]: 2025-12-06 10:23:30.930 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:30 localhost ovn_controller[131684]: 2025-12-06T10:23:30Z|00323|binding|INFO|Setting lport fc5412c7-640b-4047-9d2b-bdde60063dc8 up in Southbound Dec 6 05:23:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:30.935 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe21:bbc4/64 2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, 
gateway_chassis=[], requested_chassis=[], logical_port=fc5412c7-640b-4047-9d2b-bdde60063dc8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:30.939 137259 INFO neutron.agent.ovn.metadata.agent [-] Port fc5412c7-640b-4047-9d2b-bdde60063dc8 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:23:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:30.943 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 589bef26-590b-4015-afed-760f0fc7ba80 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:23:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:30.943 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:30.944 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[5614d6ba-9705-435a-a713-e143284868b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:30 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:30 localhost nova_compute[237281]: 2025-12-06 10:23:30.960 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:30 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:30 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:30 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 
05:23:30 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:30 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:30 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:30 localhost journal[186952]: ethtool ioctl error on tapfc5412c7-64: No such device Dec 6 05:23:31 localhost nova_compute[237281]: 2025-12-06 10:23:31.001 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:31 localhost nova_compute[237281]: 2025-12-06 10:23:31.029 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9826 DF PROTO=TCP SPT=36902 DPT=9102 SEQ=3511509215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDE69480000000001030307) Dec 6 05:23:31 localhost nova_compute[237281]: 2025-12-06 10:23:31.243 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:31 localhost systemd[1]: var-lib-containers-storage-overlay-d631d766b521a09f6af6d1df8d7c3fe8556e866bd57517df51574ba5f7b6fcb1-merged.mount: Deactivated successfully. Dec 6 05:23:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee9b1847c1b6b52457ceaa3f23a35441b4f0e3783609b63ba17806d31c38fe7a-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:23:31 localhost podman[258054]: Dec 6 05:23:31 localhost podman[258054]: 2025-12-06 10:23:31.825286655 +0000 UTC m=+0.104739057 container create e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:23:31 localhost podman[258054]: 2025-12-06 10:23:31.768691302 +0000 UTC m=+0.048143704 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:23:31 localhost systemd[1]: Started libpod-conmon-e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0.scope. Dec 6 05:23:31 localhost systemd[1]: Started libcrun container. 
Dec 6 05:23:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c650782e5240b78d2d427f222b89b1bdc84edf6f5eb6f521c0b4ba970e1d4204/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:23:31 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:31.898 2 INFO neutron.agent.securitygroups_rpc [None req-7ade3f66-de93-4ab2-95dd-2425408ea8dd f60b758b7db74f959e7135e2eb1f62db b415850c5b2b4306b771adda1881b474 - - default default] Security group member updated ['217ecf3e-dbf5-427c-a24f-01db6b2f0c37']#033[00m Dec 6 05:23:31 localhost podman[258054]: 2025-12-06 10:23:31.902789472 +0000 UTC m=+0.182241874 container init e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125) Dec 6 05:23:31 localhost dnsmasq[258073]: started, version 2.85 cachesize 150 Dec 6 05:23:31 localhost dnsmasq[258073]: DNS service limited to local subnets Dec 6 05:23:31 localhost dnsmasq[258073]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:23:31 localhost dnsmasq[258073]: warning: no upstream servers configured Dec 6 05:23:31 localhost dnsmasq-dhcp[258073]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:23:31 localhost dnsmasq[258073]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:23:31 localhost dnsmasq-dhcp[258073]: read 
/var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:31 localhost dnsmasq-dhcp[258073]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:31 localhost podman[258054]: 2025-12-06 10:23:31.928535515 +0000 UTC m=+0.207987907 container start e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 6 05:23:32 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:32.228 2 INFO neutron.agent.securitygroups_rpc [None req-587716c6-dbeb-47b4-8f89-f42c7950a4f7 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:23:32 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:32.328 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:23:32 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:32.369 219384 INFO neutron.agent.dhcp.agent [None req-bfa1d246-1aa6-4a2a-a7ae-57b52f40f596 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', 'fc5412c7-640b-4047-9d2b-bdde60063dc8'} is completed#033[00m Dec 6 05:23:32 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:32.547 2 INFO neutron.agent.securitygroups_rpc [None req-7ade3f66-de93-4ab2-95dd-2425408ea8dd f60b758b7db74f959e7135e2eb1f62db b415850c5b2b4306b771adda1881b474 - - default default] Security group member updated ['217ecf3e-dbf5-427c-a24f-01db6b2f0c37']#033[00m Dec 
6 05:23:32 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:32.565 2 INFO neutron.agent.securitygroups_rpc [None req-bacb7813-88f9-4747-9f21-8f96788b9259 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:23:32 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:32.797 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:23:31Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=30661ffa-3ac7-4193-98eb-24059f1f3029, ip_allocation=immediate, mac_address=fa:16:3e:89:21:10, name=tempest-NetworksTestDHCPv6-272173492, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=57, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['bd16d669-beca-42c6-84a1-e05b51e52ed8', 'dcd710b3-f642-4a70-9781-c8f2f7c3e53d'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:23:25Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], 
standard_attr_id=1765, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:23:32Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:23:32 localhost dnsmasq[258073]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 2 addresses Dec 6 05:23:32 localhost dnsmasq-dhcp[258073]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:32 localhost podman[258092]: 2025-12-06 10:23:32.994799124 +0000 UTC m=+0.062093173 container kill e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:23:32 localhost dnsmasq-dhcp[258073]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:23:33 localhost podman[258112]: 2025-12-06 10:23:33.306948029 +0000 UTC m=+0.064868579 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 05:23:33 localhost podman[258112]: 2025-12-06 10:23:33.350076927 +0000 UTC m=+0.107997487 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:23:33 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:23:33 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:33.507 219384 INFO neutron.agent.dhcp.agent [None req-fc38db13-33ca-459b-abe5-2bd0e3d87281 - - - - - -] DHCP configuration for ports {'30661ffa-3ac7-4193-98eb-24059f1f3029'} is completed#033[00m Dec 6 05:23:34 localhost nova_compute[237281]: 2025-12-06 10:23:34.182 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:34 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:34.857 2 INFO neutron.agent.securitygroups_rpc [None req-60a4c70f-e7e0-4854-97b5-30d2349f21b1 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:23:35 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:35.561 2 INFO neutron.agent.securitygroups_rpc [None req-1e40b915-16d6-4bb2-b69f-cea68c2bf731 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:23:35 localhost podman[258156]: 2025-12-06 10:23:35.755205633 +0000 UTC m=+0.053519519 container kill e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:23:35 localhost dnsmasq[258073]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:23:35 localhost dnsmasq-dhcp[258073]: read 
/var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:35 localhost systemd[1]: tmp-crun.jX6Ths.mount: Deactivated successfully. Dec 6 05:23:35 localhost dnsmasq-dhcp[258073]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:36 localhost nova_compute[237281]: 2025-12-06 10:23:36.246 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:36 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:36.339 2 INFO neutron.agent.securitygroups_rpc [None req-73892d9b-bf39-4448-9275-f85cff420b13 f60b758b7db74f959e7135e2eb1f62db b415850c5b2b4306b771adda1881b474 - - default default] Security group member updated ['217ecf3e-dbf5-427c-a24f-01db6b2f0c37']#033[00m Dec 6 05:23:36 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:36.788 2 INFO neutron.agent.securitygroups_rpc [None req-29219d16-fb10-4f8e-8f97-16be3ec17cc6 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:23:36 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:36.820 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:23:37 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:37.368 2 INFO neutron.agent.securitygroups_rpc [None req-40fcc59d-8549-4f6c-8b07-5c8003032eb2 f60b758b7db74f959e7135e2eb1f62db b415850c5b2b4306b771adda1881b474 - - default default] Security group member updated ['217ecf3e-dbf5-427c-a24f-01db6b2f0c37']#033[00m Dec 6 05:23:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:37.395 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:23:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. 
Dec 6 05:23:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:23:37 localhost dnsmasq[258073]: exiting on receipt of SIGTERM Dec 6 05:23:37 localhost podman[258194]: 2025-12-06 10:23:37.5071238 +0000 UTC m=+0.064785617 container kill e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 6 05:23:37 localhost systemd[1]: libpod-e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0.scope: Deactivated successfully. Dec 6 05:23:37 localhost systemd[1]: tmp-crun.FbdKEB.mount: Deactivated successfully. 
Dec 6 05:23:37 localhost podman[258204]: 2025-12-06 10:23:37.570058018 +0000 UTC m=+0.100987761 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:23:37 localhost podman[258226]: 2025-12-06 10:23:37.583320266 +0000 UTC m=+0.060226855 container died e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 
05:23:37 localhost podman[258205]: 2025-12-06 10:23:37.619620034 +0000 UTC m=+0.144136360 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:23:37 localhost podman[258204]: 2025-12-06 10:23:37.630226971 +0000 UTC m=+0.161156774 container exec_died 
4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:23:37 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:23:37 localhost podman[258226]: 2025-12-06 10:23:37.658624556 +0000 UTC m=+0.135531135 container cleanup e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:23:37 localhost systemd[1]: libpod-conmon-e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0.scope: Deactivated successfully. Dec 6 05:23:37 localhost podman[258227]: 2025-12-06 10:23:37.682497121 +0000 UTC m=+0.155121549 container remove e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:23:37 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:37.695 2 INFO neutron.agent.securitygroups_rpc [None req-7fc2d29c-e04c-4929-b70a-07e07aeef024 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:23:37 localhost podman[258205]: 2025-12-06 10:23:37.732839302 +0000 UTC m=+0.257355638 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd) Dec 6 05:23:37 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:23:37 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:37.937 2 INFO neutron.agent.securitygroups_rpc [None req-04232c1f-dd68-46f0-9328-494c51197809 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:23:38 localhost systemd[1]: var-lib-containers-storage-overlay-c650782e5240b78d2d427f222b89b1bdc84edf6f5eb6f521c0b4ba970e1d4204-merged.mount: Deactivated successfully. Dec 6 05:23:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e1f302186ac15a448de4e85f11ecdf2665b253c27cb3febb0bacd996237860a0-userdata-shm.mount: Deactivated successfully. Dec 6 05:23:38 localhost podman[258329]: Dec 6 05:23:38 localhost podman[258329]: 2025-12-06 10:23:38.519243752 +0000 UTC m=+0.088637361 container create 375817436f06725fec2060313d31a2e0ded868f34e5ce5a28d9ab095ace632d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:23:38 localhost systemd[1]: Started libpod-conmon-375817436f06725fec2060313d31a2e0ded868f34e5ce5a28d9ab095ace632d3.scope. Dec 6 05:23:38 localhost systemd[1]: Started libcrun container. 
Dec 6 05:23:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eec520c087ad7ab7f08ed81155b309bbdf2155e52892f830c13a1a4754b5b584/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:23:38 localhost podman[258329]: 2025-12-06 10:23:38.475954039 +0000 UTC m=+0.045347688 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:23:38 localhost podman[258329]: 2025-12-06 10:23:38.582037536 +0000 UTC m=+0.151431155 container init 375817436f06725fec2060313d31a2e0ded868f34e5ce5a28d9ab095ace632d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:23:38 localhost podman[258329]: 2025-12-06 10:23:38.591460036 +0000 UTC m=+0.160853645 container start 375817436f06725fec2060313d31a2e0ded868f34e5ce5a28d9ab095ace632d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:23:38 localhost dnsmasq[258347]: started, version 2.85 cachesize 150 Dec 6 05:23:38 localhost dnsmasq[258347]: DNS service limited to local subnets Dec 6 05:23:38 localhost dnsmasq[258347]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:23:38 localhost dnsmasq[258347]: warning: no upstream servers configured Dec 6 05:23:38 localhost dnsmasq-dhcp[258347]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:23:38 localhost dnsmasq[258347]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:23:38 localhost dnsmasq-dhcp[258347]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:38 localhost dnsmasq-dhcp[258347]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:38 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:38.864 219384 INFO neutron.agent.dhcp.agent [None req-611ef86f-7c6c-4622-b56b-b06cbc9df07f - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', 'fc5412c7-640b-4047-9d2b-bdde60063dc8'} is completed#033[00m Dec 6 05:23:38 localhost dnsmasq[258347]: exiting on receipt of SIGTERM Dec 6 05:23:38 localhost podman[258366]: 2025-12-06 10:23:38.970555802 +0000 UTC m=+0.062476256 container kill 375817436f06725fec2060313d31a2e0ded868f34e5ce5a28d9ab095ace632d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:23:38 localhost systemd[1]: libpod-375817436f06725fec2060313d31a2e0ded868f34e5ce5a28d9ab095ace632d3.scope: Deactivated successfully. 
Dec 6 05:23:39 localhost podman[258379]: 2025-12-06 10:23:39.026550917 +0000 UTC m=+0.042824971 container died 375817436f06725fec2060313d31a2e0ded868f34e5ce5a28d9ab095ace632d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 05:23:39 localhost podman[258379]: 2025-12-06 10:23:39.063256417 +0000 UTC m=+0.079530411 container cleanup 375817436f06725fec2060313d31a2e0ded868f34e5ce5a28d9ab095ace632d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 05:23:39 localhost systemd[1]: libpod-conmon-375817436f06725fec2060313d31a2e0ded868f34e5ce5a28d9ab095ace632d3.scope: Deactivated successfully. 
Dec 6 05:23:39 localhost podman[258380]: 2025-12-06 10:23:39.110863304 +0000 UTC m=+0.119400179 container remove 375817436f06725fec2060313d31a2e0ded868f34e5ce5a28d9ab095ace632d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Dec 6 05:23:39 localhost nova_compute[237281]: 2025-12-06 10:23:39.123 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:39 localhost kernel: device tapfc5412c7-64 left promiscuous mode Dec 6 05:23:39 localhost ovn_controller[131684]: 2025-12-06T10:23:39Z|00324|binding|INFO|Releasing lport fc5412c7-640b-4047-9d2b-bdde60063dc8 from this chassis (sb_readonly=0) Dec 6 05:23:39 localhost ovn_controller[131684]: 2025-12-06T10:23:39Z|00325|binding|INFO|Setting lport fc5412c7-640b-4047-9d2b-bdde60063dc8 down in Southbound Dec 6 05:23:39 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:39.138 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe21:bbc4/64 2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fc5412c7-640b-4047-9d2b-bdde60063dc8) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:39 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:39.140 137259 INFO neutron.agent.ovn.metadata.agent [-] Port fc5412c7-640b-4047-9d2b-bdde60063dc8 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:23:39 localhost nova_compute[237281]: 2025-12-06 10:23:39.141 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:39 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:39.143 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:39 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:39.144 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c65f1c4f-19da-43c3-a80a-12ab21936b18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:39 localhost nova_compute[237281]: 2025-12-06 10:23:39.183 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9827 DF PROTO=TCP SPT=36902 DPT=9102 SEQ=3511509215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDE89870000000001030307) Dec 6 05:23:39 localhost systemd[1]: var-lib-containers-storage-overlay-eec520c087ad7ab7f08ed81155b309bbdf2155e52892f830c13a1a4754b5b584-merged.mount: Deactivated successfully. Dec 6 05:23:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-375817436f06725fec2060313d31a2e0ded868f34e5ce5a28d9ab095ace632d3-userdata-shm.mount: Deactivated successfully. Dec 6 05:23:40 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. Dec 6 05:23:40 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:40.815 2 INFO neutron.agent.securitygroups_rpc [None req-67d22c65-f001-43c3-acc8-54e9823eaaac 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:23:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:40.908 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:23:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:40.988 219384 INFO neutron.agent.linux.ip_lib [None req-657e82f7-e391-4302-b812-75691cc1ea84 - - - - - -] Device tap4a100b7a-c7 cannot be used as it has no MAC address#033[00m Dec 6 05:23:41 localhost nova_compute[237281]: 2025-12-06 10:23:41.054 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:41 localhost kernel: device tap4a100b7a-c7 entered promiscuous mode Dec 6 
05:23:41 localhost ovn_controller[131684]: 2025-12-06T10:23:41Z|00326|binding|INFO|Claiming lport 4a100b7a-c73f-4755-9e86-cb876d010696 for this chassis. Dec 6 05:23:41 localhost ovn_controller[131684]: 2025-12-06T10:23:41Z|00327|binding|INFO|4a100b7a-c73f-4755-9e86-cb876d010696: Claiming unknown Dec 6 05:23:41 localhost NetworkManager[5965]: [1765016621.0641] manager: (tap4a100b7a-c7): new Generic device (/org/freedesktop/NetworkManager/Devices/56) Dec 6 05:23:41 localhost nova_compute[237281]: 2025-12-06 10:23:41.064 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:41 localhost systemd-udevd[258416]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:23:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:41.074 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, 
gateway_chassis=[], requested_chassis=[], logical_port=4a100b7a-c73f-4755-9e86-cb876d010696) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:41 localhost ovn_controller[131684]: 2025-12-06T10:23:41Z|00328|binding|INFO|Setting lport 4a100b7a-c73f-4755-9e86-cb876d010696 ovn-installed in OVS Dec 6 05:23:41 localhost ovn_controller[131684]: 2025-12-06T10:23:41Z|00329|binding|INFO|Setting lport 4a100b7a-c73f-4755-9e86-cb876d010696 up in Southbound Dec 6 05:23:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:41.076 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 4a100b7a-c73f-4755-9e86-cb876d010696 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:23:41 localhost nova_compute[237281]: 2025-12-06 10:23:41.077 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:41 localhost nova_compute[237281]: 2025-12-06 10:23:41.079 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:41.078 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port a80ab8e5-a60a-4329-9b00-a45a6710a7aa IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:23:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:41.079 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:41.080 137360 DEBUG oslo.privsep.daemon [-] 
privsep: reply[a9026198-ba96-4df4-8269-a1ad28cf5ba1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:41 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:41 localhost nova_compute[237281]: 2025-12-06 10:23:41.113 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:41 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:41 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:41 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:41 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:41 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:41 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:41 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:41 localhost nova_compute[237281]: 2025-12-06 10:23:41.157 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:41 localhost nova_compute[237281]: 2025-12-06 10:23:41.187 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:41 localhost nova_compute[237281]: 2025-12-06 10:23:41.248 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:41 localhost podman[258487]: Dec 6 05:23:41 localhost podman[258487]: 2025-12-06 10:23:41.968668062 +0000 UTC m=+0.086463215 container create 853ac23cab1d8b41bcaa90f537bc7fb78ec20f120b83fcdea908b11c543619d5 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:23:42 localhost systemd[1]: Started libpod-conmon-853ac23cab1d8b41bcaa90f537bc7fb78ec20f120b83fcdea908b11c543619d5.scope. Dec 6 05:23:42 localhost systemd[1]: Started libcrun container. Dec 6 05:23:42 localhost podman[258487]: 2025-12-06 10:23:41.926351788 +0000 UTC m=+0.044146971 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:23:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3b3ad3c97857eb25e818c9322da876b09870d00e7652afd7593b28f5b597bbf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:23:42 localhost podman[258487]: 2025-12-06 10:23:42.033869899 +0000 UTC m=+0.151665042 container init 853ac23cab1d8b41bcaa90f537bc7fb78ec20f120b83fcdea908b11c543619d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:23:42 localhost podman[258487]: 2025-12-06 10:23:42.042614989 +0000 UTC m=+0.160410142 container start 853ac23cab1d8b41bcaa90f537bc7fb78ec20f120b83fcdea908b11c543619d5 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:23:42 localhost dnsmasq[258505]: started, version 2.85 cachesize 150 Dec 6 05:23:42 localhost dnsmasq[258505]: DNS service limited to local subnets Dec 6 05:23:42 localhost dnsmasq[258505]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:23:42 localhost dnsmasq[258505]: warning: no upstream servers configured Dec 6 05:23:42 localhost dnsmasq-dhcp[258505]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:23:42 localhost dnsmasq[258505]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:23:42 localhost dnsmasq-dhcp[258505]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:42 localhost dnsmasq-dhcp[258505]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:42 localhost nova_compute[237281]: 2025-12-06 10:23:42.185 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:42 localhost kernel: device tap4a100b7a-c7 left promiscuous mode Dec 6 05:23:42 localhost ovn_controller[131684]: 2025-12-06T10:23:42Z|00330|binding|INFO|Releasing lport 4a100b7a-c73f-4755-9e86-cb876d010696 from this chassis (sb_readonly=0) Dec 6 05:23:42 localhost ovn_controller[131684]: 2025-12-06T10:23:42Z|00331|binding|INFO|Setting lport 4a100b7a-c73f-4755-9e86-cb876d010696 down 
in Southbound Dec 6 05:23:42 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:42.200 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4a100b7a-c73f-4755-9e86-cb876d010696) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:42 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:42.202 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 4a100b7a-c73f-4755-9e86-cb876d010696 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:23:42 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:42.205 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:42 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:42.206 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c802610e-cdd6-4a99-8d20-a7b5fb3bf48d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:42 localhost nova_compute[237281]: 2025-12-06 10:23:42.215 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:42.247 219384 INFO neutron.agent.dhcp.agent [None req-c34b3561-370d-4282-a97a-2da23afcfb44 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5'} is completed#033[00m Dec 6 05:23:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:23:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:23:42 localhost podman[258509]: 2025-12-06 10:23:42.541941548 +0000 UTC m=+0.073971879 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 6 05:23:42 localhost podman[258509]: 2025-12-06 10:23:42.574271264 +0000 UTC m=+0.106301575 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 6 05:23:42 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:23:42 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:42.591 2 INFO neutron.agent.securitygroups_rpc [None req-7ddea2b8-ca6d-468b-8b60-e84cabe42f73 f618f374afde4343ba53286161ba5ec6 7445feb682a34a189b4a8ce856532376 - - default default] Security group member updated ['db77da59-7505-46d0-bbe6-666c35195446']#033[00m Dec 6 05:23:42 localhost podman[258510]: 2025-12-06 10:23:42.663509652 +0000 UTC m=+0.190456767 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck 
compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125) Dec 6 05:23:42 localhost podman[258510]: 2025-12-06 10:23:42.675986706 +0000 UTC m=+0.202933781 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:23:42 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:23:43 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:43.591 2 INFO neutron.agent.securitygroups_rpc [None req-52502a9b-6c93-49f9-9827-5dbb328ceb60 f618f374afde4343ba53286161ba5ec6 7445feb682a34a189b4a8ce856532376 - - default default] Security group member updated ['db77da59-7505-46d0-bbe6-666c35195446']#033[00m Dec 6 05:23:44 localhost nova_compute[237281]: 2025-12-06 10:23:44.185 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:46 localhost openstack_network_exporter[199751]: ERROR 10:23:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:23:46 localhost openstack_network_exporter[199751]: ERROR 10:23:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:23:46 localhost openstack_network_exporter[199751]: ERROR 10:23:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:23:46 localhost openstack_network_exporter[199751]: ERROR 10:23:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:23:46 localhost openstack_network_exporter[199751]: Dec 6 05:23:46 localhost openstack_network_exporter[199751]: ERROR 10:23:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:23:46 localhost openstack_network_exporter[199751]: Dec 6 05:23:46 localhost nova_compute[237281]: 2025-12-06 10:23:46.251 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:47 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:47.275 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:23:47 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:47.489 2 INFO 
neutron.agent.securitygroups_rpc [None req-459d55d3-7e33-4abf-a5fc-24034f42ed1f 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:23:47 localhost dnsmasq[258505]: exiting on receipt of SIGTERM Dec 6 05:23:47 localhost systemd[1]: libpod-853ac23cab1d8b41bcaa90f537bc7fb78ec20f120b83fcdea908b11c543619d5.scope: Deactivated successfully. Dec 6 05:23:47 localhost podman[258564]: 2025-12-06 10:23:47.5164809 +0000 UTC m=+0.064026083 container kill 853ac23cab1d8b41bcaa90f537bc7fb78ec20f120b83fcdea908b11c543619d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 6 05:23:47 localhost podman[258578]: 2025-12-06 10:23:47.594199254 +0000 UTC m=+0.062575808 container died 853ac23cab1d8b41bcaa90f537bc7fb78ec20f120b83fcdea908b11c543619d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:23:47 localhost podman[258578]: 2025-12-06 10:23:47.633653239 +0000 UTC m=+0.102029723 container cleanup 853ac23cab1d8b41bcaa90f537bc7fb78ec20f120b83fcdea908b11c543619d5 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:23:47 localhost systemd[1]: libpod-conmon-853ac23cab1d8b41bcaa90f537bc7fb78ec20f120b83fcdea908b11c543619d5.scope: Deactivated successfully. Dec 6 05:23:47 localhost podman[258580]: 2025-12-06 10:23:47.67137248 +0000 UTC m=+0.131664286 container remove 853ac23cab1d8b41bcaa90f537bc7fb78ec20f120b83fcdea908b11c543619d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:23:47 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:47.730 219384 INFO neutron.agent.linux.ip_lib [None req-6505c986-94fb-45f7-b782-f1e3c7e8bebe - - - - - -] Device tap4a100b7a-c7 cannot be used as it has no MAC address#033[00m Dec 6 05:23:47 localhost nova_compute[237281]: 2025-12-06 10:23:47.800 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:47 localhost kernel: device tap4a100b7a-c7 entered promiscuous mode Dec 6 05:23:47 localhost NetworkManager[5965]: [1765016627.8082] manager: (tap4a100b7a-c7): new Generic device (/org/freedesktop/NetworkManager/Devices/57) Dec 6 
05:23:47 localhost nova_compute[237281]: 2025-12-06 10:23:47.810 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:47 localhost ovn_controller[131684]: 2025-12-06T10:23:47Z|00332|binding|INFO|Claiming lport 4a100b7a-c73f-4755-9e86-cb876d010696 for this chassis. Dec 6 05:23:47 localhost ovn_controller[131684]: 2025-12-06T10:23:47Z|00333|binding|INFO|4a100b7a-c73f-4755-9e86-cb876d010696: Claiming unknown Dec 6 05:23:47 localhost systemd-udevd[258612]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:23:47 localhost ovn_controller[131684]: 2025-12-06T10:23:47Z|00334|binding|INFO|Setting lport 4a100b7a-c73f-4755-9e86-cb876d010696 ovn-installed in OVS Dec 6 05:23:47 localhost nova_compute[237281]: 2025-12-06 10:23:47.822 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:47 localhost ovn_controller[131684]: 2025-12-06T10:23:47Z|00335|binding|INFO|Setting lport 4a100b7a-c73f-4755-9e86-cb876d010696 up in Southbound Dec 6 05:23:47 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:47 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:47.833 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe11:631c/64 2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4a100b7a-c73f-4755-9e86-cb876d010696) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:47 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:47.835 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 4a100b7a-c73f-4755-9e86-cb876d010696 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a bound to our chassis#033[00m Dec 6 05:23:47 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:47.838 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port a80ab8e5-a60a-4329-9b00-a45a6710a7aa IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:23:47 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:47.838 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:47 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:47.839 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c7872777-ab94-4c76-b06f-822cc13caa46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:47 localhost nova_compute[237281]: 2025-12-06 10:23:47.845 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:47 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:47 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:47 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:47 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:47 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:47 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:47 localhost journal[186952]: ethtool ioctl error on tap4a100b7a-c7: No such device Dec 6 05:23:47 localhost nova_compute[237281]: 2025-12-06 10:23:47.887 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:47 localhost nova_compute[237281]: 2025-12-06 10:23:47.916 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:48 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:48.066 219384 INFO neutron.agent.linux.ip_lib [None req-c874e828-0400-4ef8-acd0-f53c807ca8c4 - - - - - -] Device tapa9af25e7-bc cannot be used as it has no MAC address#033[00m Dec 6 05:23:48 localhost nova_compute[237281]: 2025-12-06 10:23:48.101 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:48 localhost kernel: device tapa9af25e7-bc entered promiscuous mode Dec 6 05:23:48 localhost systemd-udevd[258614]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:23:48 localhost NetworkManager[5965]: [1765016628.1082] manager: (tapa9af25e7-bc): new Generic device (/org/freedesktop/NetworkManager/Devices/58) Dec 6 05:23:48 localhost ovn_controller[131684]: 2025-12-06T10:23:48Z|00336|binding|INFO|Claiming lport a9af25e7-bca3-4a81-96c6-a856afcfe29d for this chassis. Dec 6 05:23:48 localhost ovn_controller[131684]: 2025-12-06T10:23:48Z|00337|binding|INFO|a9af25e7-bca3-4a81-96c6-a856afcfe29d: Claiming unknown Dec 6 05:23:48 localhost nova_compute[237281]: 2025-12-06 10:23:48.122 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:48.125 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-1784874d-a6fb-4581-8d90-fd90e82d3846', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1784874d-a6fb-4581-8d90-fd90e82d3846', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b415850c5b2b4306b771adda1881b474', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c320c59-9905-4a73-a0b6-a7e1cb374570, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a9af25e7-bca3-4a81-96c6-a856afcfe29d) 
old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:48.127 137259 INFO neutron.agent.ovn.metadata.agent [-] Port a9af25e7-bca3-4a81-96c6-a856afcfe29d in datapath 1784874d-a6fb-4581-8d90-fd90e82d3846 bound to our chassis#033[00m Dec 6 05:23:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:48.129 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1784874d-a6fb-4581-8d90-fd90e82d3846 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:23:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:48.130 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[4a8a096a-2082-4d18-ae46-077a752893e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:48 localhost ovn_controller[131684]: 2025-12-06T10:23:48Z|00338|binding|INFO|Setting lport a9af25e7-bca3-4a81-96c6-a856afcfe29d ovn-installed in OVS Dec 6 05:23:48 localhost ovn_controller[131684]: 2025-12-06T10:23:48Z|00339|binding|INFO|Setting lport a9af25e7-bca3-4a81-96c6-a856afcfe29d up in Southbound Dec 6 05:23:48 localhost nova_compute[237281]: 2025-12-06 10:23:48.158 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:48 localhost nova_compute[237281]: 2025-12-06 10:23:48.202 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:48 localhost nova_compute[237281]: 2025-12-06 10:23:48.238 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:48 localhost systemd[1]: 
var-lib-containers-storage-overlay-f3b3ad3c97857eb25e818c9322da876b09870d00e7652afd7593b28f5b597bbf-merged.mount: Deactivated successfully. Dec 6 05:23:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-853ac23cab1d8b41bcaa90f537bc7fb78ec20f120b83fcdea908b11c543619d5-userdata-shm.mount: Deactivated successfully. Dec 6 05:23:48 localhost podman[258723]: Dec 6 05:23:48 localhost podman[258723]: 2025-12-06 10:23:48.835038531 +0000 UTC m=+0.086242347 container create 0db7ec05d229181a381ed10b15f3f0b13513b83b6b6a81990c52ac20f5968073 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 6 05:23:48 localhost systemd[1]: Started libpod-conmon-0db7ec05d229181a381ed10b15f3f0b13513b83b6b6a81990c52ac20f5968073.scope. Dec 6 05:23:48 localhost systemd[1]: Started libcrun container. 
Dec 6 05:23:48 localhost podman[258723]: 2025-12-06 10:23:48.795419371 +0000 UTC m=+0.046623227 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:23:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fadfabeb59e5e6129cfa5a7af30b29e108406a54bf96096fcc8651b2eaa72a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:23:48 localhost podman[258723]: 2025-12-06 10:23:48.914970263 +0000 UTC m=+0.166174079 container init 0db7ec05d229181a381ed10b15f3f0b13513b83b6b6a81990c52ac20f5968073 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 05:23:48 localhost dnsmasq[258741]: started, version 2.85 cachesize 150 Dec 6 05:23:48 localhost dnsmasq[258741]: DNS service limited to local subnets Dec 6 05:23:48 localhost dnsmasq[258741]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:23:48 localhost dnsmasq[258741]: warning: no upstream servers configured Dec 6 05:23:48 localhost dnsmasq-dhcp[258741]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Dec 6 05:23:48 localhost dnsmasq-dhcp[258741]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:23:48 localhost dnsmasq[258741]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:23:48 localhost dnsmasq-dhcp[258741]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:48 
localhost dnsmasq-dhcp[258741]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:48 localhost podman[258723]: 2025-12-06 10:23:48.931526913 +0000 UTC m=+0.182730699 container start 0db7ec05d229181a381ed10b15f3f0b13513b83b6b6a81990c52ac20f5968073 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:23:49 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:49.079 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:23:49 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:49.098 219384 INFO neutron.agent.dhcp.agent [None req-cdada148-a209-40fe-aaa6-c7cef0f3e00a - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', '4a100b7a-c73f-4755-9e86-cb876d010696'} is completed#033[00m Dec 6 05:23:49 localhost podman[258764]: Dec 6 05:23:49 localhost podman[258764]: 2025-12-06 10:23:49.151670253 +0000 UTC m=+0.096406081 container create a0655d993a706c3ddd137a230f7132470b5c466b4b10e77c8de4883266a8b0a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1784874d-a6fb-4581-8d90-fd90e82d3846, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:23:49 localhost systemd[1]: 
Started libpod-conmon-a0655d993a706c3ddd137a230f7132470b5c466b4b10e77c8de4883266a8b0a5.scope. Dec 6 05:23:49 localhost nova_compute[237281]: 2025-12-06 10:23:49.187 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:49 localhost systemd[1]: Started libcrun container. Dec 6 05:23:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db9d0d73d56c52a1c8b6c6c87550b489bb7ebf51de8a1078ddbcd6d5ddaaea16/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:23:49 localhost podman[258764]: 2025-12-06 10:23:49.105777599 +0000 UTC m=+0.050513447 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:23:49 localhost podman[258764]: 2025-12-06 10:23:49.207617036 +0000 UTC m=+0.152352854 container init a0655d993a706c3ddd137a230f7132470b5c466b4b10e77c8de4883266a8b0a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1784874d-a6fb-4581-8d90-fd90e82d3846, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 6 05:23:49 localhost podman[258764]: 2025-12-06 10:23:49.217263403 +0000 UTC m=+0.161999221 container start a0655d993a706c3ddd137a230f7132470b5c466b4b10e77c8de4883266a8b0a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1784874d-a6fb-4581-8d90-fd90e82d3846, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true) Dec 6 05:23:49 localhost dnsmasq[258782]: started, version 2.85 cachesize 150 Dec 6 05:23:49 localhost dnsmasq[258782]: DNS service limited to local subnets Dec 6 05:23:49 localhost dnsmasq[258782]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:23:49 localhost dnsmasq[258782]: warning: no upstream servers configured Dec 6 05:23:49 localhost dnsmasq-dhcp[258782]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:23:49 localhost dnsmasq[258782]: read /var/lib/neutron/dhcp/1784874d-a6fb-4581-8d90-fd90e82d3846/addn_hosts - 0 addresses Dec 6 05:23:49 localhost dnsmasq-dhcp[258782]: read /var/lib/neutron/dhcp/1784874d-a6fb-4581-8d90-fd90e82d3846/host Dec 6 05:23:49 localhost dnsmasq-dhcp[258782]: read /var/lib/neutron/dhcp/1784874d-a6fb-4581-8d90-fd90e82d3846/opts Dec 6 05:23:49 localhost nova_compute[237281]: 2025-12-06 10:23:49.254 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:23:49 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:49.263 2 INFO neutron.agent.securitygroups_rpc [None req-0317c2a0-a8b8-488e-b63f-04fb8d2bea46 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:23:49 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:49.335 219384 INFO neutron.agent.dhcp.agent [None req-a09679d2-0738-4754-8108-38ed7ec66b99 - - - - - -] DHCP configuration for ports {'79d5e094-d190-43ed-855e-c21827510f52'} is completed#033[00m Dec 6 05:23:49 localhost neutron_dhcp_agent[219380]: 
2025-12-06 10:23:49.635 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:23:48Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=4efe3ea8-5944-4882-9f81-9fb702b2d64c, ip_allocation=immediate, mac_address=fa:16:3e:95:92:83, name=tempest-NetworksTestDHCPv6-558714043, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:19:31Z, description=, dns_domain=, id=f47279f6-9d96-4d9c-849b-5ff8c250556a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1701825281, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9215, qos_policy_id=None, revision_number=61, router:external=False, shared=False, standard_attr_id=1001, status=ACTIVE, subnets=['2389e9c9-651e-4dbb-b6e0-e06ed7e5bf09', 'f63c3bdc-b6d6-4613-8556-f276fc2f4bce'], tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:23:41Z, vlan_transparent=None, network_id=f47279f6-9d96-4d9c-849b-5ff8c250556a, port_security_enabled=True, project_id=ad46cd44bc684d308b115c07348c3812, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['90871f8c-e70b-44bc-8329-ed8f09a25812'], standard_attr_id=1809, status=DOWN, tags=[], tenant_id=ad46cd44bc684d308b115c07348c3812, updated_at=2025-12-06T10:23:48Z on network f47279f6-9d96-4d9c-849b-5ff8c250556a#033[00m Dec 6 05:23:49 localhost dnsmasq[258741]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 2 addresses Dec 6 05:23:49 localhost dnsmasq-dhcp[258741]: read 
/var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:49 localhost dnsmasq-dhcp[258741]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:49 localhost podman[258801]: 2025-12-06 10:23:49.843880332 +0000 UTC m=+0.060804684 container kill 0db7ec05d229181a381ed10b15f3f0b13513b83b6b6a81990c52ac20f5968073 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:23:49 localhost nova_compute[237281]: 2025-12-06 10:23:49.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:23:50 localhost nova_compute[237281]: 2025-12-06 10:23:50.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:23:51 localhost sshd[258822]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:23:51 localhost nova_compute[237281]: 2025-12-06 10:23:51.253 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:23:51 localhost podman[258823]: 2025-12-06 10:23:51.558590404 +0000 UTC m=+0.083460972 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public) Dec 6 05:23:51 localhost podman[258823]: 2025-12-06 10:23:51.575577247 +0000 UTC m=+0.100447845 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible) Dec 6 05:23:51 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:23:51 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:51.868 219384 INFO neutron.agent.dhcp.agent [None req-0a0881fb-e942-469b-9e58-63106f643126 - - - - - -] DHCP configuration for ports {'4efe3ea8-5944-4882-9f81-9fb702b2d64c'} is completed#033[00m Dec 6 05:23:52 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:52.723 2 INFO neutron.agent.securitygroups_rpc [None req-002353b5-1968-4ef0-b3a2-41a8a37e06f6 a9f1c1c90a244918a5157305c0db7b7c ad46cd44bc684d308b115c07348c3812 - - default default] Security group member updated ['90871f8c-e70b-44bc-8329-ed8f09a25812']#033[00m Dec 6 05:23:52 localhost nova_compute[237281]: 2025-12-06 10:23:52.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:23:52 localhost nova_compute[237281]: 2025-12-06 10:23:52.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:23:53 localhost dnsmasq[258741]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:23:53 localhost dnsmasq-dhcp[258741]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:53 localhost dnsmasq-dhcp[258741]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:53 localhost podman[258858]: 2025-12-06 10:23:53.004224358 +0000 UTC m=+0.048440032 container kill 0db7ec05d229181a381ed10b15f3f0b13513b83b6b6a81990c52ac20f5968073 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:23:53 localhost podman[197801]: time="2025-12-06T10:23:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:23:53 localhost podman[197801]: @ - - [06/Dec/2025:10:23:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147886 "" "Go-http-client/1.1" Dec 6 05:23:53 localhost podman[197801]: @ - - [06/Dec/2025:10:23:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16911 "" "Go-http-client/1.1" Dec 6 05:23:53 localhost nova_compute[237281]: 2025-12-06 10:23:53.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:23:53 localhost nova_compute[237281]: 2025-12-06 10:23:53.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:23:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44826 DF PROTO=TCP SPT=50190 DPT=9102 SEQ=1098211388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDEC29A0000000001030307) Dec 6 05:23:54 localhost nova_compute[237281]: 2025-12-06 10:23:54.221 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:54 localhost dnsmasq[258741]: exiting on receipt of SIGTERM Dec 6 05:23:54 localhost podman[258897]: 2025-12-06 10:23:54.904228316 +0000 UTC m=+0.054522349 container kill 0db7ec05d229181a381ed10b15f3f0b13513b83b6b6a81990c52ac20f5968073 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Dec 6 05:23:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:23:54 localhost systemd[1]: libpod-0db7ec05d229181a381ed10b15f3f0b13513b83b6b6a81990c52ac20f5968073.scope: Deactivated successfully. 
Dec 6 05:23:54 localhost podman[258912]: 2025-12-06 10:23:54.979074242 +0000 UTC m=+0.054589352 container died 0db7ec05d229181a381ed10b15f3f0b13513b83b6b6a81990c52ac20f5968073 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Dec 6 05:23:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44827 DF PROTO=TCP SPT=50190 DPT=9102 SEQ=1098211388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDEC6870000000001030307) Dec 6 05:23:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0db7ec05d229181a381ed10b15f3f0b13513b83b6b6a81990c52ac20f5968073-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:23:55 localhost podman[258925]: 2025-12-06 10:23:55.060049226 +0000 UTC m=+0.127216139 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:23:55 localhost podman[258912]: 2025-12-06 10:23:55.079192676 +0000 UTC m=+0.154707736 container remove 0db7ec05d229181a381ed10b15f3f0b13513b83b6b6a81990c52ac20f5968073 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:23:55 localhost systemd[1]: libpod-conmon-0db7ec05d229181a381ed10b15f3f0b13513b83b6b6a81990c52ac20f5968073.scope: Deactivated successfully. Dec 6 05:23:55 localhost podman[258925]: 2025-12-06 10:23:55.142034501 +0000 UTC m=+0.209201464 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 
'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:23:55 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:23:55 localhost dnsmasq[258782]: exiting on receipt of SIGTERM Dec 6 05:23:55 localhost systemd[1]: libpod-a0655d993a706c3ddd137a230f7132470b5c466b4b10e77c8de4883266a8b0a5.scope: Deactivated successfully. Dec 6 05:23:55 localhost podman[258963]: 2025-12-06 10:23:55.181032302 +0000 UTC m=+0.150011301 container kill a0655d993a706c3ddd137a230f7132470b5c466b4b10e77c8de4883266a8b0a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1784874d-a6fb-4581-8d90-fd90e82d3846, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:23:55 localhost podman[258997]: 2025-12-06 10:23:55.247917382 +0000 UTC m=+0.055740918 container died a0655d993a706c3ddd137a230f7132470b5c466b4b10e77c8de4883266a8b0a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1784874d-a6fb-4581-8d90-fd90e82d3846, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125) Dec 6 05:23:55 localhost podman[258997]: 2025-12-06 10:23:55.322028654 +0000 UTC m=+0.129852180 container cleanup a0655d993a706c3ddd137a230f7132470b5c466b4b10e77c8de4883266a8b0a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1784874d-a6fb-4581-8d90-fd90e82d3846, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 05:23:55 localhost systemd[1]: libpod-conmon-a0655d993a706c3ddd137a230f7132470b5c466b4b10e77c8de4883266a8b0a5.scope: Deactivated successfully. Dec 6 05:23:55 localhost podman[259005]: 2025-12-06 10:23:55.346869599 +0000 UTC m=+0.136147213 container remove a0655d993a706c3ddd137a230f7132470b5c466b4b10e77c8de4883266a8b0a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1784874d-a6fb-4581-8d90-fd90e82d3846, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:23:55 localhost kernel: device tapa9af25e7-bc left promiscuous mode Dec 6 05:23:55 localhost ovn_controller[131684]: 2025-12-06T10:23:55Z|00340|binding|INFO|Releasing lport a9af25e7-bca3-4a81-96c6-a856afcfe29d from this chassis (sb_readonly=0) Dec 6 05:23:55 localhost ovn_controller[131684]: 2025-12-06T10:23:55Z|00341|binding|INFO|Setting lport a9af25e7-bca3-4a81-96c6-a856afcfe29d down in Southbound Dec 6 05:23:55 localhost nova_compute[237281]: 2025-12-06 10:23:55.421 
237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:55.433 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-1784874d-a6fb-4581-8d90-fd90e82d3846', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1784874d-a6fb-4581-8d90-fd90e82d3846', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b415850c5b2b4306b771adda1881b474', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c320c59-9905-4a73-a0b6-a7e1cb374570, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a9af25e7-bca3-4a81-96c6-a856afcfe29d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:55.436 137259 INFO neutron.agent.ovn.metadata.agent [-] Port a9af25e7-bca3-4a81-96c6-a856afcfe29d in datapath 1784874d-a6fb-4581-8d90-fd90e82d3846 unbound from our chassis#033[00m Dec 6 05:23:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:55.439 137259 DEBUG neutron.agent.ovn.metadata.agent [-] 
No valid VIF ports were found for network 1784874d-a6fb-4581-8d90-fd90e82d3846, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:55 localhost nova_compute[237281]: 2025-12-06 10:23:55.440 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:55.440 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[bd924e4c-5a17-4de2-8542-c6363fb25325]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9828 DF PROTO=TCP SPT=36902 DPT=9102 SEQ=3511509215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDEC9870000000001030307) Dec 6 05:23:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:55.771 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:b8:5f 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-0129ddbe-a70a-46c5-8e58-a8bc1b104b46', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0129ddbe-a70a-46c5-8e58-a8bc1b104b46', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f8ef38a4bec46d18248142804d6d2a3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e49ec58-ad31-4618-9f9d-d6ae6b615b1f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d196240a-f2f8-4a6d-bc21-e5c717b6807a) old=Port_Binding(mac=['fa:16:3e:4c:b8:5f 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-0129ddbe-a70a-46c5-8e58-a8bc1b104b46', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0129ddbe-a70a-46c5-8e58-a8bc1b104b46', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f8ef38a4bec46d18248142804d6d2a3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:55.774 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d196240a-f2f8-4a6d-bc21-e5c717b6807a in datapath 0129ddbe-a70a-46c5-8e58-a8bc1b104b46 updated#033[00m Dec 6 05:23:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:55.776 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0129ddbe-a70a-46c5-8e58-a8bc1b104b46, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:55.777 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[49faedef-e1cf-4c37-adb6-d6faa1b2dc36]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:55 localhost systemd[1]: 
var-lib-containers-storage-overlay-db9d0d73d56c52a1c8b6c6c87550b489bb7ebf51de8a1078ddbcd6d5ddaaea16-merged.mount: Deactivated successfully. Dec 6 05:23:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a0655d993a706c3ddd137a230f7132470b5c466b4b10e77c8de4883266a8b0a5-userdata-shm.mount: Deactivated successfully. Dec 6 05:23:55 localhost systemd[1]: var-lib-containers-storage-overlay-1fadfabeb59e5e6129cfa5a7af30b29e108406a54bf96096fcc8651b2eaa72a3-merged.mount: Deactivated successfully. Dec 6 05:23:55 localhost podman[259068]: Dec 6 05:23:55 localhost podman[259068]: 2025-12-06 10:23:55.915713909 +0000 UTC m=+0.088829916 container create ee3aecbf7b0a1e4fa55ece0ee45eab1a6d4ad4b1cd18f5866a4e66753b8b6709 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:23:55 localhost systemd[1]: Started libpod-conmon-ee3aecbf7b0a1e4fa55ece0ee45eab1a6d4ad4b1cd18f5866a4e66753b8b6709.scope. Dec 6 05:23:55 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:55.971 219384 INFO neutron.agent.dhcp.agent [None req-6c9ee211-c012-486a-a112-bab58890bcc4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:23:55 localhost systemd[1]: Started libcrun container. 
Dec 6 05:23:55 localhost podman[259068]: 2025-12-06 10:23:55.873292413 +0000 UTC m=+0.046408430 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:23:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77a15da1b4304df3087f0a026b236e35ea662f7b8a4f84e6047ce2647359ae78/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:23:55 localhost podman[259068]: 2025-12-06 10:23:55.988318935 +0000 UTC m=+0.161434942 container init ee3aecbf7b0a1e4fa55ece0ee45eab1a6d4ad4b1cd18f5866a4e66753b8b6709 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 6 05:23:55 localhost podman[259068]: 2025-12-06 10:23:55.996944731 +0000 UTC m=+0.170060728 container start ee3aecbf7b0a1e4fa55ece0ee45eab1a6d4ad4b1cd18f5866a4e66753b8b6709 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:23:56 localhost dnsmasq[259086]: started, version 2.85 cachesize 150 Dec 6 05:23:56 localhost dnsmasq[259086]: DNS service limited to local subnets Dec 6 05:23:56 localhost dnsmasq[259086]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:23:56 localhost dnsmasq[259086]: warning: no upstream servers configured Dec 6 05:23:56 localhost dnsmasq-dhcp[259086]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Dec 6 05:23:56 localhost dnsmasq[259086]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/addn_hosts - 0 addresses Dec 6 05:23:56 localhost dnsmasq-dhcp[259086]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/host Dec 6 05:23:56 localhost dnsmasq-dhcp[259086]: read /var/lib/neutron/dhcp/f47279f6-9d96-4d9c-849b-5ff8c250556a/opts Dec 6 05:23:56 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:56.243 219384 INFO neutron.agent.dhcp.agent [None req-918ea8c2-fccc-4d57-a4be-47b8bd784e34 - - - - - -] DHCP configuration for ports {'3ba22961-ba53-4fab-b867-7a59008889f5', '4a100b7a-c73f-4755-9e86-cb876d010696'} is completed#033[00m Dec 6 05:23:56 localhost nova_compute[237281]: 2025-12-06 10:23:56.256 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:56 localhost dnsmasq[259086]: exiting on receipt of SIGTERM Dec 6 05:23:56 localhost podman[259104]: 2025-12-06 10:23:56.308439556 +0000 UTC m=+0.058645608 container kill ee3aecbf7b0a1e4fa55ece0ee45eab1a6d4ad4b1cd18f5866a4e66753b8b6709 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:23:56 localhost systemd[1]: 
libpod-ee3aecbf7b0a1e4fa55ece0ee45eab1a6d4ad4b1cd18f5866a4e66753b8b6709.scope: Deactivated successfully. Dec 6 05:23:56 localhost podman[259116]: 2025-12-06 10:23:56.352297726 +0000 UTC m=+0.036896877 container died ee3aecbf7b0a1e4fa55ece0ee45eab1a6d4ad4b1cd18f5866a4e66753b8b6709 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:23:56 localhost podman[259116]: 2025-12-06 10:23:56.432783575 +0000 UTC m=+0.117382686 container cleanup ee3aecbf7b0a1e4fa55ece0ee45eab1a6d4ad4b1cd18f5866a4e66753b8b6709 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125) Dec 6 05:23:56 localhost systemd[1]: libpod-conmon-ee3aecbf7b0a1e4fa55ece0ee45eab1a6d4ad4b1cd18f5866a4e66753b8b6709.scope: Deactivated successfully. 
Dec 6 05:23:56 localhost podman[259123]: 2025-12-06 10:23:56.458562129 +0000 UTC m=+0.129243642 container remove ee3aecbf7b0a1e4fa55ece0ee45eab1a6d4ad4b1cd18f5866a4e66753b8b6709 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f47279f6-9d96-4d9c-849b-5ff8c250556a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:23:56 localhost kernel: device tap4a100b7a-c7 left promiscuous mode Dec 6 05:23:56 localhost ovn_controller[131684]: 2025-12-06T10:23:56Z|00342|binding|INFO|Releasing lport 4a100b7a-c73f-4755-9e86-cb876d010696 from this chassis (sb_readonly=0) Dec 6 05:23:56 localhost ovn_controller[131684]: 2025-12-06T10:23:56Z|00343|binding|INFO|Setting lport 4a100b7a-c73f-4755-9e86-cb876d010696 down in Southbound Dec 6 05:23:56 localhost nova_compute[237281]: 2025-12-06 10:23:56.498 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:56.510 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe11:631c/64 2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f47279f6-9d96-4d9c-849b-5ff8c250556a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad46cd44bc684d308b115c07348c3812', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7db54be-b5f5-4e03-a1fb-e0c5d4551fdd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4a100b7a-c73f-4755-9e86-cb876d010696) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:56.512 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 4a100b7a-c73f-4755-9e86-cb876d010696 in datapath f47279f6-9d96-4d9c-849b-5ff8c250556a unbound from our chassis#033[00m Dec 6 05:23:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:56.515 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f47279f6-9d96-4d9c-849b-5ff8c250556a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:23:56 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:56.516 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[d70d0b8f-3f1a-4c8c-8796-9dab172ea858]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:56 localhost nova_compute[237281]: 2025-12-06 10:23:56.519 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:56 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:56.865 219384 INFO neutron.agent.dhcp.agent [None 
req-9ad7eff5-f48c-48d5-a3a9-d6b734ffb4aa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:23:56 localhost systemd[1]: var-lib-containers-storage-overlay-77a15da1b4304df3087f0a026b236e35ea662f7b8a4f84e6047ce2647359ae78-merged.mount: Deactivated successfully. Dec 6 05:23:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee3aecbf7b0a1e4fa55ece0ee45eab1a6d4ad4b1cd18f5866a4e66753b8b6709-userdata-shm.mount: Deactivated successfully. Dec 6 05:23:56 localhost systemd[1]: run-netns-qdhcp\x2d1784874d\x2da6fb\x2d4581\x2d8d90\x2dfd90e82d3846.mount: Deactivated successfully. Dec 6 05:23:56 localhost systemd[1]: run-netns-qdhcp\x2df47279f6\x2d9d96\x2d4d9c\x2d849b\x2d5ff8c250556a.mount: Deactivated successfully. Dec 6 05:23:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44828 DF PROTO=TCP SPT=50190 DPT=9102 SEQ=1098211388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDECE870000000001030307) Dec 6 05:23:57 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:57.079 2 INFO neutron.agent.securitygroups_rpc [None req-300adbf5-7c0f-44dd-89bf-41d823bb09cb 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:23:57 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:57.527 2 INFO neutron.agent.securitygroups_rpc [None req-2422b818-feeb-428b-8827-8071ad93c045 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['057036cd-6c4c-4a5a-859d-53c881e393cb']#033[00m Dec 6 05:23:57 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:57.619 2 INFO neutron.agent.securitygroups_rpc [None req-db614ca7-96bb-485d-8b26-f6d7930d7010 f6b40925fbba400f81c8b89fb799fe96 
194f7052cbfc400785c7e06ff96c77a0 - - default default] Security group member updated ['7f6362d9-8c39-4da3-b055-3ab3f8924849']#033[00m Dec 6 05:23:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6149 DF PROTO=TCP SPT=50680 DPT=9102 SEQ=3014205041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDED1870000000001030307) Dec 6 05:23:57 localhost nova_compute[237281]: 2025-12-06 10:23:57.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:23:57 localhost nova_compute[237281]: 2025-12-06 10:23:57.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:23:57 localhost nova_compute[237281]: 2025-12-06 10:23:57.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:23:57 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:57.904 2 INFO neutron.agent.securitygroups_rpc [None req-db614ca7-96bb-485d-8b26-f6d7930d7010 f6b40925fbba400f81c8b89fb799fe96 194f7052cbfc400785c7e06ff96c77a0 - - default default] Security group member updated ['7f6362d9-8c39-4da3-b055-3ab3f8924849']#033[00m Dec 6 05:23:58 localhost nova_compute[237281]: 2025-12-06 10:23:58.309 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:23:58 localhost nova_compute[237281]: 2025-12-06 10:23:58.309 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:23:58 localhost nova_compute[237281]: 2025-12-06 10:23:58.310 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:23:58 localhost nova_compute[237281]: 2025-12-06 10:23:58.310 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:23:58 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:58.583 219384 INFO neutron.agent.linux.ip_lib [None req-c062de59-d71e-4a8d-8ee2-e79645def4ea - - - - - -] Device tap86e7098a-89 cannot be used as it has no MAC address#033[00m Dec 6 05:23:58 localhost nova_compute[237281]: 2025-12-06 10:23:58.606 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:58 localhost kernel: device tap86e7098a-89 entered promiscuous mode Dec 6 05:23:58 localhost NetworkManager[5965]: [1765016638.6151] manager: (tap86e7098a-89): new Generic device (/org/freedesktop/NetworkManager/Devices/59) Dec 6 05:23:58 localhost nova_compute[237281]: 2025-12-06 10:23:58.615 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 
05:23:58 localhost ovn_controller[131684]: 2025-12-06T10:23:58Z|00344|binding|INFO|Claiming lport 86e7098a-8913-455b-92b1-8787e15b4614 for this chassis. Dec 6 05:23:58 localhost ovn_controller[131684]: 2025-12-06T10:23:58Z|00345|binding|INFO|86e7098a-8913-455b-92b1-8787e15b4614: Claiming unknown Dec 6 05:23:58 localhost systemd-udevd[259156]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:23:58 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:58.637 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-74c85386-9889-4f05-8ee1-7bc7c702326e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74c85386-9889-4f05-8ee1-7bc7c702326e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f53d29120b44434860c4dafb30d2afc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=838dad69-b153-4634-a903-eaada7d7209e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=86e7098a-8913-455b-92b1-8787e15b4614) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:23:58 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:58.639 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 
86e7098a-8913-455b-92b1-8787e15b4614 in datapath 74c85386-9889-4f05-8ee1-7bc7c702326e bound to our chassis#033[00m Dec 6 05:23:58 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:58.641 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 74c85386-9889-4f05-8ee1-7bc7c702326e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:23:58 localhost ovn_metadata_agent[137254]: 2025-12-06 10:23:58.642 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[10dfa216-5faf-44c2-9c32-4058f1ab5b74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:23:58 localhost journal[186952]: ethtool ioctl error on tap86e7098a-89: No such device Dec 6 05:23:58 localhost journal[186952]: ethtool ioctl error on tap86e7098a-89: No such device Dec 6 05:23:58 localhost journal[186952]: ethtool ioctl error on tap86e7098a-89: No such device Dec 6 05:23:58 localhost ovn_controller[131684]: 2025-12-06T10:23:58Z|00346|binding|INFO|Setting lport 86e7098a-8913-455b-92b1-8787e15b4614 ovn-installed in OVS Dec 6 05:23:58 localhost ovn_controller[131684]: 2025-12-06T10:23:58Z|00347|binding|INFO|Setting lport 86e7098a-8913-455b-92b1-8787e15b4614 up in Southbound Dec 6 05:23:58 localhost nova_compute[237281]: 2025-12-06 10:23:58.666 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:58 localhost journal[186952]: ethtool ioctl error on tap86e7098a-89: No such device Dec 6 05:23:58 localhost journal[186952]: ethtool ioctl error on tap86e7098a-89: No such device Dec 6 05:23:58 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:58.676 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:23:58 localhost journal[186952]: 
ethtool ioctl error on tap86e7098a-89: No such device Dec 6 05:23:58 localhost journal[186952]: ethtool ioctl error on tap86e7098a-89: No such device Dec 6 05:23:58 localhost journal[186952]: ethtool ioctl error on tap86e7098a-89: No such device Dec 6 05:23:58 localhost nova_compute[237281]: 2025-12-06 10:23:58.699 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:58 localhost nova_compute[237281]: 2025-12-06 10:23:58.731 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:59 localhost nova_compute[237281]: 2025-12-06 10:23:59.223 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:59 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:59.378 2 INFO neutron.agent.securitygroups_rpc [None req-83833c07-5f37-4177-8c11-549de0ec8740 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:23:59 localhost ovn_controller[131684]: 2025-12-06T10:23:59Z|00348|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:23:59 localhost nova_compute[237281]: 2025-12-06 10:23:59.512 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:23:59 localhost podman[259227]: Dec 6 05:23:59 localhost podman[259227]: 2025-12-06 10:23:59.569651057 +0000 UTC m=+0.129202700 container create f4e5c3a29b410638213904a5bd0ec020ec895a21c4a759f3bf687fb85cb2a28f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74c85386-9889-4f05-8ee1-7bc7c702326e, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:23:59 localhost systemd[1]: Started libpod-conmon-f4e5c3a29b410638213904a5bd0ec020ec895a21c4a759f3bf687fb85cb2a28f.scope. Dec 6 05:23:59 localhost podman[259227]: 2025-12-06 10:23:59.525211978 +0000 UTC m=+0.084763621 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:23:59 localhost systemd[1]: tmp-crun.eSYvyE.mount: Deactivated successfully. Dec 6 05:23:59 localhost systemd[1]: Started libcrun container. Dec 6 05:23:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39304fa0c7f6a0227c4e5f6ec3b2f8d33274bf967834989817b323cb7c1ef7d8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:23:59 localhost podman[259227]: 2025-12-06 10:23:59.662469915 +0000 UTC m=+0.222021558 container init f4e5c3a29b410638213904a5bd0ec020ec895a21c4a759f3bf687fb85cb2a28f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74c85386-9889-4f05-8ee1-7bc7c702326e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 6 05:23:59 localhost podman[259227]: 2025-12-06 10:23:59.671193034 +0000 UTC m=+0.230744667 container start f4e5c3a29b410638213904a5bd0ec020ec895a21c4a759f3bf687fb85cb2a28f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-74c85386-9889-4f05-8ee1-7bc7c702326e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:23:59 localhost dnsmasq[259243]: started, version 2.85 cachesize 150 Dec 6 05:23:59 localhost dnsmasq[259243]: DNS service limited to local subnets Dec 6 05:23:59 localhost dnsmasq[259243]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:23:59 localhost dnsmasq[259243]: warning: no upstream servers configured Dec 6 05:23:59 localhost dnsmasq-dhcp[259243]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d Dec 6 05:23:59 localhost dnsmasq[259243]: read /var/lib/neutron/dhcp/74c85386-9889-4f05-8ee1-7bc7c702326e/addn_hosts - 0 addresses Dec 6 05:23:59 localhost dnsmasq-dhcp[259243]: read /var/lib/neutron/dhcp/74c85386-9889-4f05-8ee1-7bc7c702326e/host Dec 6 05:23:59 localhost dnsmasq-dhcp[259243]: read /var/lib/neutron/dhcp/74c85386-9889-4f05-8ee1-7bc7c702326e/opts Dec 6 05:23:59 localhost neutron_sriov_agent[212548]: 2025-12-06 10:23:59.749 2 INFO neutron.agent.securitygroups_rpc [None req-b774b76a-2dd9-40d7-811f-0f622a4f49b2 f6b40925fbba400f81c8b89fb799fe96 194f7052cbfc400785c7e06ff96c77a0 - - default default] Security group member updated ['7f6362d9-8c39-4da3-b055-3ab3f8924849']#033[00m Dec 6 05:23:59 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:23:59.839 219384 INFO neutron.agent.dhcp.agent [None req-95fb31ff-df5b-437e-8a29-c90e8f373e83 - - - - - -] DHCP configuration for ports {'f64f9505-274b-47f5-ab6a-164c57a62469'} is completed#033[00m Dec 6 05:24:00 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:00.145 2 INFO 
neutron.agent.securitygroups_rpc [None req-f39f0b92-aa25-48ff-b63c-a15b43d82f27 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['057036cd-6c4c-4a5a-859d-53c881e393cb', 'e3c23737-d81c-4d73-854d-cc270c6033db']#033[00m Dec 6 05:24:00 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:00.232 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.481 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 
10:24:00.507 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.508 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.508 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.509 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.536 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.537 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.537 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.538 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.627 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.700 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.702 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd 
(subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.753 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.755 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:24:00 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:00.822 2 INFO neutron.agent.securitygroups_rpc [None req-a255cd6c-f45c-4a62-95b7-c3692a1371ce f6b40925fbba400f81c8b89fb799fe96 194f7052cbfc400785c7e06ff96c77a0 - - default default] Security group member updated ['7f6362d9-8c39-4da3-b055-3ab3f8924849']#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.828 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" 
returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.830 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:24:00 localhost nova_compute[237281]: 2025-12-06 10:24:00.885 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:24:01 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:01.070 2 INFO neutron.agent.securitygroups_rpc [None req-1c62d591-98f5-4be6-9e8a-f3e41186d845 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:24:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44829 DF PROTO=TCP SPT=50190 DPT=9102 SEQ=1098211388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDEDE470000000001030307) Dec 6 05:24:01 localhost nova_compute[237281]: 2025-12-06 10:24:01.095 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:24:01 localhost nova_compute[237281]: 2025-12-06 10:24:01.096 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12309MB free_disk=387.2663154602051GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:24:01 localhost nova_compute[237281]: 2025-12-06 10:24:01.096 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:24:01 localhost nova_compute[237281]: 2025-12-06 10:24:01.097 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:24:01 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:01.146 2 INFO neutron.agent.securitygroups_rpc [None req-d7b4254e-787d-4891-9313-cbe2b9f166de 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['e3c23737-d81c-4d73-854d-cc270c6033db']#033[00m Dec 6 05:24:01 localhost nova_compute[237281]: 2025-12-06 10:24:01.188 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:24:01 localhost nova_compute[237281]: 2025-12-06 10:24:01.188 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:24:01 localhost nova_compute[237281]: 2025-12-06 10:24:01.189 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:24:01 localhost nova_compute[237281]: 2025-12-06 10:24:01.254 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:24:01 localhost nova_compute[237281]: 2025-12-06 10:24:01.273 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:24:01 localhost 
nova_compute[237281]: 2025-12-06 10:24:01.275 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:24:01 localhost nova_compute[237281]: 2025-12-06 10:24:01.276 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:24:01 localhost nova_compute[237281]: 2025-12-06 10:24:01.282 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:01 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:01.869 2 INFO neutron.agent.securitygroups_rpc [None req-cc5c7d0e-838d-479e-a945-165febbf439d 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:24:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:24:03 localhost podman[259256]: 2025-12-06 10:24:03.563558144 +0000 UTC m=+0.089524398 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true) Dec 6 05:24:03 localhost podman[259256]: 2025-12-06 10:24:03.638623356 +0000 UTC m=+0.164589610 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 05:24:03 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:24:04 localhost nova_compute[237281]: 2025-12-06 10:24:04.225 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:06 localhost nova_compute[237281]: 2025-12-06 10:24:06.285 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:06.708 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:24:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:06.710 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:24:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:06.712 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:24:08 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:08.361 2 INFO neutron.agent.securitygroups_rpc [None req-0f6b93cc-739a-4397-af09-0708bff78765 465fd34524684efe9a64c7cf654d42cb 95b9dfea05f443a089d4ffce2f7daab3 - - default default] Security group rule updated ['c634da46-45b3-49be-9a48-bc7d77faf782']#033[00m Dec 6 05:24:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. 
Dec 6 05:24:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:24:08 localhost systemd[1]: tmp-crun.co4jMg.mount: Deactivated successfully. Dec 6 05:24:08 localhost podman[259282]: 2025-12-06 10:24:08.549583951 +0000 UTC m=+0.079384796 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:24:08 localhost podman[259282]: 2025-12-06 10:24:08.585667592 +0000 UTC m=+0.115468447 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:24:08 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:24:08 localhost podman[259283]: 2025-12-06 10:24:08.603018827 +0000 UTC m=+0.129743147 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd) Dec 6 05:24:08 localhost podman[259283]: 2025-12-06 10:24:08.618288437 +0000 UTC m=+0.145012757 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125) Dec 6 05:24:08 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:24:09 localhost nova_compute[237281]: 2025-12-06 10:24:09.226 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:09 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:09.235 2 INFO neutron.agent.securitygroups_rpc [None req-fb36c9f9-aada-483e-b224-953fce979609 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['341b0813-2c64-49b5-8bec-9f1bb31bad43']#033[00m Dec 6 05:24:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44830 DF PROTO=TCP SPT=50190 DPT=9102 SEQ=1098211388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDEFF870000000001030307) Dec 6 05:24:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:10.906 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:24:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:10.974 219384 INFO neutron.agent.linux.ip_lib [None req-ab5c0f9a-806b-46be-bb93-f04e7617cb15 - - - - - -] Device tap7a5374c4-1d cannot be used as it has no MAC address#033[00m Dec 6 05:24:11 localhost nova_compute[237281]: 2025-12-06 10:24:11.050 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:11 localhost kernel: device tap7a5374c4-1d entered promiscuous mode Dec 6 05:24:11 localhost nova_compute[237281]: 2025-12-06 10:24:11.058 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:11 localhost ovn_controller[131684]: 2025-12-06T10:24:11Z|00349|binding|INFO|Claiming lport 7a5374c4-1db9-4b4e-a4bf-6657f3d97cbb for this chassis. 
Dec 6 05:24:11 localhost ovn_controller[131684]: 2025-12-06T10:24:11Z|00350|binding|INFO|7a5374c4-1db9-4b4e-a4bf-6657f3d97cbb: Claiming unknown Dec 6 05:24:11 localhost NetworkManager[5965]: [1765016651.0604] manager: (tap7a5374c4-1d): new Generic device (/org/freedesktop/NetworkManager/Devices/60) Dec 6 05:24:11 localhost systemd-udevd[259334]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:24:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:11.079 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-8ab7de36-dad6-4585-818f-9c4eaf4709d0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8ab7de36-dad6-4585-818f-9c4eaf4709d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '194f7052cbfc400785c7e06ff96c77a0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=443c7c56-e115-475e-9e51-b42b439a1bba, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7a5374c4-1db9-4b4e-a4bf-6657f3d97cbb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:24:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:11.082 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 
7a5374c4-1db9-4b4e-a4bf-6657f3d97cbb in datapath 8ab7de36-dad6-4585-818f-9c4eaf4709d0 bound to our chassis#033[00m Dec 6 05:24:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:11.084 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8ab7de36-dad6-4585-818f-9c4eaf4709d0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:24:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:11.085 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[2e87e98e-4492-4509-9bd2-c7bcd1195344]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:24:11 localhost journal[186952]: ethtool ioctl error on tap7a5374c4-1d: No such device Dec 6 05:24:11 localhost journal[186952]: ethtool ioctl error on tap7a5374c4-1d: No such device Dec 6 05:24:11 localhost journal[186952]: ethtool ioctl error on tap7a5374c4-1d: No such device Dec 6 05:24:11 localhost journal[186952]: ethtool ioctl error on tap7a5374c4-1d: No such device Dec 6 05:24:11 localhost ovn_controller[131684]: 2025-12-06T10:24:11Z|00351|binding|INFO|Setting lport 7a5374c4-1db9-4b4e-a4bf-6657f3d97cbb ovn-installed in OVS Dec 6 05:24:11 localhost ovn_controller[131684]: 2025-12-06T10:24:11Z|00352|binding|INFO|Setting lport 7a5374c4-1db9-4b4e-a4bf-6657f3d97cbb up in Southbound Dec 6 05:24:11 localhost nova_compute[237281]: 2025-12-06 10:24:11.113 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:11 localhost journal[186952]: ethtool ioctl error on tap7a5374c4-1d: No such device Dec 6 05:24:11 localhost journal[186952]: ethtool ioctl error on tap7a5374c4-1d: No such device Dec 6 05:24:11 localhost journal[186952]: ethtool ioctl error on tap7a5374c4-1d: No such device Dec 6 05:24:11 localhost 
journal[186952]: ethtool ioctl error on tap7a5374c4-1d: No such device Dec 6 05:24:11 localhost nova_compute[237281]: 2025-12-06 10:24:11.148 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:11 localhost nova_compute[237281]: 2025-12-06 10:24:11.184 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:11 localhost nova_compute[237281]: 2025-12-06 10:24:11.270 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:24:11 localhost nova_compute[237281]: 2025-12-06 10:24:11.289 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:11 localhost nova_compute[237281]: 2025-12-06 10:24:11.665 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:11 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:11.669 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:24:11 localhost ovn_metadata_agent[137254]: 
2025-12-06 10:24:11.671 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:24:12 localhost podman[259405]: Dec 6 05:24:12 localhost podman[259405]: 2025-12-06 10:24:12.159447712 +0000 UTC m=+0.100065213 container create d44ac8cf189b1172354fab8df704d90def7dff7bd62a20df288b33488a24f1fe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ab7de36-dad6-4585-818f-9c4eaf4709d0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:24:12 localhost podman[259405]: 2025-12-06 10:24:12.108157523 +0000 UTC m=+0.048775074 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:24:12 localhost systemd[1]: Started libpod-conmon-d44ac8cf189b1172354fab8df704d90def7dff7bd62a20df288b33488a24f1fe.scope. Dec 6 05:24:12 localhost systemd[1]: Started libcrun container. 
Dec 6 05:24:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b764d84a17e63cb527ee485ce650e18010b7f5243457b2905eb41baa01435c8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:24:12 localhost podman[259405]: 2025-12-06 10:24:12.260128413 +0000 UTC m=+0.200745914 container init d44ac8cf189b1172354fab8df704d90def7dff7bd62a20df288b33488a24f1fe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ab7de36-dad6-4585-818f-9c4eaf4709d0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 6 05:24:12 localhost podman[259405]: 2025-12-06 10:24:12.269759409 +0000 UTC m=+0.210376940 container start d44ac8cf189b1172354fab8df704d90def7dff7bd62a20df288b33488a24f1fe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ab7de36-dad6-4585-818f-9c4eaf4709d0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:24:12 localhost dnsmasq[259423]: started, version 2.85 cachesize 150 Dec 6 05:24:12 localhost dnsmasq[259423]: DNS service limited to local subnets Dec 6 05:24:12 localhost dnsmasq[259423]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:24:12 localhost dnsmasq[259423]: warning: no upstream servers configured Dec 
6 05:24:12 localhost dnsmasq-dhcp[259423]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:24:12 localhost dnsmasq[259423]: read /var/lib/neutron/dhcp/8ab7de36-dad6-4585-818f-9c4eaf4709d0/addn_hosts - 0 addresses Dec 6 05:24:12 localhost dnsmasq-dhcp[259423]: read /var/lib/neutron/dhcp/8ab7de36-dad6-4585-818f-9c4eaf4709d0/host Dec 6 05:24:12 localhost dnsmasq-dhcp[259423]: read /var/lib/neutron/dhcp/8ab7de36-dad6-4585-818f-9c4eaf4709d0/opts Dec 6 05:24:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:12.673 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:24:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:24:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:24:13 localhost podman[259424]: 2025-12-06 10:24:13.067278302 +0000 UTC m=+0.093106779 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 05:24:13 localhost podman[259424]: 2025-12-06 10:24:13.072477562 +0000 UTC m=+0.098305969 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 05:24:13 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:24:13 localhost podman[259425]: 2025-12-06 10:24:13.178958861 +0000 UTC m=+0.203062835 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Dec 6 05:24:13 localhost podman[259425]: 2025-12-06 10:24:13.196546663 +0000 UTC m=+0.220650657 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=edpm) Dec 6 05:24:13 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:24:13 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:13.229 219384 INFO neutron.agent.dhcp.agent [None req-18716199-2032-4575-b7a5-8aa27d194c89 - - - - - -] DHCP configuration for ports {'0d8d4be1-fb7a-4d5e-a3da-f1294e730932'} is completed#033[00m Dec 6 05:24:13 localhost dnsmasq[259423]: exiting on receipt of SIGTERM Dec 6 05:24:13 localhost podman[259479]: 2025-12-06 10:24:13.433975446 +0000 UTC m=+0.056096239 container kill d44ac8cf189b1172354fab8df704d90def7dff7bd62a20df288b33488a24f1fe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ab7de36-dad6-4585-818f-9c4eaf4709d0, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:24:13 localhost systemd[1]: 
libpod-d44ac8cf189b1172354fab8df704d90def7dff7bd62a20df288b33488a24f1fe.scope: Deactivated successfully. Dec 6 05:24:13 localhost podman[259493]: 2025-12-06 10:24:13.496695507 +0000 UTC m=+0.043011145 container died d44ac8cf189b1172354fab8df704d90def7dff7bd62a20df288b33488a24f1fe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ab7de36-dad6-4585-818f-9c4eaf4709d0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:24:13 localhost podman[259493]: 2025-12-06 10:24:13.5396429 +0000 UTC m=+0.085958508 container remove d44ac8cf189b1172354fab8df704d90def7dff7bd62a20df288b33488a24f1fe (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8ab7de36-dad6-4585-818f-9c4eaf4709d0, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:24:13 localhost nova_compute[237281]: 2025-12-06 10:24:13.550 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:13 localhost kernel: device tap7a5374c4-1d left promiscuous mode Dec 6 05:24:13 localhost ovn_controller[131684]: 2025-12-06T10:24:13Z|00353|binding|INFO|Releasing lport 7a5374c4-1db9-4b4e-a4bf-6657f3d97cbb from this chassis (sb_readonly=0) Dec 6 05:24:13 localhost ovn_controller[131684]: 2025-12-06T10:24:13Z|00354|binding|INFO|Setting lport 
7a5374c4-1db9-4b4e-a4bf-6657f3d97cbb down in Southbound Dec 6 05:24:13 localhost systemd[1]: libpod-conmon-d44ac8cf189b1172354fab8df704d90def7dff7bd62a20df288b33488a24f1fe.scope: Deactivated successfully. Dec 6 05:24:13 localhost nova_compute[237281]: 2025-12-06 10:24:13.571 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:13.770 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-8ab7de36-dad6-4585-818f-9c4eaf4709d0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8ab7de36-dad6-4585-818f-9c4eaf4709d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '194f7052cbfc400785c7e06ff96c77a0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=443c7c56-e115-475e-9e51-b42b439a1bba, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7a5374c4-1db9-4b4e-a4bf-6657f3d97cbb) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:24:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:13.773 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 
7a5374c4-1db9-4b4e-a4bf-6657f3d97cbb in datapath 8ab7de36-dad6-4585-818f-9c4eaf4709d0 unbound from our chassis#033[00m Dec 6 05:24:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:13.775 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8ab7de36-dad6-4585-818f-9c4eaf4709d0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:24:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:13.776 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[22fb0369-7f99-4a9b-bb25-92b2b31194be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:24:14 localhost systemd[1]: var-lib-containers-storage-overlay-7b764d84a17e63cb527ee485ce650e18010b7f5243457b2905eb41baa01435c8-merged.mount: Deactivated successfully. Dec 6 05:24:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d44ac8cf189b1172354fab8df704d90def7dff7bd62a20df288b33488a24f1fe-userdata-shm.mount: Deactivated successfully. Dec 6 05:24:14 localhost nova_compute[237281]: 2025-12-06 10:24:14.228 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:14 localhost systemd[1]: run-netns-qdhcp\x2d8ab7de36\x2ddad6\x2d4585\x2d818f\x2d9c4eaf4709d0.mount: Deactivated successfully. 
Dec 6 05:24:14 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:14.508 219384 INFO neutron.agent.dhcp.agent [None req-745a78b4-e53e-4638-a488-bd1552352d1e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:24:16 localhost openstack_network_exporter[199751]: ERROR 10:24:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:24:16 localhost openstack_network_exporter[199751]: ERROR 10:24:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:24:16 localhost openstack_network_exporter[199751]: ERROR 10:24:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:24:16 localhost openstack_network_exporter[199751]: ERROR 10:24:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:24:16 localhost openstack_network_exporter[199751]: Dec 6 05:24:16 localhost openstack_network_exporter[199751]: ERROR 10:24:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:24:16 localhost openstack_network_exporter[199751]: Dec 6 05:24:16 localhost nova_compute[237281]: 2025-12-06 10:24:16.292 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:17 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:17.958 219384 INFO neutron.agent.linux.ip_lib [None req-bd1a7e1e-7796-4551-9b5e-5a57257dc7ce - - - - - -] Device tapeee76c74-b6 cannot be used as it has no MAC address#033[00m Dec 6 05:24:18 localhost nova_compute[237281]: 2025-12-06 10:24:18.036 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:18 localhost kernel: device tapeee76c74-b6 entered promiscuous mode Dec 6 05:24:18 localhost 
nova_compute[237281]: 2025-12-06 10:24:18.046 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:18 localhost NetworkManager[5965]: [1765016658.0474] manager: (tapeee76c74-b6): new Generic device (/org/freedesktop/NetworkManager/Devices/61) Dec 6 05:24:18 localhost ovn_controller[131684]: 2025-12-06T10:24:18Z|00355|binding|INFO|Claiming lport eee76c74-b6d4-4e9f-a0de-94e513d37252 for this chassis. Dec 6 05:24:18 localhost ovn_controller[131684]: 2025-12-06T10:24:18Z|00356|binding|INFO|eee76c74-b6d4-4e9f-a0de-94e513d37252: Claiming unknown Dec 6 05:24:18 localhost systemd-udevd[259530]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:24:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:18.060 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-e4cbe437-288a-4b4c-96af-5e8fd0701fca', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e4cbe437-288a-4b4c-96af-5e8fd0701fca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f53d29120b44434860c4dafb30d2afc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=84ef3307-1b37-4f06-b442-7886961b9f5a, chassis=[], tunnel_key=2, gateway_chassis=[], 
requested_chassis=[], logical_port=eee76c74-b6d4-4e9f-a0de-94e513d37252) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:24:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:18.062 137259 INFO neutron.agent.ovn.metadata.agent [-] Port eee76c74-b6d4-4e9f-a0de-94e513d37252 in datapath e4cbe437-288a-4b4c-96af-5e8fd0701fca bound to our chassis#033[00m Dec 6 05:24:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:18.063 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e4cbe437-288a-4b4c-96af-5e8fd0701fca or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:24:18 localhost nova_compute[237281]: 2025-12-06 10:24:18.062 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:18.064 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[9b1daf6e-61db-4f47-8321-b4444142eb23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:24:18 localhost journal[186952]: ethtool ioctl error on tapeee76c74-b6: No such device Dec 6 05:24:18 localhost journal[186952]: ethtool ioctl error on tapeee76c74-b6: No such device Dec 6 05:24:18 localhost ovn_controller[131684]: 2025-12-06T10:24:18Z|00357|binding|INFO|Setting lport eee76c74-b6d4-4e9f-a0de-94e513d37252 ovn-installed in OVS Dec 6 05:24:18 localhost ovn_controller[131684]: 2025-12-06T10:24:18Z|00358|binding|INFO|Setting lport eee76c74-b6d4-4e9f-a0de-94e513d37252 up in Southbound Dec 6 05:24:18 localhost nova_compute[237281]: 2025-12-06 10:24:18.089 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:18 localhost journal[186952]: ethtool ioctl error on tapeee76c74-b6: No such device Dec 6 05:24:18 localhost nova_compute[237281]: 2025-12-06 10:24:18.094 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:18 localhost journal[186952]: ethtool ioctl error on tapeee76c74-b6: No such device Dec 6 05:24:18 localhost journal[186952]: ethtool ioctl error on tapeee76c74-b6: No such device Dec 6 05:24:18 localhost journal[186952]: ethtool ioctl error on tapeee76c74-b6: No such device Dec 6 05:24:18 localhost journal[186952]: ethtool ioctl error on tapeee76c74-b6: No such device Dec 6 05:24:18 localhost journal[186952]: ethtool ioctl error on tapeee76c74-b6: No such device Dec 6 05:24:18 localhost nova_compute[237281]: 2025-12-06 10:24:18.135 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:18 localhost nova_compute[237281]: 2025-12-06 10:24:18.170 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:18 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:18.550 2 INFO neutron.agent.securitygroups_rpc [None req-486d62ed-5e62-4648-9c4e-1a6bdd0722c4 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['341b0813-2c64-49b5-8bec-9f1bb31bad43', '95a47fad-13e6-44c0-a7f3-dd9cfcdf34a8', 'f8fdc862-a150-4ebe-a45a-cf407608047d']#033[00m Dec 6 05:24:18 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:18.941 2 INFO neutron.agent.securitygroups_rpc [None req-a2d3e2eb-cfe1-44ea-bac4-642cff883c47 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated 
['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:24:19 localhost podman[259601]: Dec 6 05:24:19 localhost podman[259601]: 2025-12-06 10:24:19.155392728 +0000 UTC m=+0.100359352 container create 44f1529042ccfabe9bb5e1efb4aed30f81aa1758143902cc07b40494976678dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e4cbe437-288a-4b4c-96af-5e8fd0701fca, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125) Dec 6 05:24:19 localhost systemd[1]: Started libpod-conmon-44f1529042ccfabe9bb5e1efb4aed30f81aa1758143902cc07b40494976678dd.scope. Dec 6 05:24:19 localhost podman[259601]: 2025-12-06 10:24:19.108108922 +0000 UTC m=+0.053075596 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:24:19 localhost nova_compute[237281]: 2025-12-06 10:24:19.230 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:19 localhost systemd[1]: Started libcrun container. 
Dec 6 05:24:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6d5183432df8ebd12795f2559dae0862b1ba50f4e0f5c399b1843fda3f4f86f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:24:19 localhost podman[259601]: 2025-12-06 10:24:19.251891861 +0000 UTC m=+0.196858465 container init 44f1529042ccfabe9bb5e1efb4aed30f81aa1758143902cc07b40494976678dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e4cbe437-288a-4b4c-96af-5e8fd0701fca, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:24:19 localhost podman[259601]: 2025-12-06 10:24:19.260607089 +0000 UTC m=+0.205573693 container start 44f1529042ccfabe9bb5e1efb4aed30f81aa1758143902cc07b40494976678dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e4cbe437-288a-4b4c-96af-5e8fd0701fca, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:24:19 localhost dnsmasq[259619]: started, version 2.85 cachesize 150 Dec 6 05:24:19 localhost dnsmasq[259619]: DNS service limited to local subnets Dec 6 05:24:19 localhost dnsmasq[259619]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:24:19 localhost dnsmasq[259619]: warning: no upstream servers configured Dec 
6 05:24:19 localhost dnsmasq-dhcp[259619]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:24:19 localhost dnsmasq[259619]: read /var/lib/neutron/dhcp/e4cbe437-288a-4b4c-96af-5e8fd0701fca/addn_hosts - 0 addresses Dec 6 05:24:19 localhost dnsmasq-dhcp[259619]: read /var/lib/neutron/dhcp/e4cbe437-288a-4b4c-96af-5e8fd0701fca/host Dec 6 05:24:19 localhost dnsmasq-dhcp[259619]: read /var/lib/neutron/dhcp/e4cbe437-288a-4b4c-96af-5e8fd0701fca/opts Dec 6 05:24:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:19.497 219384 INFO neutron.agent.dhcp.agent [None req-4861d242-fe69-4bd4-9801-00e91a68028e - - - - - -] DHCP configuration for ports {'b8784fbe-e238-41bb-a452-a7d99937d24a'} is completed#033[00m Dec 6 05:24:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:19.579 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:24:20 localhost ovn_controller[131684]: 2025-12-06T10:24:20Z|00359|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:24:20 localhost nova_compute[237281]: 2025-12-06 10:24:20.285 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:20 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:20.825 2 INFO neutron.agent.securitygroups_rpc [None req-0949c030-1949-471c-bcf6-747fdf2103d8 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['95a47fad-13e6-44c0-a7f3-dd9cfcdf34a8', 'f8fdc862-a150-4ebe-a45a-cf407608047d']#033[00m Dec 6 05:24:21 localhost nova_compute[237281]: 2025-12-06 10:24:21.328 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:21 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:21.884 2 INFO neutron.agent.securitygroups_rpc 
[None req-4201050d-1056-4c6a-9da2-9e7959b7214b 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:24:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:24:22 localhost podman[259620]: 2025-12-06 10:24:22.522574184 +0000 UTC m=+0.055788849 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, version=9.6, release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public) Dec 6 05:24:22 localhost podman[259620]: 2025-12-06 10:24:22.536691469 +0000 UTC m=+0.069906144 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal 
Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350) Dec 6 05:24:22 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:24:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:22.995 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:24:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:22.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:24:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:22.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.000 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fb9e9b6c-73ee-4a1c-b5f8-71c8ad74b563', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:24:22.996872', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'bbb69690-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.206984923, 'message_signature': 'ae912c4764442f616f22979ea08e04287301084a25166ab4ed3560529f84ea98'}]}, 'timestamp': '2025-12-06 10:24:23.001555', '_unique_id': '53d3664c9ca14fa9b75d1bee2a2bd5ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.003 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.004 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.037 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.037 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a307cd30-41dc-40a0-94d6-3d2e463a24ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:24:23.004651', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bbbc1af2-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.214761183, 'message_signature': 'f8f69e26f3db1e8551fbe64898281be9fa77687f0f7d7eb225d21bc366b49505'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:24:23.004651', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bbbc2f38-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.214761183, 'message_signature': 'f14c928229068019febc693a76e509482dbbf819c55562002d5cd633cb5790ac'}]}, 'timestamp': '2025-12-06 10:24:23.038148', '_unique_id': 'b284929e6be34b4f95651252b58609f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.039 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.040 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.054 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 19970000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '346d1064-8a89-4c3d-87bf-1ee0b166b7c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19970000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:24:23.040807', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'bbbebc94-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.264130964, 'message_signature': 
'83299d621088dbc0e0785a08d09612f07ee3627084489302fe077538109f691e'}]}, 'timestamp': '2025-12-06 10:24:23.054980', '_unique_id': '2888336ea7354cfca542722cb6a4fd5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR 
oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.056 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.057 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.057 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa4c3761-00ac-424d-835f-01edf08951bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:24:23.057673', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'bbbf3e8a-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.264130964, 'message_signature': 'e1358f4baac5e214aa3b38d31d17d6f01fc7e514735cedc8cd07e9ef1d303633'}]}, 'timestamp': '2025-12-06 10:24:23.058207', '_unique_id': '1f25fc3602ad46ffb0f05af2a815e0f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging 
self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.059 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.060 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.060 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.060 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '29674775-bc31-4dac-9029-8e43464b17e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:24:23.060459', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bbbfa8a2-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.214761183, 'message_signature': '9267ff0bdcb5c43cc2f01bf9e407de311353fc53dfaa87a0b22e22763bc02063'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:24:23.060459', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bbbfbe3c-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.214761183, 'message_signature': '97f9d659e8e3508bb2687c4d1ed64788d201c6f25432b6ff83a496cf95a03905'}]}, 'timestamp': '2025-12-06 10:24:23.061482', '_unique_id': '6184f698e99d47419b4e729c19b3ed38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.062 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.063 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.063 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.064 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging [-] 
Could not send notification to notifications. Payload={'message_id': '5ca9f20f-a68b-41ec-a609-d210f6a21ee2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:24:23.063733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bbc02976-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.214761183, 'message_signature': 'eba37e2d9a73c8ca23526b6473fce33de01dadf844b37e4f9324c15e28ad1396'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:24:23.063733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bbc03a24-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.214761183, 'message_signature': '09a733b1effb70d7919173aebad3afbf3d8a4f2c6e8a944b45bc4e3446e1e22e'}]}, 'timestamp': '2025-12-06 10:24:23.064637', '_unique_id': 'dbf0647adcb441409c79cbfef0a5d787'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:24:23.065 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:24:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12
ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.065 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.066 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.066 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.067 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61bcac53-1a9b-41dc-9a1f-5ede65521ae2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:24:23.066882', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bbc0a32e-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.214761183, 'message_signature': '47526f8575284917b64a6a90754fe2e82df8e65f7fa8cdb399491571bc90688e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:24:23.066882', 'resource_metadata': {'display_name': 
'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bbc0b3b4-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.214761183, 'message_signature': 'd519fdecab24328b2a8798b540faa4149cd0d2f92e4fa6978ce429382647be1c'}]}, 'timestamp': '2025-12-06 10:24:23.067772', '_unique_id': '6d11852eeb46465e81f3a8fb15d63d06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06
10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in
connect
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:24:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.068 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.069 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.070 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'f79df3b6-a4b6-4ae7-98c2-15803249672b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:24:23.069996', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'bbc11d22-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.206984923, 'message_signature': '070b241512bf75932f16c8216c4dc8fe4b187e1543614f491189ccd625d82a67'}]}, 'timestamp': '2025-12-06 10:24:23.070473', '_unique_id': '1b3a93471995464b9455c8d9fdcb0f3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:24:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:24:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6
05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:24:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.071 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]:
2025-12-06 10:24:23.072 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.072 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af666827-5237-4426-8c6c-573637153afa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:24:23.072664', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None,
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'bbc18618-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.206984923, 'message_signature': '6f8aa512bd2e0f7f422be3c7b859d53dc32e2f76536c092fba765a456c67d91b'}]}, 'timestamp': '2025-12-06 10:24:23.073161', '_unique_id': '6c7bcaf37d7d46f8a0446e3cce4dd5ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.074 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.075 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.075 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.075 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.075 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7f497e09-19b1-442a-9698-13dad4a7bfed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:24:23.075431', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bbc1f152-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.214761183, 'message_signature': 'ce0924edc17f6eb4a9ad0d463790dc88c4e0b17ea082ba3a12bef75eea7fbadd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:24:23.075431', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bbc203e0-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.214761183, 'message_signature': '4e0d2dbc4e2f753755d403b16d8a026adc54b1e82a3325dcb4952a9ac255ff6f'}]}, 'timestamp': '2025-12-06 10:24:23.076348', '_unique_id': '85f3a9617a4649a3aadb1758ec0328f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.077 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.078 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.078 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4d625243-87ac-41b5-a7b5-9e6b07299e99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:24:23.078835', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'bbc277a8-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.206984923, 'message_signature': 'b896d4d77b4511baef9b2e9d2910a2086a212403325270f372fd2a5231c790e1'}]}, 'timestamp': '2025-12-06 10:24:23.079345', '_unique_id': '3a891bc43b534548b610d32f61ea4f9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.080 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.081 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.081 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b07842e-7d84-4fc7-95f8-3cd2680a7cbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:24:23.081747', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'bbc2e954-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.206984923, 'message_signature': '8d3cd98d1fca96930be9f210fd58b1eb750f6a77431d6e5dae21d5443d256e96'}]}, 'timestamp': '2025-12-06 10:24:23.082257', '_unique_id': '8f057f6a517d48f1a9e626fa26153a6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.084 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '74565cb2-4b83-4ffa-9af6-a77aba05a4a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:24:23.084408', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'bbc34fa2-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.206984923, 'message_signature': '84bf315063662596b73e95f02e11a2cc0067c6659f3694fa78d801e493b5cd9c'}]}, 'timestamp': '2025-12-06 10:24:23.084908', '_unique_id': '4270b67eb37d45cc84724192934b9de8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.086 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.087 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.087 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ead72a97-4e45-4edb-8ebe-3a504c6917df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:24:23.087495', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'bbc3c932-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.206984923, 'message_signature': 'dcb2d4b0f84fa504d305203bb0517cbbe60edaf541a21dc00a36fde93512bbbf'}]}, 'timestamp': '2025-12-06 10:24:23.088016', '_unique_id': 'cb430578ae5d4144a6753eb78e559980'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.088 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.090 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.090 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef6728cf-aa47-4c7f-b258-bdc9531f7cc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:24:23.090166', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': 
None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'bbc43084-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.206984923, 'message_signature': 'ee8187cac633801270800d6644a426f1c7089fe8da5663f6939eee70dba4c513'}]}, 'timestamp': '2025-12-06 10:24:23.090629', '_unique_id': 'd874e59d6dd14467b71ddf42c38fe33a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.093 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fec0a6bf-defe-4318-8ec2-e56cf87ca7b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:24:23.093767', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bbc71556-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.30396253, 'message_signature': '17a84f9c5b7a107332711a3865728d19b215e7cfaefc1e77bf10b604e35005e7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:24:23.093767', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bbc72172-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.30396253, 'message_signature': 'b6da9a439308d127559b0c41e5637ed6948f967aa54d663b5fd80e7002b25636'}]}, 'timestamp': '2025-12-06 10:24:23.109893', '_unique_id': 'cc080e013c0c4265b22c9452a3d58067'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.110 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.111 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.111 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'da8448e7-1fd0-4654-94d7-c5a786afd0d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:24:23.111920', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'bbc77f96-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.206984923, 'message_signature': '3bf5103661a5ce3f2c4165324776d72649880a5bedadd70e4d237684658391cf'}]}, 'timestamp': '2025-12-06 10:24:23.112226', '_unique_id': '4e02c3a7a9a946f7b774f8602ceeda59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.112 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.113 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.113 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16ed0786-e2c3-449c-be0d-e4e379571124', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:24:23.113580', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 
'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bbc7bfec-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.214761183, 'message_signature': '4c52c603422228e96530dd332ff9996a8e178c39aa95fa6619c3ab407e40074a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:24:23.113580', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bbc7cabe-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.214761183, 'message_signature': '9afc5eb40a55ed754ecc9bf4345d77d9ff4014dc4af82ebf95b3ad95dfe4ff0e'}]}, 'timestamp': '2025-12-06 10:24:23.114130', '_unique_id': '0385b95273b44d188744bd6300b25680'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.115 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.115 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03775025-4e9c-45f1-877f-a6904e0345c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:24:23.115664', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bbc81140-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.30396253, 'message_signature': 'b56c4684d957b8f925eb8c12446cdc5e5d877317aa8cc3b867c3ae55276f90a6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:24:23.115664', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bbc81c08-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.30396253, 'message_signature': 'fc203911eb9ac97b9226aa15a5c7b9cba2b00718b511a5b0e72f0aab442e33a9'}]}, 'timestamp': '2025-12-06 10:24:23.116211', '_unique_id': 'a337f1f1a0d84d8595c923183c984ad9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.117 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e9f8f65-ac6e-4c1d-ae1e-0f5937ef085e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:24:23.117562', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bbc85b46-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.30396253, 'message_signature': '67e01b645c4794b4ae968432e17227f588bae9f1deb24ec5dab113bb4c590b23'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:24:23.117562', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bbc8665e-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.30396253, 'message_signature': '2648e5bbf8f39e10ba6353d5d61e0853d40dbacef55c680f83b6ae1269eb77db'}]}, 'timestamp': '2025-12-06 10:24:23.118115', '_unique_id': 'bb1f1bbdf744477b900a24cbd13cf01c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): 
Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.118 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:24:23.119 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.119 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b8b831e-0810-4fba-8ebe-ffb724301318', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:24:23.119445', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': 
None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': 'bbc8a4fc-d28d-11f0-8fed-fa163edf398d', 'monotonic_time': 13050.206984923, 'message_signature': '0381356ce8d5b223550a2922ed6565489d45f38a29f3b1986e53af51963283a1'}]}, 'timestamp': '2025-12-06 10:24:23.119760', '_unique_id': '2785765e631c4c2bb49ed2734192bcfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:24:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:24:23.120 12 ERROR oslo_messaging.notify.messaging Dec 6 05:24:23 localhost podman[197801]: time="2025-12-06T10:24:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:24:23 localhost podman[197801]: @ - - [06/Dec/2025:10:24:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147791 "" "Go-http-client/1.1" Dec 6 05:24:23 localhost podman[197801]: @ - - [06/Dec/2025:10:24:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16922 "" "Go-http-client/1.1" Dec 6 05:24:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40251 DF PROTO=TCP SPT=49500 DPT=9102 SEQ=4247368960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDF37CA0000000001030307) Dec 6 05:24:24 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:24.069 2 INFO neutron.agent.securitygroups_rpc [None req-a470f28e-9b45-440d-b961-9e083c773f47 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:24:24 localhost nova_compute[237281]: 2025-12-06 10:24:24.235 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:24 localhost 
neutron_dhcp_agent[219380]: 2025-12-06 10:24:24.545 219384 INFO neutron.agent.linux.ip_lib [None req-50aa8ee0-1a03-4709-8727-3d08d11a42b2 - - - - - -] Device tapea684f74-23 cannot be used as it has no MAC address#033[00m Dec 6 05:24:24 localhost nova_compute[237281]: 2025-12-06 10:24:24.572 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:24 localhost kernel: device tapea684f74-23 entered promiscuous mode Dec 6 05:24:24 localhost ovn_controller[131684]: 2025-12-06T10:24:24Z|00360|binding|INFO|Claiming lport ea684f74-2312-45ee-baba-757d986db059 for this chassis. Dec 6 05:24:24 localhost ovn_controller[131684]: 2025-12-06T10:24:24Z|00361|binding|INFO|ea684f74-2312-45ee-baba-757d986db059: Claiming unknown Dec 6 05:24:24 localhost NetworkManager[5965]: [1765016664.5842] manager: (tapea684f74-23): new Generic device (/org/freedesktop/NetworkManager/Devices/62) Dec 6 05:24:24 localhost nova_compute[237281]: 2025-12-06 10:24:24.582 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:24 localhost systemd-udevd[259653]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:24:24 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:24.596 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-2c7e32f0-c159-422c-a63a-fa11068940d1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c7e32f0-c159-422c-a63a-fa11068940d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6e5f2aeaf52490d9822161edabfbbe5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09f56fbf-c4d1-4cbd-8dbf-48a2c2fccc3f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ea684f74-2312-45ee-baba-757d986db059) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:24:24 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:24.598 137259 INFO neutron.agent.ovn.metadata.agent [-] Port ea684f74-2312-45ee-baba-757d986db059 in datapath 2c7e32f0-c159-422c-a63a-fa11068940d1 bound to our chassis#033[00m Dec 6 05:24:24 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:24.599 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2c7e32f0-c159-422c-a63a-fa11068940d1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:24:24 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:24.603 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e95e57-5a2c-49a0-b10b-c5c2dd3df230]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:24:24 localhost journal[186952]: ethtool ioctl error on tapea684f74-23: No such device Dec 6 05:24:24 localhost journal[186952]: ethtool ioctl error on tapea684f74-23: No such device Dec 6 05:24:24 localhost ovn_controller[131684]: 2025-12-06T10:24:24Z|00362|binding|INFO|Setting lport ea684f74-2312-45ee-baba-757d986db059 ovn-installed in OVS Dec 6 05:24:24 localhost ovn_controller[131684]: 2025-12-06T10:24:24Z|00363|binding|INFO|Setting lport ea684f74-2312-45ee-baba-757d986db059 up in Southbound Dec 6 05:24:24 localhost nova_compute[237281]: 2025-12-06 10:24:24.615 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:24 localhost nova_compute[237281]: 2025-12-06 10:24:24.616 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:24 localhost journal[186952]: ethtool ioctl error on tapea684f74-23: No such device Dec 6 05:24:24 localhost journal[186952]: ethtool ioctl error on tapea684f74-23: No such device Dec 6 05:24:24 localhost journal[186952]: ethtool ioctl error on tapea684f74-23: No such device Dec 6 05:24:24 localhost journal[186952]: ethtool ioctl error on tapea684f74-23: No such device Dec 6 05:24:24 localhost journal[186952]: ethtool ioctl error on tapea684f74-23: No such device Dec 6 05:24:24 localhost journal[186952]: ethtool ioctl error on tapea684f74-23: No such device Dec 6 05:24:24 localhost nova_compute[237281]: 2025-12-06 10:24:24.654 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:24 localhost nova_compute[237281]: 2025-12-06 10:24:24.683 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40252 DF PROTO=TCP SPT=49500 DPT=9102 SEQ=4247368960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDF3BC70000000001030307) Dec 6 05:24:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:24:25 localhost podman[259717]: 2025-12-06 10:24:25.563212905 +0000 UTC m=+0.081858653 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': 
'/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:24:25 localhost podman[259717]: 2025-12-06 10:24:25.57832873 +0000 UTC m=+0.096974418 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:24:25 localhost podman[259733]: Dec 6 05:24:25 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:24:25 localhost podman[259733]: 2025-12-06 10:24:25.602704871 +0000 UTC m=+0.086882007 container create b89e5a4349cc10ede08ab5b467086334712d8a19a99071d60dcb9c36c84ca571 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7e32f0-c159-422c-a63a-fa11068940d1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:24:25 localhost systemd[1]: Started libpod-conmon-b89e5a4349cc10ede08ab5b467086334712d8a19a99071d60dcb9c36c84ca571.scope. Dec 6 05:24:25 localhost podman[259733]: 2025-12-06 10:24:25.551335948 +0000 UTC m=+0.035513114 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:24:25 localhost systemd[1]: Started libcrun container. 
Dec 6 05:24:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc4632441900b912bf5493acfd6fff09d8e9e876d50dae2187461bc19ba120f3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:24:25 localhost podman[259733]: 2025-12-06 10:24:25.682415786 +0000 UTC m=+0.166592932 container init b89e5a4349cc10ede08ab5b467086334712d8a19a99071d60dcb9c36c84ca571 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7e32f0-c159-422c-a63a-fa11068940d1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 05:24:25 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:25.689 2 INFO neutron.agent.securitygroups_rpc [None req-bab8122b-50f1-4596-9235-1cc46d009b17 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:24:25 localhost podman[259733]: 2025-12-06 10:24:25.694121586 +0000 UTC m=+0.178298722 container start b89e5a4349cc10ede08ab5b467086334712d8a19a99071d60dcb9c36c84ca571 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7e32f0-c159-422c-a63a-fa11068940d1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:24:25 localhost dnsmasq[259765]: started, version 2.85 cachesize 150 Dec 6 05:24:25 localhost 
dnsmasq[259765]: DNS service limited to local subnets Dec 6 05:24:25 localhost dnsmasq[259765]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:24:25 localhost dnsmasq[259765]: warning: no upstream servers configured Dec 6 05:24:25 localhost dnsmasq-dhcp[259765]: DHCP, static leases only on 10.100.255.240, lease time 1d Dec 6 05:24:25 localhost dnsmasq[259765]: read /var/lib/neutron/dhcp/2c7e32f0-c159-422c-a63a-fa11068940d1/addn_hosts - 0 addresses Dec 6 05:24:25 localhost dnsmasq-dhcp[259765]: read /var/lib/neutron/dhcp/2c7e32f0-c159-422c-a63a-fa11068940d1/host Dec 6 05:24:25 localhost dnsmasq-dhcp[259765]: read /var/lib/neutron/dhcp/2c7e32f0-c159-422c-a63a-fa11068940d1/opts Dec 6 05:24:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44831 DF PROTO=TCP SPT=50190 DPT=9102 SEQ=1098211388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDF3F870000000001030307) Dec 6 05:24:26 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:26.056 219384 INFO neutron.agent.dhcp.agent [None req-d03f7539-84b3-4fa8-8591-bc040b421a2c - - - - - -] DHCP configuration for ports {'fd2942fb-792e-429f-8853-3707b26dd3f8'} is completed#033[00m Dec 6 05:24:26 localhost nova_compute[237281]: 2025-12-06 10:24:26.331 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40253 DF PROTO=TCP SPT=49500 DPT=9102 SEQ=4247368960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDF43C80000000001030307) Dec 6 05:24:28 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9829 DF PROTO=TCP SPT=36902 DPT=9102 SEQ=3511509215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDF47870000000001030307) Dec 6 05:24:29 localhost nova_compute[237281]: 2025-12-06 10:24:29.237 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40254 DF PROTO=TCP SPT=49500 DPT=9102 SEQ=4247368960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDF53880000000001030307) Dec 6 05:24:31 localhost nova_compute[237281]: 2025-12-06 10:24:31.364 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:31 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:31.469 2 INFO neutron.agent.securitygroups_rpc [None req-b5bcbf33-c4c5-4515-83d3-1ae065807a48 315bb119f6eb4515bad61a1e680591aa 0901295e4a3a44e89ad3e6a450608d11 - - default default] Security group member updated ['2ca28ccb-fe1d-4c55-8694-8a400ec0b547']#033[00m Dec 6 05:24:33 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:33.026 219384 INFO neutron.agent.linux.ip_lib [None req-eae3168b-3d28-44b9-8acb-f1fc012f9e8b - - - - - -] Device tap62120110-32 cannot be used as it has no MAC address#033[00m Dec 6 05:24:33 localhost nova_compute[237281]: 2025-12-06 10:24:33.049 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:33 localhost kernel: device tap62120110-32 entered promiscuous mode Dec 6 05:24:33 localhost ovn_controller[131684]: 
2025-12-06T10:24:33Z|00364|binding|INFO|Claiming lport 62120110-32ae-46d0-9764-eeec736b7e59 for this chassis. Dec 6 05:24:33 localhost NetworkManager[5965]: [1765016673.0601] manager: (tap62120110-32): new Generic device (/org/freedesktop/NetworkManager/Devices/63) Dec 6 05:24:33 localhost ovn_controller[131684]: 2025-12-06T10:24:33Z|00365|binding|INFO|62120110-32ae-46d0-9764-eeec736b7e59: Claiming unknown Dec 6 05:24:33 localhost nova_compute[237281]: 2025-12-06 10:24:33.060 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:33 localhost systemd-udevd[259776]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:24:33 localhost journal[186952]: ethtool ioctl error on tap62120110-32: No such device Dec 6 05:24:33 localhost ovn_controller[131684]: 2025-12-06T10:24:33Z|00366|binding|INFO|Setting lport 62120110-32ae-46d0-9764-eeec736b7e59 ovn-installed in OVS Dec 6 05:24:33 localhost journal[186952]: ethtool ioctl error on tap62120110-32: No such device Dec 6 05:24:33 localhost nova_compute[237281]: 2025-12-06 10:24:33.104 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:33 localhost journal[186952]: ethtool ioctl error on tap62120110-32: No such device Dec 6 05:24:33 localhost journal[186952]: ethtool ioctl error on tap62120110-32: No such device Dec 6 05:24:33 localhost journal[186952]: ethtool ioctl error on tap62120110-32: No such device Dec 6 05:24:33 localhost ovn_controller[131684]: 2025-12-06T10:24:33Z|00367|binding|INFO|Setting lport 62120110-32ae-46d0-9764-eeec736b7e59 up in Southbound Dec 6 05:24:33 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:33.123 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to 
row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-6ec728ba-7c33-4201-8805-5c8420d1aab7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ec728ba-7c33-4201-8805-5c8420d1aab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6e5f2aeaf52490d9822161edabfbbe5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=050fa73b-2bb9-40dc-bc3a-6fe0e6304c7f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=62120110-32ae-46d0-9764-eeec736b7e59) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:24:33 localhost journal[186952]: ethtool ioctl error on tap62120110-32: No such device Dec 6 05:24:33 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:33.126 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 62120110-32ae-46d0-9764-eeec736b7e59 in datapath 6ec728ba-7c33-4201-8805-5c8420d1aab7 bound to our chassis#033[00m Dec 6 05:24:33 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:33.129 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6ec728ba-7c33-4201-8805-5c8420d1aab7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:24:33 localhost journal[186952]: ethtool ioctl error on tap62120110-32: No such device Dec 6 
05:24:33 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:33.132 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c09c5214-4d6d-4398-a966-80d916cb6248]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:24:33 localhost journal[186952]: ethtool ioctl error on tap62120110-32: No such device Dec 6 05:24:33 localhost nova_compute[237281]: 2025-12-06 10:24:33.155 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:33 localhost nova_compute[237281]: 2025-12-06 10:24:33.188 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:34 localhost nova_compute[237281]: 2025-12-06 10:24:34.238 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:24:34 localhost systemd[1]: tmp-crun.EgpUw2.mount: Deactivated successfully. 
Dec 6 05:24:34 localhost podman[259822]: 2025-12-06 10:24:34.56565523 +0000 UTC m=+0.099669462 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Dec 6 05:24:34 localhost podman[259822]: 2025-12-06 10:24:34.630735014 +0000 UTC m=+0.164749256 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Dec 6 05:24:34 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:24:35 localhost podman[259874]: Dec 6 05:24:35 localhost podman[259874]: 2025-12-06 10:24:35.057295462 +0000 UTC m=+0.089797376 container create 1ca0130cbafcb1a8497bdde1eff90d2251ede64c7dc3e0cbdc8fe13866b64409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ec728ba-7c33-4201-8805-5c8420d1aab7, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Dec 6 05:24:35 localhost systemd[1]: Started libpod-conmon-1ca0130cbafcb1a8497bdde1eff90d2251ede64c7dc3e0cbdc8fe13866b64409.scope. Dec 6 05:24:35 localhost podman[259874]: 2025-12-06 10:24:35.01180314 +0000 UTC m=+0.044305074 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:24:35 localhost systemd[1]: Started libcrun container. 
Dec 6 05:24:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e3143ae6a501b41e363bd3108d8083e1033d413f63641c7da2e32df80a2a307/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:24:35 localhost podman[259874]: 2025-12-06 10:24:35.148894153 +0000 UTC m=+0.181396077 container init 1ca0130cbafcb1a8497bdde1eff90d2251ede64c7dc3e0cbdc8fe13866b64409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ec728ba-7c33-4201-8805-5c8420d1aab7, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:24:35 localhost podman[259874]: 2025-12-06 10:24:35.159064526 +0000 UTC m=+0.191566440 container start 1ca0130cbafcb1a8497bdde1eff90d2251ede64c7dc3e0cbdc8fe13866b64409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ec728ba-7c33-4201-8805-5c8420d1aab7, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 05:24:35 localhost dnsmasq[259893]: started, version 2.85 cachesize 150 Dec 6 05:24:35 localhost dnsmasq[259893]: DNS service limited to local subnets Dec 6 05:24:35 localhost dnsmasq[259893]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:24:35 localhost dnsmasq[259893]: warning: no upstream servers configured Dec 
6 05:24:35 localhost dnsmasq-dhcp[259893]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:24:35 localhost dnsmasq[259893]: read /var/lib/neutron/dhcp/6ec728ba-7c33-4201-8805-5c8420d1aab7/addn_hosts - 0 addresses Dec 6 05:24:35 localhost dnsmasq-dhcp[259893]: read /var/lib/neutron/dhcp/6ec728ba-7c33-4201-8805-5c8420d1aab7/host Dec 6 05:24:35 localhost dnsmasq-dhcp[259893]: read /var/lib/neutron/dhcp/6ec728ba-7c33-4201-8805-5c8420d1aab7/opts Dec 6 05:24:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:35.502 219384 INFO neutron.agent.dhcp.agent [None req-3e967b4f-84d5-4a38-9b13-ec73dfbecef5 - - - - - -] DHCP configuration for ports {'78b7bc0f-4dc6-4ecc-83dc-ffef4c3bdc61'} is completed#033[00m Dec 6 05:24:36 localhost nova_compute[237281]: 2025-12-06 10:24:36.367 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:38 localhost ovn_controller[131684]: 2025-12-06T10:24:38Z|00368|binding|INFO|Releasing lport 62120110-32ae-46d0-9764-eeec736b7e59 from this chassis (sb_readonly=0) Dec 6 05:24:38 localhost kernel: device tap62120110-32 left promiscuous mode Dec 6 05:24:38 localhost ovn_controller[131684]: 2025-12-06T10:24:38Z|00369|binding|INFO|Setting lport 62120110-32ae-46d0-9764-eeec736b7e59 down in Southbound Dec 6 05:24:38 localhost nova_compute[237281]: 2025-12-06 10:24:38.830 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:24:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:24:38 localhost nova_compute[237281]: 2025-12-06 10:24:38.860 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:38 localhost nova_compute[237281]: 2025-12-06 10:24:38.862 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:38 localhost podman[259896]: 2025-12-06 10:24:38.952104549 +0000 UTC m=+0.087672661 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:24:38 localhost podman[259896]: 2025-12-06 10:24:38.965239044 +0000 UTC m=+0.100807176 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:24:38 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:24:39 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:39.025 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-6ec728ba-7c33-4201-8805-5c8420d1aab7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ec728ba-7c33-4201-8805-5c8420d1aab7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6e5f2aeaf52490d9822161edabfbbe5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=050fa73b-2bb9-40dc-bc3a-6fe0e6304c7f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=62120110-32ae-46d0-9764-eeec736b7e59) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:24:39 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:39.027 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 62120110-32ae-46d0-9764-eeec736b7e59 in datapath 6ec728ba-7c33-4201-8805-5c8420d1aab7 unbound from our chassis#033[00m Dec 6 05:24:39 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:39.030 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6ec728ba-7c33-4201-8805-5c8420d1aab7, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:24:39 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:39.031 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[09312eeb-da94-46d3-a57e-39e5ab350333]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:24:39 localhost podman[259897]: 2025-12-06 10:24:39.054378648 +0000 UTC m=+0.186978079 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:24:39 localhost podman[259897]: 2025-12-06 10:24:39.067755561 +0000 UTC m=+0.200354972 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base 
Image) Dec 6 05:24:39 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:24:39 localhost nova_compute[237281]: 2025-12-06 10:24:39.239 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40255 DF PROTO=TCP SPT=49500 DPT=9102 SEQ=4247368960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDF73880000000001030307) Dec 6 05:24:40 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=64.62.197.29 DST=38.129.56.147 LEN=40 TOS=0x00 PREC=0x00 TTL=236 ID=54321 PROTO=TCP SPT=48279 DPT=9090 SEQ=1351748178 ACK=0 WINDOW=65535 RES=0x00 SYN URGP=0 Dec 6 05:24:41 localhost nova_compute[237281]: 2025-12-06 10:24:41.369 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:42 localhost dnsmasq[259893]: read /var/lib/neutron/dhcp/6ec728ba-7c33-4201-8805-5c8420d1aab7/addn_hosts - 0 addresses Dec 6 05:24:42 localhost podman[259951]: 2025-12-06 10:24:42.097916796 +0000 UTC m=+0.059636917 container kill 1ca0130cbafcb1a8497bdde1eff90d2251ede64c7dc3e0cbdc8fe13866b64409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ec728ba-7c33-4201-8805-5c8420d1aab7, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:24:42 localhost 
dnsmasq-dhcp[259893]: read /var/lib/neutron/dhcp/6ec728ba-7c33-4201-8805-5c8420d1aab7/host Dec 6 05:24:42 localhost dnsmasq-dhcp[259893]: read /var/lib/neutron/dhcp/6ec728ba-7c33-4201-8805-5c8420d1aab7/opts Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent [None req-f36ecd46-5b4d-49cf-a240-d26dfedb0084 - - - - - -] Unable to reload_allocations dhcp for 6ec728ba-7c33-4201-8805-5c8420d1aab7.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap62120110-32 not found in namespace qdhcp-6ec728ba-7c33-4201-8805-5c8420d1aab7. Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR 
neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR 
neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent return fut.result() Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent return self.__get_result() Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent raise self._exception Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Dec 6 05:24:42 localhost 
neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap62120110-32 not found in namespace qdhcp-6ec728ba-7c33-4201-8805-5c8420d1aab7. Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.124 219384 ERROR neutron.agent.dhcp.agent #033[00m Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.130 219384 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.448 219384 INFO neutron.agent.dhcp.agent [None req-59568fd1-48e4-42d1-aa5b-602d7666bcc4 - - - - - -] All active networks have been fetched through RPC.#033[00m Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.449 219384 INFO neutron.agent.dhcp.agent [-] Starting network 6ec728ba-7c33-4201-8805-5c8420d1aab7 dhcp configuration#033[00m Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.449 219384 INFO neutron.agent.dhcp.agent [-] Finished network 6ec728ba-7c33-4201-8805-5c8420d1aab7 dhcp configuration#033[00m Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.450 219384 INFO neutron.agent.dhcp.agent [-] Starting 
network 9a9f536a-4201-4d67-a433-6077de86991e dhcp configuration#033[00m Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.450 219384 INFO neutron.agent.dhcp.agent [-] Finished network 9a9f536a-4201-4d67-a433-6077de86991e dhcp configuration#033[00m Dec 6 05:24:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:42.450 219384 INFO neutron.agent.dhcp.agent [None req-59568fd1-48e4-42d1-aa5b-602d7666bcc4 - - - - - -] Synchronizing state complete#033[00m Dec 6 05:24:42 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:42.547 2 INFO neutron.agent.securitygroups_rpc [None req-d99f57f4-42c6-4f1a-936c-9417abdd6446 f8ff3af811504c0182e5fd693115a79f 523e63dfdf8146788c58dbce1e413d7c - - default default] Security group member updated ['c63e68be-6bbe-4d91-8fc2-8cf666480313']#033[00m Dec 6 05:24:42 localhost ovn_controller[131684]: 2025-12-06T10:24:42Z|00370|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:24:42 localhost dnsmasq[259893]: exiting on receipt of SIGTERM Dec 6 05:24:42 localhost podman[259981]: 2025-12-06 10:24:42.726433224 +0000 UTC m=+0.098218506 container kill 1ca0130cbafcb1a8497bdde1eff90d2251ede64c7dc3e0cbdc8fe13866b64409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ec728ba-7c33-4201-8805-5c8420d1aab7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Dec 6 05:24:42 localhost nova_compute[237281]: 2025-12-06 10:24:42.724 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:42 localhost systemd[1]: 
libpod-1ca0130cbafcb1a8497bdde1eff90d2251ede64c7dc3e0cbdc8fe13866b64409.scope: Deactivated successfully. Dec 6 05:24:42 localhost podman[259993]: 2025-12-06 10:24:42.798745471 +0000 UTC m=+0.057345807 container died 1ca0130cbafcb1a8497bdde1eff90d2251ede64c7dc3e0cbdc8fe13866b64409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ec728ba-7c33-4201-8805-5c8420d1aab7, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 05:24:42 localhost podman[259993]: 2025-12-06 10:24:42.836343568 +0000 UTC m=+0.094943874 container cleanup 1ca0130cbafcb1a8497bdde1eff90d2251ede64c7dc3e0cbdc8fe13866b64409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ec728ba-7c33-4201-8805-5c8420d1aab7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:24:42 localhost systemd[1]: libpod-conmon-1ca0130cbafcb1a8497bdde1eff90d2251ede64c7dc3e0cbdc8fe13866b64409.scope: Deactivated successfully. 
Dec 6 05:24:42 localhost podman[259995]: 2025-12-06 10:24:42.877427014 +0000 UTC m=+0.129978054 container remove 1ca0130cbafcb1a8497bdde1eff90d2251ede64c7dc3e0cbdc8fe13866b64409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6ec728ba-7c33-4201-8805-5c8420d1aab7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:24:43 localhost systemd[1]: var-lib-containers-storage-overlay-7e3143ae6a501b41e363bd3108d8083e1033d413f63641c7da2e32df80a2a307-merged.mount: Deactivated successfully. Dec 6 05:24:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1ca0130cbafcb1a8497bdde1eff90d2251ede64c7dc3e0cbdc8fe13866b64409-userdata-shm.mount: Deactivated successfully. Dec 6 05:24:43 localhost systemd[1]: run-netns-qdhcp\x2d6ec728ba\x2d7c33\x2d4201\x2d8805\x2d5c8420d1aab7.mount: Deactivated successfully. Dec 6 05:24:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 05:24:43 localhost podman[260023]: 2025-12-06 10:24:43.205512899 +0000 UTC m=+0.080614084 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:24:43 localhost podman[260023]: 2025-12-06 10:24:43.235880004 +0000 UTC m=+0.110981159 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS) Dec 6 05:24:43 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:24:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:24:43 localhost podman[260041]: 2025-12-06 10:24:43.352310271 +0000 UTC m=+0.079274294 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:24:43 localhost podman[260041]: 2025-12-06 10:24:43.390239538 +0000 UTC m=+0.117203491 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 05:24:43 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:24:44 localhost nova_compute[237281]: 2025-12-06 10:24:44.241 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:44 localhost nova_compute[237281]: 2025-12-06 10:24:44.854 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:45 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:45.944 2 INFO neutron.agent.securitygroups_rpc [None req-9156cff7-7197-43ba-8168-1f7b6837d818 f8ff3af811504c0182e5fd693115a79f 523e63dfdf8146788c58dbce1e413d7c - - default default] Security group member updated ['c63e68be-6bbe-4d91-8fc2-8cf666480313']#033[00m Dec 6 05:24:46 localhost openstack_network_exporter[199751]: ERROR 10:24:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:24:46 localhost openstack_network_exporter[199751]: ERROR 10:24:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:24:46 localhost openstack_network_exporter[199751]: ERROR 10:24:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:24:46 localhost openstack_network_exporter[199751]: ERROR 10:24:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:24:46 localhost openstack_network_exporter[199751]: Dec 6 05:24:46 localhost openstack_network_exporter[199751]: ERROR 10:24:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:24:46 localhost openstack_network_exporter[199751]: Dec 6 05:24:46 localhost nova_compute[237281]: 2025-12-06 10:24:46.371 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:46 localhost nova_compute[237281]: 2025-12-06 10:24:46.885 
237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:24:47 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:47.019 2 INFO neutron.agent.securitygroups_rpc [None req-e617f4f6-03ad-4d68-9304-a1d78f313f85 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:24:49 localhost nova_compute[237281]: 2025-12-06 10:24:49.243 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:49 localhost nova_compute[237281]: 2025-12-06 10:24:49.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:24:50 localhost nova_compute[237281]: 2025-12-06 10:24:50.567 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:51 localhost nova_compute[237281]: 2025-12-06 10:24:51.375 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:51 localhost nova_compute[237281]: 2025-12-06 10:24:51.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:24:52 localhost nova_compute[237281]: 2025-12-06 10:24:52.882 237285 DEBUG oslo_service.periodic_task [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:24:53 localhost podman[197801]: time="2025-12-06T10:24:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:24:53 localhost podman[197801]: @ - - [06/Dec/2025:10:24:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149619 "" "Go-http-client/1.1" Dec 6 05:24:53 localhost podman[197801]: @ - - [06/Dec/2025:10:24:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17404 "" "Go-http-client/1.1" Dec 6 05:24:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:24:53 localhost podman[260060]: 2025-12-06 10:24:53.555131578 +0000 UTC m=+0.085383462 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Dec 6 05:24:53 localhost podman[260060]: 2025-12-06 10:24:53.572537973 +0000 UTC m=+0.102789897 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container) Dec 6 05:24:53 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:24:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57182 DF PROTO=TCP SPT=47484 DPT=9102 SEQ=3176948678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDFACFA0000000001030307) Dec 6 05:24:54 localhost nova_compute[237281]: 2025-12-06 10:24:54.244 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:54 localhost nova_compute[237281]: 2025-12-06 10:24:54.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:24:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57183 DF PROTO=TCP SPT=47484 DPT=9102 SEQ=3176948678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDFB1080000000001030307) Dec 6 05:24:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40256 DF PROTO=TCP SPT=49500 DPT=9102 SEQ=4247368960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDFB3880000000001030307) Dec 6 05:24:55 localhost neutron_sriov_agent[212548]: 2025-12-06 10:24:55.813 2 INFO neutron.agent.securitygroups_rpc [None 
req-7d53da39-faf3-4134-a0c5-6eec89836ce9 1cca9cb52a1c4c92b9e09d2994dafd54 5fd72013d35f4d0ca9dada3b23455c73 - - default default] Security group rule updated ['c2e76196-55de-49bd-9142-8b64d06d22bf']#033[00m Dec 6 05:24:55 localhost nova_compute[237281]: 2025-12-06 10:24:55.884 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:24:55 localhost nova_compute[237281]: 2025-12-06 10:24:55.885 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:24:56 localhost nova_compute[237281]: 2025-12-06 10:24:56.378 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:24:56 localhost podman[260081]: 2025-12-06 10:24:56.544372884 +0000 UTC m=+0.077973973 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:24:56 localhost podman[260081]: 2025-12-06 10:24:56.576484632 +0000 UTC m=+0.110085761 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:24:56 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:24:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57184 DF PROTO=TCP SPT=47484 DPT=9102 SEQ=3176948678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDFB9070000000001030307) Dec 6 05:24:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44832 DF PROTO=TCP SPT=50190 DPT=9102 SEQ=1098211388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDFBD880000000001030307) Dec 6 05:24:58 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:24:58.943 219384 INFO neutron.agent.linux.ip_lib [None req-181f06b8-9b7d-4ea8-9a3e-96e5fe9bd3bb - - - - - -] Device tapa47e84c3-ee cannot be used as it has no MAC address#033[00m Dec 6 05:24:58 localhost nova_compute[237281]: 2025-12-06 10:24:58.967 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:58 localhost kernel: device tapa47e84c3-ee entered promiscuous mode Dec 6 05:24:58 localhost NetworkManager[5965]: [1765016698.9770] manager: (tapa47e84c3-ee): new Generic device (/org/freedesktop/NetworkManager/Devices/64) Dec 6 05:24:58 localhost nova_compute[237281]: 2025-12-06 10:24:58.978 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:58 localhost ovn_controller[131684]: 2025-12-06T10:24:58Z|00371|binding|INFO|Claiming lport a47e84c3-ee53-41c0-8fdf-3f33f89ebebe for this chassis. Dec 6 05:24:58 localhost ovn_controller[131684]: 2025-12-06T10:24:58Z|00372|binding|INFO|a47e84c3-ee53-41c0-8fdf-3f33f89ebebe: Claiming unknown Dec 6 05:24:58 localhost systemd-udevd[260114]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:24:59 localhost journal[186952]: ethtool ioctl error on tapa47e84c3-ee: No such device Dec 6 05:24:59 localhost ovn_controller[131684]: 2025-12-06T10:24:59Z|00373|binding|INFO|Setting lport a47e84c3-ee53-41c0-8fdf-3f33f89ebebe ovn-installed in OVS Dec 6 05:24:59 localhost nova_compute[237281]: 2025-12-06 10:24:59.010 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:59 localhost journal[186952]: ethtool ioctl error on tapa47e84c3-ee: No such device Dec 6 05:24:59 localhost journal[186952]: ethtool ioctl error on tapa47e84c3-ee: No such device Dec 6 05:24:59 localhost journal[186952]: ethtool ioctl error on tapa47e84c3-ee: No such device Dec 6 05:24:59 localhost journal[186952]: ethtool ioctl error on tapa47e84c3-ee: No such device Dec 6 05:24:59 localhost journal[186952]: ethtool ioctl error on tapa47e84c3-ee: No such device Dec 6 05:24:59 localhost journal[186952]: ethtool ioctl error on tapa47e84c3-ee: No such device Dec 6 05:24:59 localhost nova_compute[237281]: 2025-12-06 10:24:59.048 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:59 localhost journal[186952]: ethtool ioctl error on tapa47e84c3-ee: No such device Dec 6 05:24:59 localhost ovn_controller[131684]: 2025-12-06T10:24:59Z|00374|binding|INFO|Setting lport a47e84c3-ee53-41c0-8fdf-3f33f89ebebe up in Southbound Dec 6 05:24:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:59.065 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], 
ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-9817d242-662d-4582-8688-d255fc1e06de', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9817d242-662d-4582-8688-d255fc1e06de', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5edd94b688c144078712f567f790b3e9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=84f57985-5441-4394-8428-4c50a8122e29, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a47e84c3-ee53-41c0-8fdf-3f33f89ebebe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:24:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:59.067 137259 INFO neutron.agent.ovn.metadata.agent [-] Port a47e84c3-ee53-41c0-8fdf-3f33f89ebebe in datapath 9817d242-662d-4582-8688-d255fc1e06de bound to our chassis#033[00m Dec 6 05:24:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:59.069 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9817d242-662d-4582-8688-d255fc1e06de or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:24:59 localhost ovn_metadata_agent[137254]: 2025-12-06 10:24:59.070 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[198911bb-510c-43db-8eef-5fc265ab0609]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:24:59 localhost nova_compute[237281]: 2025-12-06 10:24:59.083 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:59 localhost nova_compute[237281]: 2025-12-06 10:24:59.245 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:24:59 localhost nova_compute[237281]: 2025-12-06 10:24:59.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:24:59 localhost nova_compute[237281]: 2025-12-06 10:24:59.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:24:59 localhost nova_compute[237281]: 2025-12-06 10:24:59.888 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:24:59 localhost podman[260185]: Dec 6 05:24:59 localhost podman[260185]: 2025-12-06 10:24:59.992596866 +0000 UTC m=+0.094129501 container create 9aa9772b0d5c4fa01c4089dfdd5613ff2a5914b53f7f68f7a0f28a99ec66910e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9817d242-662d-4582-8688-d255fc1e06de, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Dec 6 05:25:00 localhost systemd[1]: Started 
libpod-conmon-9aa9772b0d5c4fa01c4089dfdd5613ff2a5914b53f7f68f7a0f28a99ec66910e.scope. Dec 6 05:25:00 localhost podman[260185]: 2025-12-06 10:24:59.944773073 +0000 UTC m=+0.046305748 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:25:00 localhost systemd[1]: Started libcrun container. Dec 6 05:25:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f41295a9174b7be42f1016ee9145cdf564f6dccf725c161ab953f6544fedeb67/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:25:00 localhost podman[260185]: 2025-12-06 10:25:00.069493024 +0000 UTC m=+0.171025649 container init 9aa9772b0d5c4fa01c4089dfdd5613ff2a5914b53f7f68f7a0f28a99ec66910e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9817d242-662d-4582-8688-d255fc1e06de, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:25:00 localhost podman[260185]: 2025-12-06 10:25:00.080150292 +0000 UTC m=+0.181682927 container start 9aa9772b0d5c4fa01c4089dfdd5613ff2a5914b53f7f68f7a0f28a99ec66910e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9817d242-662d-4582-8688-d255fc1e06de, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Dec 6 05:25:00 localhost dnsmasq[260204]: started, version 2.85 cachesize 150 Dec 6 05:25:00 localhost 
dnsmasq[260204]: DNS service limited to local subnets Dec 6 05:25:00 localhost dnsmasq[260204]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:25:00 localhost dnsmasq[260204]: warning: no upstream servers configured Dec 6 05:25:00 localhost dnsmasq-dhcp[260204]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:25:00 localhost dnsmasq[260204]: read /var/lib/neutron/dhcp/9817d242-662d-4582-8688-d255fc1e06de/addn_hosts - 0 addresses Dec 6 05:25:00 localhost dnsmasq-dhcp[260204]: read /var/lib/neutron/dhcp/9817d242-662d-4582-8688-d255fc1e06de/host Dec 6 05:25:00 localhost dnsmasq-dhcp[260204]: read /var/lib/neutron/dhcp/9817d242-662d-4582-8688-d255fc1e06de/opts Dec 6 05:25:00 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:00.576 2 INFO neutron.agent.securitygroups_rpc [None req-9a3047f4-7cc2-465d-8927-04cf81a1364d 8a4ff853ee5d42418d566e341750e650 8d53f9f811864008924718cc1c15ec91 - - default default] Security group member updated ['b6d839ba-ccd0-4121-8dfd-bf18fa4ee2f7']#033[00m Dec 6 05:25:00 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:00.722 219384 INFO neutron.agent.dhcp.agent [None req-a11fc062-e0ad-4faf-bb2b-991bf698f554 - - - - - -] DHCP configuration for ports {'25ef3445-16de-4491-bfd4-408b802ddb0a'} is completed#033[00m Dec 6 05:25:00 localhost sshd[260205]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:25:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57185 DF PROTO=TCP SPT=47484 DPT=9102 SEQ=3176948678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDFC8C70000000001030307) Dec 6 05:25:01 localhost nova_compute[237281]: 2025-12-06 10:25:01.187 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock 
"refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:25:01 localhost nova_compute[237281]: 2025-12-06 10:25:01.189 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:25:01 localhost nova_compute[237281]: 2025-12-06 10:25:01.189 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:25:01 localhost nova_compute[237281]: 2025-12-06 10:25:01.190 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:25:01 localhost nova_compute[237281]: 2025-12-06 10:25:01.391 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:02 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:02.098 2 INFO neutron.agent.securitygroups_rpc [None req-2eb41dbf-c865-4664-944e-40e716b3300b 8a4ff853ee5d42418d566e341750e650 8d53f9f811864008924718cc1c15ec91 - - default default] Security group member updated ['b6d839ba-ccd0-4121-8dfd-bf18fa4ee2f7']#033[00m Dec 6 05:25:02 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:02.463 2 INFO neutron.agent.securitygroups_rpc [None req-0ad5f16d-56d6-4ff7-95e2-e3e6185aad9c 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated 
['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:25:04 localhost nova_compute[237281]: 2025-12-06 10:25:04.246 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:25:05 localhost podman[260206]: 2025-12-06 10:25:05.560840993 +0000 UTC m=+0.092944904 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:25:05 localhost podman[260206]: 2025-12-06 10:25:05.609325427 +0000 UTC m=+0.141403917 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2) Dec 6 05:25:05 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:25:06 localhost nova_compute[237281]: 2025-12-06 10:25:06.395 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:06.709 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:25:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:06.710 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:25:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:06.711 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:25:09 localhost nova_compute[237281]: 2025-12-06 10:25:09.247 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:25:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:25:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57186 DF PROTO=TCP SPT=47484 DPT=9102 SEQ=3176948678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DDFE9880000000001030307) Dec 6 05:25:09 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:09.556 2 INFO neutron.agent.securitygroups_rpc [None req-3824ff56-56f1-4079-b90b-ed840f9a1121 8a4ff853ee5d42418d566e341750e650 8d53f9f811864008924718cc1c15ec91 - - default default] Security group member updated ['b6d839ba-ccd0-4121-8dfd-bf18fa4ee2f7']#033[00m Dec 6 05:25:09 localhost podman[260231]: 2025-12-06 10:25:09.56834172 +0000 UTC m=+0.095359418 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 
05:25:09 localhost podman[260231]: 2025-12-06 10:25:09.574160409 +0000 UTC m=+0.101178107 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:25:09 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:25:09 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:09.592 219384 INFO neutron.agent.linux.ip_lib [None req-86031fb1-fb9b-47a8-aba6-27f76f2509fd - - - - - -] Device tapcbf4fe17-83 cannot be used as it has no MAC address#033[00m Dec 6 05:25:09 localhost nova_compute[237281]: 2025-12-06 10:25:09.616 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:09 localhost kernel: device tapcbf4fe17-83 entered promiscuous mode Dec 6 05:25:09 localhost NetworkManager[5965]: [1765016709.6243] manager: (tapcbf4fe17-83): new Generic device (/org/freedesktop/NetworkManager/Devices/65) Dec 6 05:25:09 localhost nova_compute[237281]: 2025-12-06 10:25:09.624 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:09 localhost ovn_controller[131684]: 2025-12-06T10:25:09Z|00375|binding|INFO|Claiming lport cbf4fe17-83e2-4dfe-b81b-2ba1b56bd6de for this chassis. Dec 6 05:25:09 localhost ovn_controller[131684]: 2025-12-06T10:25:09Z|00376|binding|INFO|cbf4fe17-83e2-4dfe-b81b-2ba1b56bd6de: Claiming unknown Dec 6 05:25:09 localhost systemd-udevd[260276]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:25:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:09.644 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f44bdb12-81c1-44f8-a5db-842c2a85fd29', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f44bdb12-81c1-44f8-a5db-842c2a85fd29', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f53d29120b44434860c4dafb30d2afc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d351122d-d4c7-44a8-9afa-63d8054d673b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cbf4fe17-83e2-4dfe-b81b-2ba1b56bd6de) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:25:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:09.646 137259 INFO neutron.agent.ovn.metadata.agent [-] Port cbf4fe17-83e2-4dfe-b81b-2ba1b56bd6de in datapath f44bdb12-81c1-44f8-a5db-842c2a85fd29 bound to our chassis#033[00m Dec 6 05:25:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:09.648 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f44bdb12-81c1-44f8-a5db-842c2a85fd29 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:25:09 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:09.649 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[6982ecab-e430-49a3-b796-21f8ccf27eb1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:25:09 localhost journal[186952]: ethtool ioctl error on tapcbf4fe17-83: No such device Dec 6 05:25:09 localhost ovn_controller[131684]: 2025-12-06T10:25:09Z|00377|binding|INFO|Setting lport cbf4fe17-83e2-4dfe-b81b-2ba1b56bd6de ovn-installed in OVS Dec 6 05:25:09 localhost ovn_controller[131684]: 2025-12-06T10:25:09Z|00378|binding|INFO|Setting lport cbf4fe17-83e2-4dfe-b81b-2ba1b56bd6de up in Southbound Dec 6 05:25:09 localhost nova_compute[237281]: 2025-12-06 10:25:09.660 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:09 localhost journal[186952]: ethtool ioctl error on tapcbf4fe17-83: No such device Dec 6 05:25:09 localhost journal[186952]: ethtool ioctl error on tapcbf4fe17-83: No such device Dec 6 05:25:09 localhost journal[186952]: ethtool ioctl error on tapcbf4fe17-83: No such device Dec 6 05:25:09 localhost journal[186952]: ethtool ioctl error on tapcbf4fe17-83: No such device Dec 6 05:25:09 localhost journal[186952]: ethtool ioctl error on tapcbf4fe17-83: No such device Dec 6 05:25:09 localhost journal[186952]: ethtool ioctl error on tapcbf4fe17-83: No such device Dec 6 05:25:09 localhost podman[260232]: 2025-12-06 10:25:09.690928845 +0000 UTC m=+0.214420565 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:25:09 localhost journal[186952]: ethtool ioctl error on tapcbf4fe17-83: No such device Dec 6 05:25:09 localhost podman[260232]: 2025-12-06 10:25:09.70018516 +0000 UTC m=+0.223676880 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:25:09 localhost nova_compute[237281]: 2025-12-06 10:25:09.706 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:09 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:25:09 localhost nova_compute[237281]: 2025-12-06 10:25:09.791 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:10 localhost podman[260355]: Dec 6 05:25:10 localhost podman[260355]: 2025-12-06 10:25:10.640007866 +0000 UTC m=+0.093182181 container create d5bbbb76f048bd5b19ab76105097131ae261d3b574a5b28730b25b3b1a55c3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125) Dec 6 05:25:10 localhost podman[260355]: 2025-12-06 10:25:10.592966717 +0000 UTC m=+0.046141062 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:25:10 localhost systemd[1]: Started libpod-conmon-d5bbbb76f048bd5b19ab76105097131ae261d3b574a5b28730b25b3b1a55c3af.scope. Dec 6 05:25:10 localhost systemd[1]: Started libcrun container. 
Dec 6 05:25:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/771fa5902e276fee2a7e5609cce30eb685cb89372f6f3376df19e2a14813ed38/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:25:10 localhost podman[260355]: 2025-12-06 10:25:10.729494952 +0000 UTC m=+0.182669267 container init d5bbbb76f048bd5b19ab76105097131ae261d3b574a5b28730b25b3b1a55c3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 6 05:25:10 localhost podman[260355]: 2025-12-06 10:25:10.739770188 +0000 UTC m=+0.192944503 container start d5bbbb76f048bd5b19ab76105097131ae261d3b574a5b28730b25b3b1a55c3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 05:25:10 localhost nova_compute[237281]: 2025-12-06 10:25:10.740 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", 
"subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:25:10 localhost dnsmasq[260374]: started, version 2.85 cachesize 150 Dec 6 05:25:10 localhost dnsmasq[260374]: DNS service limited to local subnets Dec 6 05:25:10 localhost dnsmasq[260374]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:25:10 localhost dnsmasq[260374]: warning: no upstream servers configured Dec 6 05:25:10 localhost dnsmasq-dhcp[260374]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:25:10 localhost dnsmasq[260374]: read /var/lib/neutron/dhcp/f44bdb12-81c1-44f8-a5db-842c2a85fd29/addn_hosts - 0 addresses Dec 6 05:25:10 localhost dnsmasq-dhcp[260374]: read /var/lib/neutron/dhcp/f44bdb12-81c1-44f8-a5db-842c2a85fd29/host Dec 6 05:25:10 localhost dnsmasq-dhcp[260374]: read /var/lib/neutron/dhcp/f44bdb12-81c1-44f8-a5db-842c2a85fd29/opts Dec 6 05:25:10 localhost nova_compute[237281]: 2025-12-06 10:25:10.781 
237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:25:10 localhost nova_compute[237281]: 2025-12-06 10:25:10.782 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:25:10 localhost nova_compute[237281]: 2025-12-06 10:25:10.783 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:25:10 localhost nova_compute[237281]: 2025-12-06 10:25:10.783 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:25:10 localhost nova_compute[237281]: 2025-12-06 10:25:10.810 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:25:10 localhost nova_compute[237281]: 2025-12-06 10:25:10.811 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m 
Dec 6 05:25:10 localhost nova_compute[237281]: 2025-12-06 10:25:10.811 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:25:10 localhost nova_compute[237281]: 2025-12-06 10:25:10.812 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:25:10 localhost nova_compute[237281]: 2025-12-06 10:25:10.898 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:25:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:10.936 219384 INFO neutron.agent.dhcp.agent [None req-17734ec1-0615-41b4-9f8a-89d4972d2a34 - - - - - -] DHCP configuration for ports {'4bd798ac-b853-49a9-b1c3-d20c04e752f1'} is completed#033[00m Dec 6 05:25:10 localhost nova_compute[237281]: 2025-12-06 10:25:10.977 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m 
Dec 6 05:25:10 localhost nova_compute[237281]: 2025-12-06 10:25:10.980 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.061 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.062 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.121 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:25:11 localhost 
nova_compute[237281]: 2025-12-06 10:25:11.122 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:25:11 localhost dnsmasq[260374]: exiting on receipt of SIGTERM Dec 6 05:25:11 localhost podman[260396]: 2025-12-06 10:25:11.135989731 +0000 UTC m=+0.066880070 container kill d5bbbb76f048bd5b19ab76105097131ae261d3b574a5b28730b25b3b1a55c3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:25:11 localhost systemd[1]: libpod-d5bbbb76f048bd5b19ab76105097131ae261d3b574a5b28730b25b3b1a55c3af.scope: Deactivated successfully. 
Dec 6 05:25:11 localhost podman[260420]: 2025-12-06 10:25:11.197816715 +0000 UTC m=+0.041045824 container died d5bbbb76f048bd5b19ab76105097131ae261d3b574a5b28730b25b3b1a55c3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.206 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:25:11 localhost podman[260420]: 2025-12-06 10:25:11.303774589 +0000 UTC m=+0.147003648 container remove d5bbbb76f048bd5b19ab76105097131ae261d3b574a5b28730b25b3b1a55c3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.305 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:11 localhost systemd[1]: libpod-conmon-d5bbbb76f048bd5b19ab76105097131ae261d3b574a5b28730b25b3b1a55c3af.scope: Deactivated successfully. Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.397 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.454 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.455 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12302MB free_disk=387.26609802246094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.455 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.456 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.585 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance 
a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.586 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.586 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:25:11 localhost systemd[1]: var-lib-containers-storage-overlay-771fa5902e276fee2a7e5609cce30eb685cb89372f6f3376df19e2a14813ed38-merged.mount: Deactivated successfully. Dec 6 05:25:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5bbbb76f048bd5b19ab76105097131ae261d3b574a5b28730b25b3b1a55c3af-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.652 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.671 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.675 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:25:11 localhost nova_compute[237281]: 2025-12-06 10:25:11.676 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:25:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:12.418 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: 
SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:25:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:12.420 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:25:12 localhost nova_compute[237281]: 2025-12-06 10:25:12.448 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:12 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:12.627 2 INFO neutron.agent.securitygroups_rpc [None req-74ff0aa1-3352-4807-b827-3f0acbbf4450 8a4ff853ee5d42418d566e341750e650 8d53f9f811864008924718cc1c15ec91 - - default default] Security group member updated ['b6d839ba-ccd0-4121-8dfd-bf18fa4ee2f7']#033[00m Dec 6 05:25:12 localhost podman[260495]: Dec 6 05:25:12 localhost podman[260495]: 2025-12-06 10:25:12.988964081 +0000 UTC m=+0.080851691 container create f9e16b64f5e49cc1a00e1a3fe53c36ba05030def6139f6d0528e99af8f31c660 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Dec 6 05:25:13 localhost systemd[1]: Started libpod-conmon-f9e16b64f5e49cc1a00e1a3fe53c36ba05030def6139f6d0528e99af8f31c660.scope. Dec 6 05:25:13 localhost systemd[1]: Started libcrun container. Dec 6 05:25:13 localhost podman[260495]: 2025-12-06 10:25:12.951944541 +0000 UTC m=+0.043832161 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:25:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c6294039cb6d6d67f95e65a251717bbbae50ef46b22e9ec6fb2da97270a688d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:25:13 localhost podman[260495]: 2025-12-06 10:25:13.062029722 +0000 UTC m=+0.153917342 container init f9e16b64f5e49cc1a00e1a3fe53c36ba05030def6139f6d0528e99af8f31c660 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:25:13 localhost podman[260495]: 2025-12-06 10:25:13.071114471 +0000 UTC m=+0.163002081 container start f9e16b64f5e49cc1a00e1a3fe53c36ba05030def6139f6d0528e99af8f31c660 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS) Dec 6 05:25:13 localhost dnsmasq[260513]: started, version 2.85 cachesize 150 Dec 6 05:25:13 localhost dnsmasq[260513]: DNS service limited to local subnets Dec 6 05:25:13 localhost dnsmasq[260513]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:25:13 localhost dnsmasq[260513]: warning: no upstream servers configured Dec 6 05:25:13 localhost dnsmasq-dhcp[260513]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:25:13 localhost dnsmasq-dhcp[260513]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:25:13 localhost dnsmasq[260513]: read /var/lib/neutron/dhcp/f44bdb12-81c1-44f8-a5db-842c2a85fd29/addn_hosts - 0 addresses Dec 6 05:25:13 localhost dnsmasq-dhcp[260513]: read /var/lib/neutron/dhcp/f44bdb12-81c1-44f8-a5db-842c2a85fd29/host Dec 6 05:25:13 localhost dnsmasq-dhcp[260513]: read /var/lib/neutron/dhcp/f44bdb12-81c1-44f8-a5db-842c2a85fd29/opts Dec 6 05:25:13 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:13.296 219384 INFO neutron.agent.dhcp.agent [None req-7fd17ad7-495a-455b-93d0-3c2e7987b82f - - - - - -] DHCP configuration for ports {'cbf4fe17-83e2-4dfe-b81b-2ba1b56bd6de', '4bd798ac-b853-49a9-b1c3-d20c04e752f1'} is completed#033[00m Dec 6 05:25:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:25:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:25:13 localhost podman[260514]: 2025-12-06 10:25:13.546587116 +0000 UTC m=+0.077607532 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:25:13 localhost podman[260515]: 2025-12-06 10:25:13.557084569 +0000 UTC m=+0.081906844 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm) Dec 6 05:25:13 localhost podman[260514]: 2025-12-06 10:25:13.582423279 +0000 UTC m=+0.113443645 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, managed_by=edpm_ansible) Dec 6 05:25:13 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:25:13 localhost podman[260515]: 2025-12-06 10:25:13.59739076 +0000 UTC m=+0.122212975 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:25:13 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:25:14 localhost nova_compute[237281]: 2025-12-06 10:25:14.249 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:16 localhost openstack_network_exporter[199751]: ERROR 10:25:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:25:16 localhost openstack_network_exporter[199751]: ERROR 10:25:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:25:16 localhost openstack_network_exporter[199751]: ERROR 10:25:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:25:16 localhost openstack_network_exporter[199751]: ERROR 10:25:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:25:16 localhost openstack_network_exporter[199751]: Dec 6 05:25:16 localhost openstack_network_exporter[199751]: ERROR 10:25:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:25:16 localhost 
openstack_network_exporter[199751]: Dec 6 05:25:16 localhost nova_compute[237281]: 2025-12-06 10:25:16.419 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:17 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:17.138 219384 INFO neutron.agent.linux.ip_lib [None req-76bd98aa-ba37-4ff3-8e6d-3e521917c6fe - - - - - -] Device tap8c8edf47-77 cannot be used as it has no MAC address#033[00m Dec 6 05:25:17 localhost nova_compute[237281]: 2025-12-06 10:25:17.163 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:17 localhost kernel: device tap8c8edf47-77 entered promiscuous mode Dec 6 05:25:17 localhost NetworkManager[5965]: [1765016717.1714] manager: (tap8c8edf47-77): new Generic device (/org/freedesktop/NetworkManager/Devices/66) Dec 6 05:25:17 localhost ovn_controller[131684]: 2025-12-06T10:25:17Z|00379|binding|INFO|Claiming lport 8c8edf47-776a-43a7-a0d9-5ec1830d973f for this chassis. Dec 6 05:25:17 localhost ovn_controller[131684]: 2025-12-06T10:25:17Z|00380|binding|INFO|8c8edf47-776a-43a7-a0d9-5ec1830d973f: Claiming unknown Dec 6 05:25:17 localhost nova_compute[237281]: 2025-12-06 10:25:17.174 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:17 localhost systemd-udevd[260563]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:25:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:17.184 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6e5f2aeaf52490d9822161edabfbbe5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97c05b08-01e2-4e43-90f9-5f13487bccae, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8c8edf47-776a-43a7-a0d9-5ec1830d973f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:25:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:17.186 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 8c8edf47-776a-43a7-a0d9-5ec1830d973f in datapath b2b5c9fc-9760-4e26-8e53-de2c63f0e07c bound to our chassis#033[00m Dec 6 05:25:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:17.189 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6013e3e7-39ae-40dd-8bf9-9cf6091585b5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:25:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:17.189 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2b5c9fc-9760-4e26-8e53-de2c63f0e07c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:25:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:17.189 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[17a7a780-b683-4a8f-abd4-ff930fe2bed0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:25:17 localhost journal[186952]: ethtool ioctl error on tap8c8edf47-77: No such device Dec 6 05:25:17 localhost ovn_controller[131684]: 2025-12-06T10:25:17Z|00381|binding|INFO|Setting lport 8c8edf47-776a-43a7-a0d9-5ec1830d973f ovn-installed in OVS Dec 6 05:25:17 localhost ovn_controller[131684]: 2025-12-06T10:25:17Z|00382|binding|INFO|Setting lport 8c8edf47-776a-43a7-a0d9-5ec1830d973f up in Southbound Dec 6 05:25:17 localhost nova_compute[237281]: 2025-12-06 10:25:17.208 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:17 localhost nova_compute[237281]: 2025-12-06 10:25:17.209 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:17 localhost journal[186952]: ethtool ioctl error on tap8c8edf47-77: No such device Dec 6 05:25:17 localhost journal[186952]: ethtool ioctl error on tap8c8edf47-77: No such device Dec 6 05:25:17 localhost journal[186952]: ethtool ioctl error on tap8c8edf47-77: No such device Dec 6 05:25:17 localhost journal[186952]: ethtool ioctl error on tap8c8edf47-77: No such device Dec 6 05:25:17 localhost journal[186952]: ethtool ioctl error on tap8c8edf47-77: No such device 
Dec 6 05:25:17 localhost journal[186952]: ethtool ioctl error on tap8c8edf47-77: No such device Dec 6 05:25:17 localhost journal[186952]: ethtool ioctl error on tap8c8edf47-77: No such device Dec 6 05:25:17 localhost nova_compute[237281]: 2025-12-06 10:25:17.260 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:17 localhost nova_compute[237281]: 2025-12-06 10:25:17.290 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:17 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:17.421 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:25:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:18.110 219384 INFO neutron.agent.linux.ip_lib [None req-59394ae1-1c5c-4890-9a57-50897a13a52f - - - - - -] Device tap7c5159c1-bd cannot be used as it has no MAC address#033[00m Dec 6 05:25:18 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:18.149 2 INFO neutron.agent.securitygroups_rpc [None req-198b5302-0f0f-4885-859d-81e3a78ea36a 8a4ff853ee5d42418d566e341750e650 8d53f9f811864008924718cc1c15ec91 - - default default] Security group member updated ['b6d839ba-ccd0-4121-8dfd-bf18fa4ee2f7']#033[00m Dec 6 05:25:18 localhost nova_compute[237281]: 2025-12-06 10:25:18.162 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:18 localhost kernel: device tap7c5159c1-bd entered promiscuous mode Dec 6 05:25:18 localhost systemd-udevd[260565]: Network interface NamePolicy= 
disabled on kernel command line. Dec 6 05:25:18 localhost NetworkManager[5965]: [1765016718.1723] manager: (tap7c5159c1-bd): new Generic device (/org/freedesktop/NetworkManager/Devices/67) Dec 6 05:25:18 localhost ovn_controller[131684]: 2025-12-06T10:25:18Z|00383|binding|INFO|Claiming lport 7c5159c1-bdcd-437d-bee5-b12f4e51cd5f for this chassis. Dec 6 05:25:18 localhost ovn_controller[131684]: 2025-12-06T10:25:18Z|00384|binding|INFO|7c5159c1-bdcd-437d-bee5-b12f4e51cd5f: Claiming unknown Dec 6 05:25:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:18.187 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f8ef38a4bec46d18248142804d6d2a3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6d270b4-86bf-4d3c-9534-fc16b2336e09, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=7c5159c1-bdcd-437d-bee5-b12f4e51cd5f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:25:18 localhost nova_compute[237281]: 2025-12-06 10:25:18.185 237285 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:18.189 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 7c5159c1-bdcd-437d-bee5-b12f4e51cd5f in datapath 9a9f536a-4201-4d67-a433-6077de86991e bound to our chassis#033[00m Dec 6 05:25:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:18.193 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4372fc8b-381e-4e0c-a172-ae8fb0ab8c0e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:25:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:18.193 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a9f536a-4201-4d67-a433-6077de86991e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:25:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:18.194 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[83b18a2c-7fd2-43b3-9381-fc9fb9bf7599]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:25:18 localhost ovn_controller[131684]: 2025-12-06T10:25:18Z|00385|binding|INFO|Setting lport 7c5159c1-bdcd-437d-bee5-b12f4e51cd5f ovn-installed in OVS Dec 6 05:25:18 localhost ovn_controller[131684]: 2025-12-06T10:25:18Z|00386|binding|INFO|Setting lport 7c5159c1-bdcd-437d-bee5-b12f4e51cd5f up in Southbound Dec 6 05:25:18 localhost nova_compute[237281]: 2025-12-06 10:25:18.220 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:18 localhost nova_compute[237281]: 2025-12-06 10:25:18.250 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:18 localhost podman[260642]: Dec 6 05:25:18 localhost nova_compute[237281]: 2025-12-06 10:25:18.277 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:18 localhost podman[260642]: 2025-12-06 10:25:18.282709234 +0000 UTC m=+0.127981013 container create 1d1b92b600bd0f4d8330e953343a8e84a3ed97c6e311f921806315bf2e15e90d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:25:18 localhost systemd[1]: Started libpod-conmon-1d1b92b600bd0f4d8330e953343a8e84a3ed97c6e311f921806315bf2e15e90d.scope. Dec 6 05:25:18 localhost podman[260642]: 2025-12-06 10:25:18.234536809 +0000 UTC m=+0.079808658 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:25:18 localhost systemd[1]: Started libcrun container. 
Dec 6 05:25:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1111130f339dd6d1501406b66e19da6b259c09f90b7d0f487505e8a05468c7aa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:25:18 localhost podman[260642]: 2025-12-06 10:25:18.36247742 +0000 UTC m=+0.207749209 container init 1d1b92b600bd0f4d8330e953343a8e84a3ed97c6e311f921806315bf2e15e90d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true) Dec 6 05:25:18 localhost podman[260642]: 2025-12-06 10:25:18.37090338 +0000 UTC m=+0.216175169 container start 1d1b92b600bd0f4d8330e953343a8e84a3ed97c6e311f921806315bf2e15e90d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:25:18 localhost dnsmasq[260671]: started, version 2.85 cachesize 150 Dec 6 05:25:18 localhost dnsmasq[260671]: DNS service limited to local subnets Dec 6 05:25:18 localhost dnsmasq[260671]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:25:18 localhost dnsmasq[260671]: warning: no upstream servers configured Dec 6 
05:25:18 localhost dnsmasq-dhcp[260671]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:25:18 localhost dnsmasq[260671]: read /var/lib/neutron/dhcp/b2b5c9fc-9760-4e26-8e53-de2c63f0e07c/addn_hosts - 0 addresses Dec 6 05:25:18 localhost dnsmasq-dhcp[260671]: read /var/lib/neutron/dhcp/b2b5c9fc-9760-4e26-8e53-de2c63f0e07c/host Dec 6 05:25:18 localhost dnsmasq-dhcp[260671]: read /var/lib/neutron/dhcp/b2b5c9fc-9760-4e26-8e53-de2c63f0e07c/opts Dec 6 05:25:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:18.496 219384 INFO neutron.agent.dhcp.agent [None req-90bdcfa3-7189-4e41-8f57-cc3647be0881 - - - - - -] DHCP configuration for ports {'360245db-607f-4c32-8f39-c8ea40b1ffb2'} is completed#033[00m Dec 6 05:25:18 localhost ovn_controller[131684]: 2025-12-06T10:25:18Z|00387|binding|INFO|Releasing lport 8c8edf47-776a-43a7-a0d9-5ec1830d973f from this chassis (sb_readonly=0) Dec 6 05:25:18 localhost nova_compute[237281]: 2025-12-06 10:25:18.598 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:18 localhost kernel: device tap8c8edf47-77 left promiscuous mode Dec 6 05:25:18 localhost ovn_controller[131684]: 2025-12-06T10:25:18Z|00388|binding|INFO|Setting lport 8c8edf47-776a-43a7-a0d9-5ec1830d973f down in Southbound Dec 6 05:25:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:18.611 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c', 
'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6e5f2aeaf52490d9822161edabfbbe5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=97c05b08-01e2-4e43-90f9-5f13487bccae, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8c8edf47-776a-43a7-a0d9-5ec1830d973f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:25:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:18.613 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 8c8edf47-776a-43a7-a0d9-5ec1830d973f in datapath b2b5c9fc-9760-4e26-8e53-de2c63f0e07c unbound from our chassis#033[00m Dec 6 05:25:18 localhost nova_compute[237281]: 2025-12-06 10:25:18.618 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:18.619 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2b5c9fc-9760-4e26-8e53-de2c63f0e07c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:25:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:18.620 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3aed79-3fce-46e3-a3b1-3451716ea9d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:25:18 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:18.649 2 INFO neutron.agent.securitygroups_rpc [None 
req-165ba62f-85b4-4b5a-afb7-b3f79bc3e47c 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['45a5f0d1-cf21-491e-b24d-588f351f8a2f']#033[00m Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.140 219384 INFO neutron.agent.linux.ip_lib [None req-f85e5f34-8407-4390-9089-1c3c2a21afb1 - - - - - -] Device tap90d31905-58 cannot be used as it has no MAC address#033[00m Dec 6 05:25:19 localhost nova_compute[237281]: 2025-12-06 10:25:19.167 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:19 localhost kernel: device tap90d31905-58 entered promiscuous mode Dec 6 05:25:19 localhost NetworkManager[5965]: [1765016719.1741] manager: (tap90d31905-58): new Generic device (/org/freedesktop/NetworkManager/Devices/68) Dec 6 05:25:19 localhost ovn_controller[131684]: 2025-12-06T10:25:19Z|00389|binding|INFO|Claiming lport 90d31905-5871-4131-90ad-d765aa444f8c for this chassis. 
Dec 6 05:25:19 localhost ovn_controller[131684]: 2025-12-06T10:25:19Z|00390|binding|INFO|90d31905-5871-4131-90ad-d765aa444f8c: Claiming unknown Dec 6 05:25:19 localhost nova_compute[237281]: 2025-12-06 10:25:19.209 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:19 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:19.229 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/16', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c3165881009544ef8d4fad0a69ab4f02', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18be9024-5981-4e7a-b20d-7bf2208c05c0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=90d31905-5871-4131-90ad-d765aa444f8c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:25:19 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:19.231 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 90d31905-5871-4131-90ad-d765aa444f8c in datapath 54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6 
bound to our chassis#033[00m Dec 6 05:25:19 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:19.233 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:25:19 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:19.233 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[8d905390-1795-43c3-9c12-3467d54ffff0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:25:19 localhost ovn_controller[131684]: 2025-12-06T10:25:19Z|00391|binding|INFO|Setting lport 90d31905-5871-4131-90ad-d765aa444f8c ovn-installed in OVS Dec 6 05:25:19 localhost ovn_controller[131684]: 2025-12-06T10:25:19Z|00392|binding|INFO|Setting lport 90d31905-5871-4131-90ad-d765aa444f8c up in Southbound Dec 6 05:25:19 localhost nova_compute[237281]: 2025-12-06 10:25:19.237 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:19 localhost nova_compute[237281]: 2025-12-06 10:25:19.251 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:19 localhost nova_compute[237281]: 2025-12-06 10:25:19.252 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:19 localhost podman[260723]: Dec 6 05:25:19 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:19.280 2 INFO neutron.agent.securitygroups_rpc [None req-cf8c59c3-eceb-4c4b-aa5b-5bad663c66d0 8a4ff853ee5d42418d566e341750e650 8d53f9f811864008924718cc1c15ec91 - - default default] Security group member updated ['b6d839ba-ccd0-4121-8dfd-bf18fa4ee2f7']#033[00m Dec 6 
05:25:19 localhost podman[260723]: 2025-12-06 10:25:19.284027793 +0000 UTC m=+0.107196492 container create c12a252ddf68c68029b4deefda4f90adde9db30d572915cffee7be594452ed0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 6 05:25:19 localhost nova_compute[237281]: 2025-12-06 10:25:19.312 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:19 localhost systemd[1]: Started libpod-conmon-c12a252ddf68c68029b4deefda4f90adde9db30d572915cffee7be594452ed0c.scope. Dec 6 05:25:19 localhost podman[260723]: 2025-12-06 10:25:19.229707251 +0000 UTC m=+0.052876000 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:25:19 localhost nova_compute[237281]: 2025-12-06 10:25:19.346 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:19 localhost systemd[1]: Started libcrun container. 
Dec 6 05:25:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a583911b044e0b2e788e40d7f54d52bb5f5c73c301242378d54fea2dbb513aea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:25:19 localhost podman[260723]: 2025-12-06 10:25:19.389675367 +0000 UTC m=+0.212844066 container init c12a252ddf68c68029b4deefda4f90adde9db30d572915cffee7be594452ed0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:25:19 localhost dnsmasq[260752]: started, version 2.85 cachesize 150 Dec 6 05:25:19 localhost podman[260723]: 2025-12-06 10:25:19.402265325 +0000 UTC m=+0.225434024 container start c12a252ddf68c68029b4deefda4f90adde9db30d572915cffee7be594452ed0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:25:19 localhost dnsmasq[260752]: DNS service limited to local subnets Dec 6 05:25:19 localhost dnsmasq[260752]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:25:19 localhost dnsmasq[260752]: warning: no upstream servers configured Dec 
6 05:25:19 localhost dnsmasq-dhcp[260752]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:25:19 localhost dnsmasq[260752]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 0 addresses Dec 6 05:25:19 localhost dnsmasq-dhcp[260752]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host Dec 6 05:25:19 localhost dnsmasq-dhcp[260752]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.465 219384 INFO neutron.agent.dhcp.agent [None req-315cd43d-b596-4be7-ac98-ba09854e67f8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:25:18Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b00b2743-5b26-4292-bc21-d89c0fd85077, ip_allocation=immediate, mac_address=fa:16:3e:92:6f:6d, name=tempest-PortsTestJSON-707244245, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:22:26Z, description=, dns_domain=, id=9a9f536a-4201-4d67-a433-6077de86991e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-306893013, port_security_enabled=True, project_id=3f8ef38a4bec46d18248142804d6d2a3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42595, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1621, status=ACTIVE, subnets=['94fe7d43-91b4-46da-b913-9c0a6123d1f4'], tags=[], tenant_id=3f8ef38a4bec46d18248142804d6d2a3, updated_at=2025-12-06T10:25:16Z, vlan_transparent=None, network_id=9a9f536a-4201-4d67-a433-6077de86991e, port_security_enabled=True, project_id=3f8ef38a4bec46d18248142804d6d2a3, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['45a5f0d1-cf21-491e-b24d-588f351f8a2f'], standard_attr_id=2153, status=DOWN, tags=[], tenant_id=3f8ef38a4bec46d18248142804d6d2a3, updated_at=2025-12-06T10:25:18Z on network 9a9f536a-4201-4d67-a433-6077de86991e#033[00m Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.600 219384 INFO neutron.agent.dhcp.agent [None req-843e6798-b970-4f67-8459-184bd506bfcc - - - - - -] DHCP configuration for ports {'f586d69b-90ba-43e8-b402-069c222110c3', '26d09c5d-5602-417f-baf9-28f7a2e5060c'} is completed#033[00m Dec 6 05:25:19 localhost dnsmasq[260752]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 1 addresses Dec 6 05:25:19 localhost dnsmasq-dhcp[260752]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host Dec 6 05:25:19 localhost dnsmasq-dhcp[260752]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts Dec 6 05:25:19 localhost podman[260780]: 2025-12-06 10:25:19.656778374 +0000 UTC m=+0.047734842 container kill c12a252ddf68c68029b4deefda4f90adde9db30d572915cffee7be594452ed0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125) Dec 6 05:25:19 localhost dnsmasq[260671]: read /var/lib/neutron/dhcp/b2b5c9fc-9760-4e26-8e53-de2c63f0e07c/addn_hosts - 0 addresses Dec 6 05:25:19 localhost dnsmasq-dhcp[260671]: read /var/lib/neutron/dhcp/b2b5c9fc-9760-4e26-8e53-de2c63f0e07c/host Dec 6 05:25:19 localhost podman[260824]: 2025-12-06 10:25:19.863365966 +0000 UTC m=+0.065249701 
container kill 1d1b92b600bd0f4d8330e953343a8e84a3ed97c6e311f921806315bf2e15e90d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:25:19 localhost dnsmasq-dhcp[260671]: read /var/lib/neutron/dhcp/b2b5c9fc-9760-4e26-8e53-de2c63f0e07c/opts Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent [None req-f9e9b291-321f-4cd2-a921-ceb12e8f453f - - - - - -] Unable to reload_allocations dhcp for b2b5c9fc-9760-4e26-8e53-de2c63f0e07c.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap8c8edf47-77 not found in namespace qdhcp-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c. 
Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Dec 6 05:25:19 localhost 
neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent return fut.result() Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent return self.__get_result() Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent raise self._exception Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR 
neutron.agent.dhcp.agent raise exc_type(*result[2]) Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap8c8edf47-77 not found in namespace qdhcp-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c. Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.889 219384 ERROR neutron.agent.dhcp.agent #033[00m Dec 6 05:25:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:19.994 219384 INFO neutron.agent.dhcp.agent [None req-a812ea2c-9e10-4ba5-be9e-d902eafc1118 - - - - - -] DHCP configuration for ports {'b00b2743-5b26-4292-bc21-d89c0fd85077'} is completed#033[00m Dec 6 05:25:20 localhost systemd[1]: tmp-crun.fpsxuY.mount: Deactivated successfully. Dec 6 05:25:20 localhost podman[260865]: Dec 6 05:25:20 localhost podman[260865]: 2025-12-06 10:25:20.393798673 +0000 UTC m=+0.095760230 container create 0d6c946fd08b6631c22d8286efb69283eb0122b26fa9fdc22e8f7614a4aed655 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 6 05:25:20 localhost systemd[1]: Started libpod-conmon-0d6c946fd08b6631c22d8286efb69283eb0122b26fa9fdc22e8f7614a4aed655.scope. Dec 6 05:25:20 localhost podman[260865]: 2025-12-06 10:25:20.349927711 +0000 UTC m=+0.051889288 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:25:20 localhost systemd[1]: Started libcrun container. 
Dec 6 05:25:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66243641d3b5d8aa6169201bf046b527b1cafd96df10b7358361c114609468f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:25:20 localhost podman[260865]: 2025-12-06 10:25:20.469898346 +0000 UTC m=+0.171859913 container init 0d6c946fd08b6631c22d8286efb69283eb0122b26fa9fdc22e8f7614a4aed655 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:25:20 localhost podman[260865]: 2025-12-06 10:25:20.479488422 +0000 UTC m=+0.181449989 container start 0d6c946fd08b6631c22d8286efb69283eb0122b26fa9fdc22e8f7614a4aed655 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:25:20 localhost dnsmasq[260883]: started, version 2.85 cachesize 150 Dec 6 05:25:20 localhost dnsmasq[260883]: DNS service limited to local subnets Dec 6 05:25:20 localhost dnsmasq[260883]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:25:20 localhost dnsmasq[260883]: warning: no upstream servers configured Dec 
6 05:25:20 localhost dnsmasq-dhcp[260883]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:25:20 localhost dnsmasq[260883]: read /var/lib/neutron/dhcp/54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6/addn_hosts - 0 addresses Dec 6 05:25:20 localhost dnsmasq-dhcp[260883]: read /var/lib/neutron/dhcp/54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6/host Dec 6 05:25:20 localhost dnsmasq-dhcp[260883]: read /var/lib/neutron/dhcp/54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6/opts Dec 6 05:25:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:20.544 219384 INFO neutron.agent.dhcp.agent [None req-28bcf640-bdff-4d9b-8a76-9ce33919655f - - - - - -] Resizing dhcp processing queue green pool size to: 9#033[00m Dec 6 05:25:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:20.547 219384 INFO neutron.agent.dhcp.agent [None req-59568fd1-48e4-42d1-aa5b-602d7666bcc4 - - - - - -] Synchronizing state#033[00m Dec 6 05:25:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:20.708 219384 INFO neutron.agent.dhcp.agent [None req-0a68c474-dc73-446f-9a62-56b7b7681822 - - - - - -] DHCP configuration for ports {'3b685be2-df07-4e15-8d41-8332751d714a'} is completed#033[00m Dec 6 05:25:20 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:20.950 2 INFO neutron.agent.securitygroups_rpc [None req-752b535a-5d0b-4a7b-a058-90725181779f 8a4ff853ee5d42418d566e341750e650 8d53f9f811864008924718cc1c15ec91 - - default default] Security group member updated ['b6d839ba-ccd0-4121-8dfd-bf18fa4ee2f7']#033[00m Dec 6 05:25:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:20.953 219384 INFO neutron.agent.dhcp.agent [None req-e0a05e59-4e89-465b-b8a9-354d25878467 - - - - - -] All active networks have been fetched through RPC.#033[00m Dec 6 05:25:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:20.954 219384 INFO neutron.agent.dhcp.agent [-] Starting network b2b5c9fc-9760-4e26-8e53-de2c63f0e07c dhcp configuration#033[00m Dec 6 05:25:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 
10:25:20.955 219384 INFO neutron.agent.dhcp.agent [-] Finished network b2b5c9fc-9760-4e26-8e53-de2c63f0e07c dhcp configuration#033[00m Dec 6 05:25:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:20.955 219384 INFO neutron.agent.dhcp.agent [None req-e0a05e59-4e89-465b-b8a9-354d25878467 - - - - - -] Synchronizing state complete#033[00m Dec 6 05:25:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:21.022 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:cd:f0 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f8ef38a4bec46d18248142804d6d2a3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6d270b4-86bf-4d3c-9534-fc16b2336e09, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f586d69b-90ba-43e8-b402-069c222110c3) old=Port_Binding(mac=['fa:16:3e:c6:cd:f0 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a9f536a-4201-4d67-a433-6077de86991e', 
'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f8ef38a4bec46d18248142804d6d2a3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:25:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:21.025 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f586d69b-90ba-43e8-b402-069c222110c3 in datapath 9a9f536a-4201-4d67-a433-6077de86991e updated#033[00m Dec 6 05:25:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:21.030 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4372fc8b-381e-4e0c-a172-ae8fb0ab8c0e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:25:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:21.030 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a9f536a-4201-4d67-a433-6077de86991e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:25:21 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:21.031 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5a93ca-3304-43c5-82b9-5e6bdf3327b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:25:21 localhost dnsmasq[260671]: exiting on receipt of SIGTERM Dec 6 05:25:21 localhost podman[260913]: 2025-12-06 10:25:21.227735748 +0000 UTC m=+0.053472788 container kill 1d1b92b600bd0f4d8330e953343a8e84a3ed97c6e311f921806315bf2e15e90d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 05:25:21 localhost systemd[1]: libpod-1d1b92b600bd0f4d8330e953343a8e84a3ed97c6e311f921806315bf2e15e90d.scope: Deactivated successfully. Dec 6 05:25:21 localhost podman[260927]: 2025-12-06 10:25:21.245150904 +0000 UTC m=+0.045410150 container kill c12a252ddf68c68029b4deefda4f90adde9db30d572915cffee7be594452ed0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:25:21 localhost systemd[1]: libpod-c12a252ddf68c68029b4deefda4f90adde9db30d572915cffee7be594452ed0c.scope: Deactivated successfully. 
Dec 6 05:25:21 localhost dnsmasq[260752]: exiting on receipt of SIGTERM Dec 6 05:25:21 localhost podman[260956]: 2025-12-06 10:25:21.289698226 +0000 UTC m=+0.032585735 container died c12a252ddf68c68029b4deefda4f90adde9db30d572915cffee7be594452ed0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 05:25:21 localhost podman[260943]: 2025-12-06 10:25:21.304706888 +0000 UTC m=+0.054086857 container died 1d1b92b600bd0f4d8330e953343a8e84a3ed97c6e311f921806315bf2e15e90d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:25:21 localhost podman[260943]: 2025-12-06 10:25:21.349690833 +0000 UTC m=+0.099070782 container remove 1d1b92b600bd0f4d8330e953343a8e84a3ed97c6e311f921806315bf2e15e90d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2b5c9fc-9760-4e26-8e53-de2c63f0e07c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:25:21 localhost systemd[1]: libpod-conmon-1d1b92b600bd0f4d8330e953343a8e84a3ed97c6e311f921806315bf2e15e90d.scope: Deactivated successfully. Dec 6 05:25:21 localhost podman[260956]: 2025-12-06 10:25:21.36839657 +0000 UTC m=+0.111284099 container cleanup c12a252ddf68c68029b4deefda4f90adde9db30d572915cffee7be594452ed0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 05:25:21 localhost systemd[1]: libpod-conmon-c12a252ddf68c68029b4deefda4f90adde9db30d572915cffee7be594452ed0c.scope: Deactivated successfully. Dec 6 05:25:21 localhost podman[260957]: 2025-12-06 10:25:21.398520328 +0000 UTC m=+0.120917305 container remove c12a252ddf68c68029b4deefda4f90adde9db30d572915cffee7be594452ed0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0) Dec 6 05:25:21 localhost nova_compute[237281]: 2025-12-06 10:25:21.421 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:22 localhost systemd[1]: tmp-crun.c1oNHu.mount: Deactivated successfully. 
Dec 6 05:25:22 localhost systemd[1]: var-lib-containers-storage-overlay-a583911b044e0b2e788e40d7f54d52bb5f5c73c301242378d54fea2dbb513aea-merged.mount: Deactivated successfully. Dec 6 05:25:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c12a252ddf68c68029b4deefda4f90adde9db30d572915cffee7be594452ed0c-userdata-shm.mount: Deactivated successfully. Dec 6 05:25:22 localhost systemd[1]: var-lib-containers-storage-overlay-1111130f339dd6d1501406b66e19da6b259c09f90b7d0f487505e8a05468c7aa-merged.mount: Deactivated successfully. Dec 6 05:25:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1d1b92b600bd0f4d8330e953343a8e84a3ed97c6e311f921806315bf2e15e90d-userdata-shm.mount: Deactivated successfully. Dec 6 05:25:23 localhost podman[197801]: time="2025-12-06T10:25:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:25:23 localhost podman[197801]: @ - - [06/Dec/2025:10:25:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155186 "" "Go-http-client/1.1" Dec 6 05:25:23 localhost podman[197801]: @ - - [06/Dec/2025:10:25:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18826 "" "Go-http-client/1.1" Dec 6 05:25:23 localhost ovn_controller[131684]: 2025-12-06T10:25:23Z|00393|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:25:23 localhost systemd[1]: run-netns-qdhcp\x2db2b5c9fc\x2d9760\x2d4e26\x2d8e53\x2dde2c63f0e07c.mount: Deactivated successfully. 
Dec 6 05:25:23 localhost nova_compute[237281]: 2025-12-06 10:25:23.580 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:23 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:23.581 219384 INFO neutron.agent.dhcp.agent [None req-91473d8c-c755-478c-8c80-a3ce661a4d01 - - - - - -] Resizing dhcp processing queue green pool size to: 8#033[00m Dec 6 05:25:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42666 DF PROTO=TCP SPT=54696 DPT=9102 SEQ=1458137816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE0222A0000000001030307) Dec 6 05:25:24 localhost nova_compute[237281]: 2025-12-06 10:25:24.252 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:25:24 localhost podman[260990]: 2025-12-06 10:25:24.538727443 +0000 UTC m=+0.072566996 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc.) Dec 6 05:25:24 localhost podman[260990]: 2025-12-06 10:25:24.575887167 +0000 UTC m=+0.109726710 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, 
url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 6 05:25:24 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:25:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42667 DF PROTO=TCP SPT=54696 DPT=9102 SEQ=1458137816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE026470000000001030307) Dec 6 05:25:25 localhost podman[261059]: Dec 6 05:25:25 localhost podman[261059]: 2025-12-06 10:25:25.486531685 +0000 UTC m=+0.089697194 container create f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:25:25 localhost systemd[1]: Started libpod-conmon-f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3.scope. Dec 6 05:25:25 localhost podman[261059]: 2025-12-06 10:25:25.443796549 +0000 UTC m=+0.046962108 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:25:25 localhost systemd[1]: Started libcrun container. 
Dec 6 05:25:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dec99e4bbbbe7b23a8f5c79268503e62a27f9c7f350cf1242f79970c90c6b795/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:25:25 localhost podman[261059]: 2025-12-06 10:25:25.578494678 +0000 UTC m=+0.181660217 container init f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:25:25 localhost podman[261059]: 2025-12-06 10:25:25.589123415 +0000 UTC m=+0.192288954 container start f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2) Dec 6 05:25:25 localhost dnsmasq[261077]: started, version 2.85 cachesize 150 Dec 6 05:25:25 localhost dnsmasq[261077]: DNS service limited to local subnets Dec 6 05:25:25 localhost dnsmasq[261077]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:25:25 localhost dnsmasq[261077]: warning: no upstream servers configured Dec 
6 05:25:25 localhost dnsmasq-dhcp[261077]: DHCP, static leases only on 10.100.0.16, lease time 1d Dec 6 05:25:25 localhost dnsmasq-dhcp[261077]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:25:25 localhost dnsmasq[261077]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 1 addresses Dec 6 05:25:25 localhost dnsmasq-dhcp[261077]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host Dec 6 05:25:25 localhost dnsmasq-dhcp[261077]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts Dec 6 05:25:25 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:25.733 2 INFO neutron.agent.securitygroups_rpc [None req-1f0240fc-0099-413d-b05c-469cf1e197c6 8a4ff853ee5d42418d566e341750e650 8d53f9f811864008924718cc1c15ec91 - - default default] Security group member updated ['b6d839ba-ccd0-4121-8dfd-bf18fa4ee2f7']#033[00m Dec 6 05:25:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57187 DF PROTO=TCP SPT=47484 DPT=9102 SEQ=3176948678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE029870000000001030307) Dec 6 05:25:26 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:26.078 219384 INFO neutron.agent.dhcp.agent [None req-98cb1161-0605-43c8-8a6f-2d3d4efe72de - - - - - -] DHCP configuration for ports {'b00b2743-5b26-4292-bc21-d89c0fd85077', 'f586d69b-90ba-43e8-b402-069c222110c3', '26d09c5d-5602-417f-baf9-28f7a2e5060c', '7c5159c1-bdcd-437d-bee5-b12f4e51cd5f'} is completed#033[00m Dec 6 05:25:26 localhost nova_compute[237281]: 2025-12-06 10:25:26.423 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:26 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:26.511 2 INFO neutron.agent.securitygroups_rpc [None req-9686b872-8926-433a-9382-9db1f89c168a 
79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['45a5f0d1-cf21-491e-b24d-588f351f8a2f', 'a295a40c-f046-495e-a64e-4a70be3bd1ad']#033[00m Dec 6 05:25:26 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:26.628 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:25:18Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b00b2743-5b26-4292-bc21-d89c0fd85077, ip_allocation=immediate, mac_address=fa:16:3e:92:6f:6d, name=tempest-PortsTestJSON-1273673093, network_id=9a9f536a-4201-4d67-a433-6077de86991e, port_security_enabled=True, project_id=3f8ef38a4bec46d18248142804d6d2a3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['a295a40c-f046-495e-a64e-4a70be3bd1ad'], standard_attr_id=2153, status=DOWN, tags=[], tenant_id=3f8ef38a4bec46d18248142804d6d2a3, updated_at=2025-12-06T10:25:25Z on network 9a9f536a-4201-4d67-a433-6077de86991e#033[00m Dec 6 05:25:26 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:26.633 219384 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.dhcp_release_cmd', '--privsep_sock_path', '/tmp/tmpahi20kcz/privsep.sock']#033[00m Dec 6 05:25:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42668 DF PROTO=TCP SPT=54696 DPT=9102 SEQ=1458137816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1DE02E470000000001030307) Dec 6 05:25:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:27.291 219384 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Dec 6 05:25:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:27.142 261082 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 6 05:25:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:27.146 261082 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 6 05:25:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:27.148 261082 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Dec 6 05:25:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:27.148 261082 INFO oslo.privsep.daemon [-] privsep daemon running as pid 261082#033[00m Dec 6 05:25:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:25:27 localhost dnsmasq-dhcp[261077]: DHCPRELEASE(tap7c5159c1-bd) 10.100.0.8 fa:16:3e:92:6f:6d Dec 6 05:25:27 localhost podman[261086]: 2025-12-06 10:25:27.57016837 +0000 UTC m=+0.101721524 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:25:27 localhost podman[261086]: 2025-12-06 10:25:27.578031032 +0000 UTC m=+0.109584176 container exec_died 
979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:25:27 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:25:27 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:27.800 2 INFO neutron.agent.securitygroups_rpc [None req-43634988-f88f-4ced-8426-5c3074a11adc 8a4ff853ee5d42418d566e341750e650 8d53f9f811864008924718cc1c15ec91 - - default default] Security group member updated ['b6d839ba-ccd0-4121-8dfd-bf18fa4ee2f7']#033[00m Dec 6 05:25:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40257 DF PROTO=TCP SPT=49500 DPT=9102 SEQ=4247368960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE031870000000001030307) Dec 6 05:25:28 localhost dnsmasq[261077]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 1 addresses Dec 6 05:25:28 localhost podman[261126]: 2025-12-06 10:25:28.297234262 +0000 UTC m=+0.069503831 container kill f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:25:28 localhost dnsmasq-dhcp[261077]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host Dec 6 05:25:28 localhost dnsmasq-dhcp[261077]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts Dec 6 05:25:28 localhost systemd[1]: tmp-crun.ruUYAV.mount: Deactivated successfully. 
Dec 6 05:25:28 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:28.522 219384 INFO neutron.agent.dhcp.agent [None req-1fdbfa9a-371e-425a-8a9a-48fcee65588e - - - - - -] DHCP configuration for ports {'b00b2743-5b26-4292-bc21-d89c0fd85077'} is completed
Dec 6 05:25:28 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:28.708 2 INFO neutron.agent.securitygroups_rpc [None req-45fb3bf6-b406-4e3c-a2c3-96981493df90 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['a295a40c-f046-495e-a64e-4a70be3bd1ad']
Dec 6 05:25:29 localhost nova_compute[237281]: 2025-12-06 10:25:29.254 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:29 localhost systemd[1]: tmp-crun.NCy3J5.mount: Deactivated successfully.
Dec 6 05:25:29 localhost dnsmasq[261077]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 0 addresses
Dec 6 05:25:29 localhost dnsmasq-dhcp[261077]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host
Dec 6 05:25:29 localhost podman[261163]: 2025-12-06 10:25:29.602244805 +0000 UTC m=+0.075931839 container kill f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 6 05:25:29 localhost dnsmasq-dhcp[261077]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts
Dec 6 05:25:30 localhost sshd[261185]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 05:25:30 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:30.867 219384 INFO neutron.agent.linux.ip_lib [None req-8e113978-9cb6-4bca-8bd5-fb87e7e43873 - - - - - -] Device tap00eda423-33 cannot be used as it has no MAC address
Dec 6 05:25:30 localhost nova_compute[237281]: 2025-12-06 10:25:30.941 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:30 localhost kernel: device tap00eda423-33 entered promiscuous mode
Dec 6 05:25:30 localhost NetworkManager[5965]: [1765016730.9512] manager: (tap00eda423-33): new Generic device (/org/freedesktop/NetworkManager/Devices/69)
Dec 6 05:25:30 localhost ovn_controller[131684]: 2025-12-06T10:25:30Z|00394|binding|INFO|Claiming lport 00eda423-330c-492b-86dc-f6b450d68ab8 for this chassis.
Dec 6 05:25:30 localhost ovn_controller[131684]: 2025-12-06T10:25:30Z|00395|binding|INFO|00eda423-330c-492b-86dc-f6b450d68ab8: Claiming unknown
Dec 6 05:25:30 localhost systemd-udevd[261202]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 05:25:30 localhost nova_compute[237281]: 2025-12-06 10:25:30.957 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:30.971 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-17b45f00-da82-4acf-91c9-457de8bc1cf3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17b45f00-da82-4acf-91c9-457de8bc1cf3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6e5f2aeaf52490d9822161edabfbbe5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=affdf30b-af34-4151-b040-96ef747bdacf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=00eda423-330c-492b-86dc-f6b450d68ab8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 6 05:25:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:30.973 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 00eda423-330c-492b-86dc-f6b450d68ab8 in datapath 17b45f00-da82-4acf-91c9-457de8bc1cf3 bound to our chassis
Dec 6 05:25:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:30.976 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 17b45f00-da82-4acf-91c9-457de8bc1cf3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 6 05:25:30 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:30.977 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[3cfcf1fe-97b0-4751-a6a9-50ab1e4493aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 6 05:25:30 localhost ovn_controller[131684]: 2025-12-06T10:25:30Z|00396|binding|INFO|Setting lport 00eda423-330c-492b-86dc-f6b450d68ab8 ovn-installed in OVS
Dec 6 05:25:30 localhost ovn_controller[131684]: 2025-12-06T10:25:30Z|00397|binding|INFO|Setting lport 00eda423-330c-492b-86dc-f6b450d68ab8 up in Southbound
Dec 6 05:25:31 localhost nova_compute[237281]: 2025-12-06 10:25:30.998 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:31 localhost nova_compute[237281]: 2025-12-06 10:25:31.051 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:31 localhost nova_compute[237281]: 2025-12-06 10:25:31.081 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:31 localhost dnsmasq[260883]: exiting on receipt of SIGTERM
Dec 6 05:25:31 localhost podman[261222]: 2025-12-06 10:25:31.125274444 +0000 UTC m=+0.068263963 container kill 0d6c946fd08b6631c22d8286efb69283eb0122b26fa9fdc22e8f7614a4aed655 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 6 05:25:31 localhost systemd[1]: libpod-0d6c946fd08b6631c22d8286efb69283eb0122b26fa9fdc22e8f7614a4aed655.scope: Deactivated successfully.
Dec 6 05:25:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42669 DF PROTO=TCP SPT=54696 DPT=9102 SEQ=1458137816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE03E080000000001030307)
Dec 6 05:25:31 localhost podman[261240]: 2025-12-06 10:25:31.193293928 +0000 UTC m=+0.055949254 container died 0d6c946fd08b6631c22d8286efb69283eb0122b26fa9fdc22e8f7614a4aed655 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:25:31 localhost podman[261240]: 2025-12-06 10:25:31.227612205 +0000 UTC m=+0.090267521 container cleanup 0d6c946fd08b6631c22d8286efb69283eb0122b26fa9fdc22e8f7614a4aed655 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 6 05:25:31 localhost systemd[1]: libpod-conmon-0d6c946fd08b6631c22d8286efb69283eb0122b26fa9fdc22e8f7614a4aed655.scope: Deactivated successfully.
Dec 6 05:25:31 localhost podman[261242]: 2025-12-06 10:25:31.271215839 +0000 UTC m=+0.123652640 container remove 0d6c946fd08b6631c22d8286efb69283eb0122b26fa9fdc22e8f7614a4aed655 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 6 05:25:31 localhost ovn_controller[131684]: 2025-12-06T10:25:31Z|00398|binding|INFO|Releasing lport 90d31905-5871-4131-90ad-d765aa444f8c from this chassis (sb_readonly=0)
Dec 6 05:25:31 localhost ovn_controller[131684]: 2025-12-06T10:25:31Z|00399|binding|INFO|Setting lport 90d31905-5871-4131-90ad-d765aa444f8c down in Southbound
Dec 6 05:25:31 localhost kernel: device tap90d31905-58 left promiscuous mode
Dec 6 05:25:31 localhost nova_compute[237281]: 2025-12-06 10:25:31.287 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:31.297 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/16', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c3165881009544ef8d4fad0a69ab4f02', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18be9024-5981-4e7a-b20d-7bf2208c05c0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=90d31905-5871-4131-90ad-d765aa444f8c) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 6 05:25:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:31.299 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 90d31905-5871-4131-90ad-d765aa444f8c in datapath 54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6 unbound from our chassis
Dec 6 05:25:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:31.304 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 54b17b2c-c6c1-4dc7-8706-1dda1d75ebd6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 6 05:25:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:31.305 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[d1b23d71-c7b1-4dcb-9a5b-459fe5ddad8b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 6 05:25:31 localhost nova_compute[237281]: 2025-12-06 10:25:31.309 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:31 localhost nova_compute[237281]: 2025-12-06 10:25:31.310 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:31 localhost nova_compute[237281]: 2025-12-06 10:25:31.426 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:31 localhost systemd[1]: var-lib-containers-storage-overlay-66243641d3b5d8aa6169201bf046b527b1cafd96df10b7358361c114609468f4-merged.mount: Deactivated successfully.
Dec 6 05:25:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0d6c946fd08b6631c22d8286efb69283eb0122b26fa9fdc22e8f7614a4aed655-userdata-shm.mount: Deactivated successfully.
Dec 6 05:25:31 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:31.829 2 INFO neutron.agent.securitygroups_rpc [None req-372b9a2d-eb3a-4fe8-b868-6aaeaa53a801 8a4ff853ee5d42418d566e341750e650 8d53f9f811864008924718cc1c15ec91 - - default default] Security group member updated ['b6d839ba-ccd0-4121-8dfd-bf18fa4ee2f7']
Dec 6 05:25:32 localhost systemd[1]: run-netns-qdhcp\x2d54b17b2c\x2dc6c1\x2d4dc7\x2d8706\x2d1dda1d75ebd6.mount: Deactivated successfully.
Dec 6 05:25:32 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:32.134 219384 INFO neutron.agent.dhcp.agent [None req-efe16834-cde9-4078-92f5-479236173a3a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 6 05:25:32 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:32.135 219384 INFO neutron.agent.dhcp.agent [None req-efe16834-cde9-4078-92f5-479236173a3a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 6 05:25:32 localhost podman[261312]:
Dec 6 05:25:32 localhost podman[261312]: 2025-12-06 10:25:32.326250013 +0000 UTC m=+0.086913698 container create 454a7b11ff28e4a2154e73dbbff55c3d7d2260ef4fa9c268a3107ac47bbb53e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-17b45f00-da82-4acf-91c9-457de8bc1cf3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 6 05:25:32 localhost systemd[1]: Started libpod-conmon-454a7b11ff28e4a2154e73dbbff55c3d7d2260ef4fa9c268a3107ac47bbb53e2.scope.
Dec 6 05:25:32 localhost systemd[1]: Started libcrun container.
Dec 6 05:25:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fb9d0be06ed930003bcb03d86ebc83ac6987884b8e22d33c1938e87331769f0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 6 05:25:32 localhost podman[261312]: 2025-12-06 10:25:32.284568079 +0000 UTC m=+0.045231764 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 6 05:25:32 localhost podman[261312]: 2025-12-06 10:25:32.390795831 +0000 UTC m=+0.151459516 container init 454a7b11ff28e4a2154e73dbbff55c3d7d2260ef4fa9c268a3107ac47bbb53e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-17b45f00-da82-4acf-91c9-457de8bc1cf3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 6 05:25:32 localhost podman[261312]: 2025-12-06 10:25:32.400420997 +0000 UTC m=+0.161084692 container start 454a7b11ff28e4a2154e73dbbff55c3d7d2260ef4fa9c268a3107ac47bbb53e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-17b45f00-da82-4acf-91c9-457de8bc1cf3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 6 05:25:32 localhost dnsmasq[261331]: started, version 2.85 cachesize 150
Dec 6 05:25:32 localhost dnsmasq[261331]: DNS service limited to local subnets
Dec 6 05:25:32 localhost dnsmasq[261331]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 6 05:25:32 localhost dnsmasq[261331]: warning: no upstream servers configured
Dec 6 05:25:32 localhost dnsmasq-dhcp[261331]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 6 05:25:32 localhost dnsmasq[261331]: read /var/lib/neutron/dhcp/17b45f00-da82-4acf-91c9-457de8bc1cf3/addn_hosts - 0 addresses
Dec 6 05:25:32 localhost dnsmasq-dhcp[261331]: read /var/lib/neutron/dhcp/17b45f00-da82-4acf-91c9-457de8bc1cf3/host
Dec 6 05:25:32 localhost dnsmasq-dhcp[261331]: read /var/lib/neutron/dhcp/17b45f00-da82-4acf-91c9-457de8bc1cf3/opts
Dec 6 05:25:32 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:32.598 219384 INFO neutron.agent.dhcp.agent [None req-a554f624-dd0a-49ec-bdab-8c89a016af9a - - - - - -] DHCP configuration for ports {'bd230ad5-f3d0-4a0b-81de-b17e76a9620b'} is completed
Dec 6 05:25:33 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:33.914 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 6 05:25:34 localhost nova_compute[237281]: 2025-12-06 10:25:34.257 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:34 localhost dnsmasq[261077]: exiting on receipt of SIGTERM
Dec 6 05:25:34 localhost podman[261347]: 2025-12-06 10:25:34.353409947 +0000 UTC m=+0.054866760 container kill f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 6 05:25:34 localhost systemd[1]: libpod-f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3.scope: Deactivated successfully.
Dec 6 05:25:34 localhost podman[261361]: 2025-12-06 10:25:34.424259259 +0000 UTC m=+0.055115098 container died f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:25:34 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3-userdata-shm.mount: Deactivated successfully.
Dec 6 05:25:34 localhost podman[261361]: 2025-12-06 10:25:34.468202412 +0000 UTC m=+0.099058211 container cleanup f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:25:34 localhost systemd[1]: libpod-conmon-f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3.scope: Deactivated successfully.
Dec 6 05:25:34 localhost podman[261362]: 2025-12-06 10:25:34.509298318 +0000 UTC m=+0.133397809 container remove f7316189ed8fe06661ade22a324d8b0d0b6de36575a03d62c24e945dbbcec0e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 6 05:25:34 localhost ovn_controller[131684]: 2025-12-06T10:25:34Z|00400|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0)
Dec 6 05:25:34 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:34.588 2 INFO neutron.agent.securitygroups_rpc [None req-67759215-082c-462b-b01e-ccf4d2585cfd 8a4ff853ee5d42418d566e341750e650 8d53f9f811864008924718cc1c15ec91 - - default default] Security group member updated ['b6d839ba-ccd0-4121-8dfd-bf18fa4ee2f7']
Dec 6 05:25:34 localhost nova_compute[237281]: 2025-12-06 10:25:34.594 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:35 localhost systemd[1]: var-lib-containers-storage-overlay-dec99e4bbbbe7b23a8f5c79268503e62a27f9c7f350cf1242f79970c90c6b795-merged.mount: Deactivated successfully.
Dec 6 05:25:35 localhost podman[261437]:
Dec 6 05:25:35 localhost podman[261437]: 2025-12-06 10:25:35.699809035 +0000 UTC m=+0.086815805 container create edb88b2b4f1759e189b88f3a64709783cbeb1556a2999632013a9f8cc82f9fef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:25:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.
Dec 6 05:25:35 localhost systemd[1]: Started libpod-conmon-edb88b2b4f1759e189b88f3a64709783cbeb1556a2999632013a9f8cc82f9fef.scope.
Dec 6 05:25:35 localhost systemd[1]: Started libcrun container.
Dec 6 05:25:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23f910df9e2f490ce207b81adda671364e6fdb49af5c7b4e80953e4251873bc5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 6 05:25:35 localhost podman[261437]: 2025-12-06 10:25:35.659903126 +0000 UTC m=+0.046909986 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 6 05:25:35 localhost podman[261437]: 2025-12-06 10:25:35.7671713 +0000 UTC m=+0.154178090 container init edb88b2b4f1759e189b88f3a64709783cbeb1556a2999632013a9f8cc82f9fef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 6 05:25:35 localhost dnsmasq[261466]: started, version 2.85 cachesize 150
Dec 6 05:25:35 localhost dnsmasq[261466]: DNS service limited to local subnets
Dec 6 05:25:35 localhost dnsmasq[261466]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 6 05:25:35 localhost dnsmasq[261466]: warning: no upstream servers configured
Dec 6 05:25:35 localhost dnsmasq-dhcp[261466]: DHCP, static leases only on 10.100.0.16, lease time 1d
Dec 6 05:25:35 localhost dnsmasq[261466]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 0 addresses
Dec 6 05:25:35 localhost dnsmasq-dhcp[261466]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host
Dec 6 05:25:35 localhost dnsmasq-dhcp[261466]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts
Dec 6 05:25:35 localhost podman[261450]: 2025-12-06 10:25:35.809096271 +0000 UTC m=+0.077010074 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 6 05:25:35 localhost podman[261437]: 2025-12-06 10:25:35.827244089 +0000 UTC m=+0.214250879 container start edb88b2b4f1759e189b88f3a64709783cbeb1556a2999632013a9f8cc82f9fef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 6 05:25:35 localhost podman[261450]: 2025-12-06 10:25:35.888344641 +0000 UTC m=+0.156258434 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec 6 05:25:35 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully.
Dec 6 05:25:36 localhost nova_compute[237281]: 2025-12-06 10:25:36.374 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:36 localhost ovn_controller[131684]: 2025-12-06T10:25:36Z|00401|binding|INFO|Releasing lport 00eda423-330c-492b-86dc-f6b450d68ab8 from this chassis (sb_readonly=0)
Dec 6 05:25:36 localhost kernel: device tap00eda423-33 left promiscuous mode
Dec 6 05:25:36 localhost ovn_controller[131684]: 2025-12-06T10:25:36Z|00402|binding|INFO|Setting lport 00eda423-330c-492b-86dc-f6b450d68ab8 down in Southbound
Dec 6 05:25:36 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:36.386 219384 INFO neutron.agent.dhcp.agent [None req-ca21b0d3-2ce0-4a94-b0bd-08f2df56150d - - - - - -] DHCP configuration for ports {'f586d69b-90ba-43e8-b402-069c222110c3', '26d09c5d-5602-417f-baf9-28f7a2e5060c', '7c5159c1-bdcd-437d-bee5-b12f4e51cd5f'} is completed
Dec 6 05:25:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:36.388 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-17b45f00-da82-4acf-91c9-457de8bc1cf3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17b45f00-da82-4acf-91c9-457de8bc1cf3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6e5f2aeaf52490d9822161edabfbbe5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=affdf30b-af34-4151-b040-96ef747bdacf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=00eda423-330c-492b-86dc-f6b450d68ab8) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 6 05:25:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:36.390 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 00eda423-330c-492b-86dc-f6b450d68ab8 in datapath 17b45f00-da82-4acf-91c9-457de8bc1cf3 unbound from our chassis
Dec 6 05:25:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:36.394 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17b45f00-da82-4acf-91c9-457de8bc1cf3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 6 05:25:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:36.395 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[49bde30a-6a6e-4777-b507-67d3096a6941]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 6 05:25:36 localhost nova_compute[237281]: 2025-12-06 10:25:36.397 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:36 localhost nova_compute[237281]: 2025-12-06 10:25:36.399 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:36 localhost nova_compute[237281]: 2025-12-06 10:25:36.428 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:25:36 localhost dnsmasq[261466]: exiting on receipt of SIGTERM
Dec 6 05:25:36 localhost podman[261500]: 2025-12-06 10:25:36.63441121 +0000 UTC m=+0.061818315 container kill edb88b2b4f1759e189b88f3a64709783cbeb1556a2999632013a9f8cc82f9fef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 6 05:25:36 localhost systemd[1]: libpod-edb88b2b4f1759e189b88f3a64709783cbeb1556a2999632013a9f8cc82f9fef.scope: Deactivated successfully.
Dec 6 05:25:36 localhost podman[261515]: 2025-12-06 10:25:36.714745704 +0000 UTC m=+0.062009101 container died edb88b2b4f1759e189b88f3a64709783cbeb1556a2999632013a9f8cc82f9fef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:25:36 localhost podman[261515]: 2025-12-06 10:25:36.749719172 +0000 UTC m=+0.096982529 container cleanup edb88b2b4f1759e189b88f3a64709783cbeb1556a2999632013a9f8cc82f9fef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:25:36 localhost systemd[1]: libpod-conmon-edb88b2b4f1759e189b88f3a64709783cbeb1556a2999632013a9f8cc82f9fef.scope: Deactivated successfully.
Dec 6 05:25:36 localhost podman[261516]: 2025-12-06 10:25:36.79511178 +0000 UTC m=+0.136804655 container remove edb88b2b4f1759e189b88f3a64709783cbeb1556a2999632013a9f8cc82f9fef (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:25:37 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:37.042 2 INFO neutron.agent.securitygroups_rpc [None req-42b119b9-0ad9-4410-824a-3b65c540e3fd 8a4ff853ee5d42418d566e341750e650 8d53f9f811864008924718cc1c15ec91 - - default default] Security group member updated ['b6d839ba-ccd0-4121-8dfd-bf18fa4ee2f7']#033[00m Dec 6 05:25:37 localhost systemd[1]: var-lib-containers-storage-overlay-23f910df9e2f490ce207b81adda671364e6fdb49af5c7b4e80953e4251873bc5-merged.mount: Deactivated successfully. Dec 6 05:25:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-edb88b2b4f1759e189b88f3a64709783cbeb1556a2999632013a9f8cc82f9fef-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:25:37 localhost dnsmasq[261331]: read /var/lib/neutron/dhcp/17b45f00-da82-4acf-91c9-457de8bc1cf3/addn_hosts - 0 addresses Dec 6 05:25:37 localhost dnsmasq-dhcp[261331]: read /var/lib/neutron/dhcp/17b45f00-da82-4acf-91c9-457de8bc1cf3/host Dec 6 05:25:37 localhost dnsmasq-dhcp[261331]: read /var/lib/neutron/dhcp/17b45f00-da82-4acf-91c9-457de8bc1cf3/opts Dec 6 05:25:37 localhost podman[261561]: 2025-12-06 10:25:37.375730432 +0000 UTC m=+0.062385212 container kill 454a7b11ff28e4a2154e73dbbff55c3d7d2260ef4fa9c268a3107ac47bbb53e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-17b45f00-da82-4acf-91c9-457de8bc1cf3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent [None req-8448ad94-bb61-474a-bdf5-814b394ff519 - - - - - -] Unable to reload_allocations dhcp for 17b45f00-da82-4acf-91c9-457de8bc1cf3.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap00eda423-33 not found in namespace qdhcp-17b45f00-da82-4acf-91c9-457de8bc1cf3. 
Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Dec 6 05:25:37 localhost 
neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent return fut.result() Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent return self.__get_result() Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent raise self._exception Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR 
neutron.agent.dhcp.agent raise exc_type(*result[2]) Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap00eda423-33 not found in namespace qdhcp-17b45f00-da82-4acf-91c9-457de8bc1cf3. Dec 6 05:25:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:37.399 219384 ERROR neutron.agent.dhcp.agent #033[00m Dec 6 05:25:38 localhost ovn_controller[131684]: 2025-12-06T10:25:38Z|00403|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:25:38 localhost nova_compute[237281]: 2025-12-06 10:25:38.669 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42670 DF PROTO=TCP SPT=54696 DPT=9102 SEQ=1458137816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE05D870000000001030307) Dec 6 05:25:39 localhost nova_compute[237281]: 2025-12-06 10:25:39.259 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:39 localhost podman[261623]: Dec 6 05:25:39 localhost podman[261623]: 2025-12-06 10:25:39.821008873 +0000 UTC m=+0.078342114 container create 99bc3bb02a3bc54f08ef049b92cdc8dc3f6017571dc6c869dea44c8622aecf10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 6 05:25:39 localhost systemd[1]: Started libpod-conmon-99bc3bb02a3bc54f08ef049b92cdc8dc3f6017571dc6c869dea44c8622aecf10.scope. Dec 6 05:25:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:25:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:25:39 localhost systemd[1]: tmp-crun.SGf0LN.mount: Deactivated successfully. Dec 6 05:25:39 localhost systemd[1]: Started libcrun container. Dec 6 05:25:39 localhost podman[261623]: 2025-12-06 10:25:39.791726851 +0000 UTC m=+0.049060112 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:25:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45cc361c38a618706518365c9f7d30e5c30e0a90449992f8948cb5e49d619db9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:25:39 localhost podman[261640]: 2025-12-06 10:25:39.921514019 +0000 UTC m=+0.058567665 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': 
'/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:25:39 localhost podman[261641]: 2025-12-06 10:25:39.947930512 +0000 UTC m=+0.078031525 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Dec 6 05:25:39 localhost podman[261623]: 2025-12-06 10:25:39.953531724 +0000 UTC m=+0.210864955 container init 99bc3bb02a3bc54f08ef049b92cdc8dc3f6017571dc6c869dea44c8622aecf10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:25:39 localhost podman[261623]: 2025-12-06 10:25:39.959172408 +0000 UTC m=+0.216505639 container start 99bc3bb02a3bc54f08ef049b92cdc8dc3f6017571dc6c869dea44c8622aecf10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 6 05:25:39 localhost dnsmasq[261682]: started, version 2.85 cachesize 150 Dec 6 05:25:39 localhost dnsmasq[261682]: DNS service limited to local subnets Dec 6 05:25:39 localhost dnsmasq[261682]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:25:39 localhost 
dnsmasq[261682]: warning: no upstream servers configured Dec 6 05:25:39 localhost dnsmasq-dhcp[261682]: DHCP, static leases only on 10.100.0.16, lease time 1d Dec 6 05:25:39 localhost dnsmasq-dhcp[261682]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:25:39 localhost dnsmasq[261682]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 0 addresses Dec 6 05:25:39 localhost dnsmasq-dhcp[261682]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host Dec 6 05:25:39 localhost dnsmasq-dhcp[261682]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts Dec 6 05:25:39 localhost podman[261641]: 2025-12-06 10:25:39.97998274 +0000 UTC m=+0.110083773 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:25:39 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:25:40 localhost podman[261640]: 2025-12-06 10:25:40.00080545 +0000 UTC m=+0.137859086 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 6 05:25:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:40.019 219384 INFO 
neutron.agent.dhcp.agent [None req-e0a05e59-4e89-465b-b8a9-354d25878467 - - - - - -] Synchronizing state#033[00m Dec 6 05:25:40 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:25:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:40.520 219384 INFO neutron.agent.dhcp.agent [None req-ccc20bf9-387b-458b-9b95-776e6617c563 - - - - - -] All active networks have been fetched through RPC.#033[00m Dec 6 05:25:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:40.525 219384 INFO neutron.agent.dhcp.agent [None req-9a23e24d-4d8a-4400-bbd0-544341e4f9ce - - - - - -] DHCP configuration for ports {'f586d69b-90ba-43e8-b402-069c222110c3', '26d09c5d-5602-417f-baf9-28f7a2e5060c', '7c5159c1-bdcd-437d-bee5-b12f4e51cd5f'} is completed#033[00m Dec 6 05:25:40 localhost dnsmasq[261331]: exiting on receipt of SIGTERM Dec 6 05:25:40 localhost podman[261701]: 2025-12-06 10:25:40.697642893 +0000 UTC m=+0.060292128 container kill 454a7b11ff28e4a2154e73dbbff55c3d7d2260ef4fa9c268a3107ac47bbb53e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-17b45f00-da82-4acf-91c9-457de8bc1cf3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:25:40 localhost systemd[1]: libpod-454a7b11ff28e4a2154e73dbbff55c3d7d2260ef4fa9c268a3107ac47bbb53e2.scope: Deactivated successfully. 
Dec 6 05:25:40 localhost podman[261717]: 2025-12-06 10:25:40.775731488 +0000 UTC m=+0.057951756 container died 454a7b11ff28e4a2154e73dbbff55c3d7d2260ef4fa9c268a3107ac47bbb53e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-17b45f00-da82-4acf-91c9-457de8bc1cf3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2) Dec 6 05:25:40 localhost podman[261717]: 2025-12-06 10:25:40.819889768 +0000 UTC m=+0.102110006 container remove 454a7b11ff28e4a2154e73dbbff55c3d7d2260ef4fa9c268a3107ac47bbb53e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-17b45f00-da82-4acf-91c9-457de8bc1cf3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:25:40 localhost systemd[1]: var-lib-containers-storage-overlay-1fb9d0be06ed930003bcb03d86ebc83ac6987884b8e22d33c1938e87331769f0-merged.mount: Deactivated successfully. Dec 6 05:25:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-454a7b11ff28e4a2154e73dbbff55c3d7d2260ef4fa9c268a3107ac47bbb53e2-userdata-shm.mount: Deactivated successfully. Dec 6 05:25:40 localhost systemd[1]: run-netns-qdhcp\x2d17b45f00\x2dda82\x2d4acf\x2d91c9\x2d457de8bc1cf3.mount: Deactivated successfully. 
Dec 6 05:25:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:40.849 219384 INFO neutron.agent.dhcp.agent [None req-39e490fd-d7cb-4139-8665-223b725de614 - - - - - -] Synchronizing state complete#033[00m Dec 6 05:25:40 localhost systemd[1]: libpod-conmon-454a7b11ff28e4a2154e73dbbff55c3d7d2260ef4fa9c268a3107ac47bbb53e2.scope: Deactivated successfully. Dec 6 05:25:41 localhost dnsmasq[260204]: exiting on receipt of SIGTERM Dec 6 05:25:41 localhost podman[261760]: 2025-12-06 10:25:41.326509541 +0000 UTC m=+0.066045125 container kill 9aa9772b0d5c4fa01c4089dfdd5613ff2a5914b53f7f68f7a0f28a99ec66910e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9817d242-662d-4582-8688-d255fc1e06de, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:25:41 localhost systemd[1]: libpod-9aa9772b0d5c4fa01c4089dfdd5613ff2a5914b53f7f68f7a0f28a99ec66910e.scope: Deactivated successfully. 
Dec 6 05:25:41 localhost podman[261772]: 2025-12-06 10:25:41.395942789 +0000 UTC m=+0.053263031 container died 9aa9772b0d5c4fa01c4089dfdd5613ff2a5914b53f7f68f7a0f28a99ec66910e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9817d242-662d-4582-8688-d255fc1e06de, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 6 05:25:41 localhost podman[261772]: 2025-12-06 10:25:41.425705076 +0000 UTC m=+0.083025288 container cleanup 9aa9772b0d5c4fa01c4089dfdd5613ff2a5914b53f7f68f7a0f28a99ec66910e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9817d242-662d-4582-8688-d255fc1e06de, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:25:41 localhost systemd[1]: libpod-conmon-9aa9772b0d5c4fa01c4089dfdd5613ff2a5914b53f7f68f7a0f28a99ec66910e.scope: Deactivated successfully. 
Dec 6 05:25:41 localhost nova_compute[237281]: 2025-12-06 10:25:41.431 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:41 localhost podman[261774]: 2025-12-06 10:25:41.47583044 +0000 UTC m=+0.128209350 container remove 9aa9772b0d5c4fa01c4089dfdd5613ff2a5914b53f7f68f7a0f28a99ec66910e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9817d242-662d-4582-8688-d255fc1e06de, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:25:41 localhost ovn_controller[131684]: 2025-12-06T10:25:41Z|00404|binding|INFO|Releasing lport a47e84c3-ee53-41c0-8fdf-3f33f89ebebe from this chassis (sb_readonly=0) Dec 6 05:25:41 localhost ovn_controller[131684]: 2025-12-06T10:25:41Z|00405|binding|INFO|Setting lport a47e84c3-ee53-41c0-8fdf-3f33f89ebebe down in Southbound Dec 6 05:25:41 localhost nova_compute[237281]: 2025-12-06 10:25:41.488 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:41 localhost kernel: device tapa47e84c3-ee left promiscuous mode Dec 6 05:25:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:41.503 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-9817d242-662d-4582-8688-d255fc1e06de', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9817d242-662d-4582-8688-d255fc1e06de', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5edd94b688c144078712f567f790b3e9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=84f57985-5441-4394-8428-4c50a8122e29, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a47e84c3-ee53-41c0-8fdf-3f33f89ebebe) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:25:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:41.506 137259 INFO neutron.agent.ovn.metadata.agent [-] Port a47e84c3-ee53-41c0-8fdf-3f33f89ebebe in datapath 9817d242-662d-4582-8688-d255fc1e06de unbound from our chassis#033[00m Dec 6 05:25:41 localhost nova_compute[237281]: 2025-12-06 10:25:41.506 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:41 localhost nova_compute[237281]: 2025-12-06 10:25:41.507 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:41.511 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9817d242-662d-4582-8688-d255fc1e06de, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:25:41 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:41.512 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[72241422-fc16-45b6-a736-fb4557331aea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:25:41 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:41.538 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:25:41 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:41.778 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:25:41 localhost systemd[1]: var-lib-containers-storage-overlay-f41295a9174b7be42f1016ee9145cdf564f6dccf725c161ab953f6544fedeb67-merged.mount: Deactivated successfully. Dec 6 05:25:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9aa9772b0d5c4fa01c4089dfdd5613ff2a5914b53f7f68f7a0f28a99ec66910e-userdata-shm.mount: Deactivated successfully. Dec 6 05:25:41 localhost systemd[1]: run-netns-qdhcp\x2d9817d242\x2d662d\x2d4582\x2d8688\x2dd255fc1e06de.mount: Deactivated successfully. 
Dec 6 05:25:41 localhost ovn_controller[131684]: 2025-12-06T10:25:41Z|00406|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:25:42 localhost nova_compute[237281]: 2025-12-06 10:25:42.039 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:43 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:43.721 2 INFO neutron.agent.securitygroups_rpc [None req-581dc43e-0b7e-42c5-b8a6-47b6440b2dad 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['473e9948-68da-48ef-a810-1cc684483525']#033[00m Dec 6 05:25:43 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:43.769 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:25:43Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f0426ccd-50ae-4e15-84ad-2c8adef33e5c, ip_allocation=immediate, mac_address=fa:16:3e:6d:98:95, name=tempest-PortsTestJSON-119991413, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:22:26Z, description=, dns_domain=, id=9a9f536a-4201-4d67-a433-6077de86991e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-306893013, port_security_enabled=True, project_id=3f8ef38a4bec46d18248142804d6d2a3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42595, qos_policy_id=None, revision_number=5, router:external=False, shared=False, standard_attr_id=1621, status=ACTIVE, subnets=['7cc085e3-a932-4ad8-abc6-9d8a867de584', 'a51b2a03-66e9-4297-b9bb-32acd69efa14'], 
tags=[], tenant_id=3f8ef38a4bec46d18248142804d6d2a3, updated_at=2025-12-06T10:25:35Z, vlan_transparent=None, network_id=9a9f536a-4201-4d67-a433-6077de86991e, port_security_enabled=True, project_id=3f8ef38a4bec46d18248142804d6d2a3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['473e9948-68da-48ef-a810-1cc684483525'], standard_attr_id=2218, status=DOWN, tags=[], tenant_id=3f8ef38a4bec46d18248142804d6d2a3, updated_at=2025-12-06T10:25:43Z on network 9a9f536a-4201-4d67-a433-6077de86991e#033[00m Dec 6 05:25:44 localhost dnsmasq[261682]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 1 addresses Dec 6 05:25:44 localhost dnsmasq-dhcp[261682]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host Dec 6 05:25:44 localhost dnsmasq-dhcp[261682]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts Dec 6 05:25:44 localhost podman[261819]: 2025-12-06 10:25:44.105192251 +0000 UTC m=+0.060562966 container kill 99bc3bb02a3bc54f08ef049b92cdc8dc3f6017571dc6c869dea44c8622aecf10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:25:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:25:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:25:44 localhost podman[261834]: 2025-12-06 10:25:44.258939367 +0000 UTC m=+0.115854059 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 6 05:25:44 localhost nova_compute[237281]: 2025-12-06 10:25:44.260 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:44 localhost podman[261833]: 2025-12-06 10:25:44.230024516 +0000 UTC m=+0.092329504 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:25:44 localhost podman[261834]: 2025-12-06 10:25:44.301384874 +0000 UTC m=+0.158299526 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:25:44 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:25:44 localhost podman[261833]: 2025-12-06 10:25:44.362980231 +0000 UTC m=+0.225285249 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:25:44 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:25:44 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:44.445 219384 INFO neutron.agent.dhcp.agent [None req-94bed88c-56a0-4dd1-99c7-d6ad77ea94d9 - - - - - -] DHCP configuration for ports {'f0426ccd-50ae-4e15-84ad-2c8adef33e5c'} is completed#033[00m Dec 6 05:25:45 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:45.938 2 INFO neutron.agent.securitygroups_rpc [None req-2b946c46-627e-4288-86da-1832457b6445 c695016aff2c482f9d1442420397e588 077494e5b56a49e2a7c273a073b20032 - - default default] Security group member updated ['0cb54f74-cbb1-4a8e-8cca-34c4bd951da0']#033[00m Dec 6 05:25:46 localhost openstack_network_exporter[199751]: ERROR 10:25:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:25:46 localhost openstack_network_exporter[199751]: ERROR 10:25:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:25:46 localhost openstack_network_exporter[199751]: ERROR 10:25:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:25:46 localhost openstack_network_exporter[199751]: ERROR 10:25:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:25:46 localhost openstack_network_exporter[199751]: Dec 6 05:25:46 localhost openstack_network_exporter[199751]: ERROR 10:25:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:25:46 localhost openstack_network_exporter[199751]: Dec 6 05:25:46 localhost nova_compute[237281]: 2025-12-06 10:25:46.435 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:46 localhost ovn_controller[131684]: 2025-12-06T10:25:46Z|00407|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:25:46 localhost nova_compute[237281]: 
2025-12-06 10:25:46.754 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:48 localhost dnsmasq[261682]: exiting on receipt of SIGTERM Dec 6 05:25:48 localhost podman[261891]: 2025-12-06 10:25:48.096306744 +0000 UTC m=+0.060198555 container kill 99bc3bb02a3bc54f08ef049b92cdc8dc3f6017571dc6c869dea44c8622aecf10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:25:48 localhost systemd[1]: libpod-99bc3bb02a3bc54f08ef049b92cdc8dc3f6017571dc6c869dea44c8622aecf10.scope: Deactivated successfully. Dec 6 05:25:48 localhost podman[261905]: 2025-12-06 10:25:48.171358776 +0000 UTC m=+0.061596038 container died 99bc3bb02a3bc54f08ef049b92cdc8dc3f6017571dc6c869dea44c8622aecf10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:25:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99bc3bb02a3bc54f08ef049b92cdc8dc3f6017571dc6c869dea44c8622aecf10-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:25:48 localhost podman[261905]: 2025-12-06 10:25:48.209111098 +0000 UTC m=+0.099348320 container cleanup 99bc3bb02a3bc54f08ef049b92cdc8dc3f6017571dc6c869dea44c8622aecf10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:25:48 localhost systemd[1]: libpod-conmon-99bc3bb02a3bc54f08ef049b92cdc8dc3f6017571dc6c869dea44c8622aecf10.scope: Deactivated successfully. Dec 6 05:25:48 localhost podman[261907]: 2025-12-06 10:25:48.252220747 +0000 UTC m=+0.132881565 container remove 99bc3bb02a3bc54f08ef049b92cdc8dc3f6017571dc6c869dea44c8622aecf10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:25:49 localhost systemd[1]: var-lib-containers-storage-overlay-45cc361c38a618706518365c9f7d30e5c30e0a90449992f8948cb5e49d619db9-merged.mount: Deactivated successfully. 
Dec 6 05:25:49 localhost nova_compute[237281]: 2025-12-06 10:25:49.264 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:49 localhost nova_compute[237281]: 2025-12-06 10:25:49.779 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:25:50 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:50.320 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:cd:f0 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f8ef38a4bec46d18248142804d6d2a3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6d270b4-86bf-4d3c-9534-fc16b2336e09, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f586d69b-90ba-43e8-b402-069c222110c3) old=Port_Binding(mac=['fa:16:3e:c6:cd:f0 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 
10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f8ef38a4bec46d18248142804d6d2a3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:25:50 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:50.323 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f586d69b-90ba-43e8-b402-069c222110c3 in datapath 9a9f536a-4201-4d67-a433-6077de86991e updated#033[00m Dec 6 05:25:50 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:50.326 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4372fc8b-381e-4e0c-a172-ae8fb0ab8c0e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:25:50 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:50.326 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a9f536a-4201-4d67-a433-6077de86991e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:25:50 localhost ovn_metadata_agent[137254]: 2025-12-06 10:25:50.328 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c504291d-68b6-4ec0-a74b-ad918301b084]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:25:51 localhost nova_compute[237281]: 2025-12-06 10:25:51.438 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Dec 6 05:25:51 localhost nova_compute[237281]: 2025-12-06 10:25:51.884 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:25:52 localhost podman[261982]: Dec 6 05:25:52 localhost podman[261982]: 2025-12-06 10:25:52.233711231 +0000 UTC m=+0.095582955 container create 10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:25:52 localhost systemd[1]: Started libpod-conmon-10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa.scope. Dec 6 05:25:52 localhost podman[261982]: 2025-12-06 10:25:52.186423784 +0000 UTC m=+0.048295538 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:25:52 localhost systemd[1]: tmp-crun.VjHOZt.mount: Deactivated successfully. Dec 6 05:25:52 localhost systemd[1]: Started libcrun container. 
Dec 6 05:25:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfb54c1a0a572a32380d05dc8ad3a529798b36850094359a97a727774bfe0d21/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:25:52 localhost podman[261982]: 2025-12-06 10:25:52.338051884 +0000 UTC m=+0.199923628 container init 10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:25:52 localhost podman[261982]: 2025-12-06 10:25:52.350777307 +0000 UTC m=+0.212649021 container start 10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 6 05:25:52 localhost dnsmasq[262001]: started, version 2.85 cachesize 150 Dec 6 05:25:52 localhost dnsmasq[262001]: DNS service limited to local subnets Dec 6 05:25:52 localhost dnsmasq[262001]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:25:52 localhost dnsmasq[262001]: warning: no upstream servers configured Dec 
6 05:25:52 localhost dnsmasq-dhcp[262001]: DHCP, static leases only on 10.100.0.32, lease time 1d Dec 6 05:25:52 localhost dnsmasq-dhcp[262001]: DHCP, static leases only on 10.100.0.16, lease time 1d Dec 6 05:25:52 localhost dnsmasq-dhcp[262001]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:25:52 localhost dnsmasq[262001]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 1 addresses Dec 6 05:25:52 localhost dnsmasq-dhcp[262001]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host Dec 6 05:25:52 localhost dnsmasq-dhcp[262001]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts Dec 6 05:25:52 localhost nova_compute[237281]: 2025-12-06 10:25:52.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:25:53 localhost podman[197801]: time="2025-12-06T10:25:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:25:53 localhost podman[197801]: @ - - [06/Dec/2025:10:25:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153573 "" "Go-http-client/1.1" Dec 6 05:25:53 localhost podman[197801]: @ - - [06/Dec/2025:10:25:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18351 "" "Go-http-client/1.1" Dec 6 05:25:53 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:53.452 219384 INFO neutron.agent.dhcp.agent [None req-a9aebe93-a797-4702-b1eb-f0dfc50f5c5a - - - - - -] DHCP configuration for ports {'f0426ccd-50ae-4e15-84ad-2c8adef33e5c', 'f586d69b-90ba-43e8-b402-069c222110c3', '26d09c5d-5602-417f-baf9-28f7a2e5060c', '7c5159c1-bdcd-437d-bee5-b12f4e51cd5f'} is completed#033[00m Dec 6 05:25:53 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44540 DF PROTO=TCP SPT=35296 DPT=9102 SEQ=3744744917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE0975A0000000001030307) Dec 6 05:25:54 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:54.016 2 INFO neutron.agent.securitygroups_rpc [None req-2f52a6e0-af02-49ea-b86f-c5e30d71437a 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['5d408276-a691-4db7-bfcd-679006679e03', 'beb1acbb-ade8-45bc-84fd-a450ef278e64', '473e9948-68da-48ef-a810-1cc684483525']#033[00m Dec 6 05:25:54 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:54.049 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:25:43Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f0426ccd-50ae-4e15-84ad-2c8adef33e5c, ip_allocation=immediate, mac_address=fa:16:3e:6d:98:95, name=tempest-PortsTestJSON-1115954012, network_id=9a9f536a-4201-4d67-a433-6077de86991e, port_security_enabled=True, project_id=3f8ef38a4bec46d18248142804d6d2a3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['5d408276-a691-4db7-bfcd-679006679e03', 'beb1acbb-ade8-45bc-84fd-a450ef278e64'], standard_attr_id=2218, status=DOWN, tags=[], tenant_id=3f8ef38a4bec46d18248142804d6d2a3, updated_at=2025-12-06T10:25:53Z on network 9a9f536a-4201-4d67-a433-6077de86991e#033[00m Dec 6 05:25:54 localhost dnsmasq-dhcp[262001]: DHCPRELEASE(tap7c5159c1-bd) 10.100.0.4 fa:16:3e:6d:98:95 Dec 6 05:25:54 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:54.214 2 INFO 
neutron.agent.securitygroups_rpc [None req-e7be6742-149d-4e09-8030-12a49a3dae7d 2e2429a207314407a8b8572df7927778 da648b133d3c45bcadbe795093c0fa10 - - default default] Security group member updated ['3b5cc3fc-91fe-4c82-a4a4-0b99a3df845f']#033[00m Dec 6 05:25:54 localhost nova_compute[237281]: 2025-12-06 10:25:54.267 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:54 localhost dnsmasq[262001]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 1 addresses Dec 6 05:25:54 localhost dnsmasq-dhcp[262001]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host Dec 6 05:25:54 localhost dnsmasq-dhcp[262001]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts Dec 6 05:25:54 localhost podman[262020]: 2025-12-06 10:25:54.682000046 +0000 UTC m=+0.067565602 container kill 10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:25:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:25:54 localhost podman[262033]: 2025-12-06 10:25:54.807697698 +0000 UTC m=+0.099537427 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9, config_id=edpm, managed_by=edpm_ansible) Dec 6 05:25:54 localhost podman[262033]: 2025-12-06 10:25:54.823259587 +0000 UTC m=+0.115099346 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter) Dec 6 05:25:54 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:25:54 localhost nova_compute[237281]: 2025-12-06 10:25:54.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:25:54 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:25:54.981 219384 INFO neutron.agent.dhcp.agent [None req-ba225bd0-a251-49fd-84c0-fe7d425fdb2d - - - - - -] DHCP configuration for ports {'f0426ccd-50ae-4e15-84ad-2c8adef33e5c'} is completed#033[00m Dec 6 05:25:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44541 DF PROTO=TCP SPT=35296 DPT=9102 SEQ=3744744917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE09B470000000001030307) Dec 6 05:25:55 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:55.537 2 INFO neutron.agent.securitygroups_rpc [None req-d64b4177-f634-46b4-8ecc-84b96cb8c945 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['5d408276-a691-4db7-bfcd-679006679e03', 'beb1acbb-ade8-45bc-84fd-a450ef278e64']#033[00m Dec 6 05:25:55 localhost neutron_sriov_agent[212548]: 2025-12-06 10:25:55.539 2 INFO neutron.agent.securitygroups_rpc [None req-8da76e40-4861-435b-b490-3e9a4e7ee8af c695016aff2c482f9d1442420397e588 077494e5b56a49e2a7c273a073b20032 - - default default] Security group member updated ['0cb54f74-cbb1-4a8e-8cca-34c4bd951da0']#033[00m Dec 6 05:25:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42671 DF PROTO=TCP SPT=54696 DPT=9102 SEQ=1458137816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE09D880000000001030307) Dec 6 05:25:55 localhost 
nova_compute[237281]: 2025-12-06 10:25:55.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:25:56 localhost dnsmasq[262001]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 0 addresses Dec 6 05:25:56 localhost dnsmasq-dhcp[262001]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host Dec 6 05:25:56 localhost podman[262077]: 2025-12-06 10:25:56.158519311 +0000 UTC m=+0.057216993 container kill 10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 6 05:25:56 localhost dnsmasq-dhcp[262001]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts Dec 6 05:25:56 localhost nova_compute[237281]: 2025-12-06 10:25:56.441 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:56 localhost nova_compute[237281]: 2025-12-06 10:25:56.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:25:56 localhost nova_compute[237281]: 2025-12-06 10:25:56.887 237285 DEBUG nova.compute.manager [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:25:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44542 DF PROTO=TCP SPT=35296 DPT=9102 SEQ=3744744917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE0A3620000000001030307) Dec 6 05:25:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57188 DF PROTO=TCP SPT=47484 DPT=9102 SEQ=3176948678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE0A7870000000001030307) Dec 6 05:25:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:25:58 localhost podman[262099]: 2025-12-06 10:25:58.545379783 +0000 UTC m=+0.078907621 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:25:58 localhost podman[262099]: 2025-12-06 10:25:58.556466755 +0000 UTC m=+0.089994593 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:25:58 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:25:59 localhost nova_compute[237281]: 2025-12-06 10:25:59.269 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:25:59 localhost nova_compute[237281]: 2025-12-06 10:25:59.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:26:00 localhost nova_compute[237281]: 2025-12-06 10:26:00.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:26:00 localhost nova_compute[237281]: 2025-12-06 10:26:00.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:26:00 localhost nova_compute[237281]: 2025-12-06 10:26:00.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:26:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44543 DF PROTO=TCP SPT=35296 DPT=9102 SEQ=3744744917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE0B3080000000001030307) Dec 6 05:26:01 localhost nova_compute[237281]: 2025-12-06 10:26:01.285 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock 
"refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:26:01 localhost nova_compute[237281]: 2025-12-06 10:26:01.285 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:26:01 localhost nova_compute[237281]: 2025-12-06 10:26:01.286 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:26:01 localhost nova_compute[237281]: 2025-12-06 10:26:01.286 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:26:01 localhost nova_compute[237281]: 2025-12-06 10:26:01.445 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:03 localhost dnsmasq[262001]: exiting on receipt of SIGTERM Dec 6 05:26:03 localhost podman[262138]: 2025-12-06 10:26:03.929093227 +0000 UTC m=+0.061823175 container kill 10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 6 05:26:03 localhost systemd[1]: libpod-10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa.scope: Deactivated successfully. Dec 6 05:26:04 localhost podman[262151]: 2025-12-06 10:26:04.009571535 +0000 UTC m=+0.067425477 container died 10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Dec 6 05:26:04 localhost systemd[1]: tmp-crun.t6e9bi.mount: Deactivated successfully. Dec 6 05:26:04 localhost podman[262151]: 2025-12-06 10:26:04.054698426 +0000 UTC m=+0.112552318 container cleanup 10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:26:04 localhost systemd[1]: libpod-conmon-10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa.scope: Deactivated successfully. 
Dec 6 05:26:04 localhost podman[262153]: 2025-12-06 10:26:04.134241885 +0000 UTC m=+0.182591664 container remove 10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2) Dec 6 05:26:04 localhost ovn_controller[131684]: 2025-12-06T10:26:04Z|00408|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:26:04 localhost nova_compute[237281]: 2025-12-06 10:26:04.270 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:04 localhost nova_compute[237281]: 2025-12-06 10:26:04.280 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:04 localhost systemd[1]: var-lib-containers-storage-overlay-bfb54c1a0a572a32380d05dc8ad3a529798b36850094359a97a727774bfe0d21-merged.mount: Deactivated successfully. Dec 6 05:26:04 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10b1f37f85e46f6f0d4bc0c3727b5e109dbd0694f4c4bcb76120f64357d626aa-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:26:05 localhost podman[262227]: Dec 6 05:26:05 localhost podman[262227]: 2025-12-06 10:26:05.42597538 +0000 UTC m=+0.085211915 container create 47905d95b90f66e7f27b41335bcf522c7e65481fd4710cfaedbee9ded65ffd92 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0) Dec 6 05:26:05 localhost systemd[1]: Started libpod-conmon-47905d95b90f66e7f27b41335bcf522c7e65481fd4710cfaedbee9ded65ffd92.scope. Dec 6 05:26:05 localhost podman[262227]: 2025-12-06 10:26:05.37924294 +0000 UTC m=+0.038479495 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:26:05 localhost systemd[1]: tmp-crun.YJxydt.mount: Deactivated successfully. Dec 6 05:26:05 localhost systemd[1]: Started libcrun container. 
Dec 6 05:26:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e29c4b8b319f701c2ac41df75023d8833b7a4449110fb6666f52ce64f20f161/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:26:05 localhost podman[262227]: 2025-12-06 10:26:05.502965211 +0000 UTC m=+0.162201736 container init 47905d95b90f66e7f27b41335bcf522c7e65481fd4710cfaedbee9ded65ffd92 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:26:05 localhost podman[262227]: 2025-12-06 10:26:05.516014073 +0000 UTC m=+0.175250598 container start 47905d95b90f66e7f27b41335bcf522c7e65481fd4710cfaedbee9ded65ffd92 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:26:05 localhost dnsmasq[262245]: started, version 2.85 cachesize 150 Dec 6 05:26:05 localhost dnsmasq[262245]: DNS service limited to local subnets Dec 6 05:26:05 localhost dnsmasq[262245]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:26:05 localhost dnsmasq[262245]: warning: no upstream servers configured Dec 
6 05:26:05 localhost dnsmasq-dhcp[262245]: DHCP, static leases only on 10.100.0.32, lease time 1d Dec 6 05:26:05 localhost dnsmasq-dhcp[262245]: DHCP, static leases only on 10.100.0.16, lease time 1d Dec 6 05:26:05 localhost dnsmasq[262245]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 0 addresses Dec 6 05:26:05 localhost dnsmasq-dhcp[262245]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host Dec 6 05:26:05 localhost dnsmasq-dhcp[262245]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts Dec 6 05:26:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:26:06 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:06.022 219384 INFO neutron.agent.dhcp.agent [None req-1d78945a-cdc5-4c0d-b958-63060dbb441d - - - - - -] DHCP configuration for ports {'f586d69b-90ba-43e8-b402-069c222110c3', '26d09c5d-5602-417f-baf9-28f7a2e5060c', '7c5159c1-bdcd-437d-bee5-b12f4e51cd5f'} is completed#033[00m Dec 6 05:26:06 localhost podman[262246]: 2025-12-06 10:26:06.03772645 +0000 UTC m=+0.086026959 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2) Dec 6 05:26:06 localhost podman[262246]: 2025-12-06 10:26:06.153433905 +0000 UTC m=+0.201734384 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 6 05:26:06 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.207 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:26:06 localhost podman[262288]: 2025-12-06 10:26:06.239606088 +0000 UTC m=+0.064159186 container kill 47905d95b90f66e7f27b41335bcf522c7e65481fd4710cfaedbee9ded65ffd92 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125) Dec 6 05:26:06 localhost dnsmasq[262245]: exiting on receipt of SIGTERM Dec 6 05:26:06 localhost systemd[1]: libpod-47905d95b90f66e7f27b41335bcf522c7e65481fd4710cfaedbee9ded65ffd92.scope: Deactivated successfully. 
Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.252 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.254 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.255 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.308 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.309 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.309 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.310 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:26:06 localhost podman[262302]: 2025-12-06 10:26:06.311367589 +0000 UTC m=+0.057291465 container died 47905d95b90f66e7f27b41335bcf522c7e65481fd4710cfaedbee9ded65ffd92 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:26:06 localhost podman[262302]: 2025-12-06 10:26:06.360484662 +0000 UTC m=+0.106408488 container cleanup 47905d95b90f66e7f27b41335bcf522c7e65481fd4710cfaedbee9ded65ffd92 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:26:06 localhost systemd[1]: libpod-conmon-47905d95b90f66e7f27b41335bcf522c7e65481fd4710cfaedbee9ded65ffd92.scope: 
Deactivated successfully. Dec 6 05:26:06 localhost podman[262304]: 2025-12-06 10:26:06.406325753 +0000 UTC m=+0.140548949 container remove 47905d95b90f66e7f27b41335bcf522c7e65481fd4710cfaedbee9ded65ffd92 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.448 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.463 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.532 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.534 237285 DEBUG oslo_concurrency.processutils [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.627 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.629 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:26:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:06.710 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:26:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:06.712 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:26:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:06.713 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.714 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.715 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:26:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:06.747 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:cd:f0 10.100.0.18'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 
'ovnmeta-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f8ef38a4bec46d18248142804d6d2a3', 'neutron:revision_number': '8', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6d270b4-86bf-4d3c-9534-fc16b2336e09, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f586d69b-90ba-43e8-b402-069c222110c3) old=Port_Binding(mac=['fa:16:3e:c6:cd:f0 10.100.0.18 10.100.0.2 10.100.0.34'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f8ef38a4bec46d18248142804d6d2a3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:26:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:06.750 137259 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f586d69b-90ba-43e8-b402-069c222110c3 in datapath 9a9f536a-4201-4d67-a433-6077de86991e updated#033[00m Dec 6 05:26:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:06.753 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4372fc8b-381e-4e0c-a172-ae8fb0ab8c0e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:26:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:06.753 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a9f536a-4201-4d67-a433-6077de86991e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:26:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:06.754 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[bd842286-f232-435a-ae85-a5644d650e60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:26:06 localhost nova_compute[237281]: 2025-12-06 10:26:06.776 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:26:06 localhost systemd[1]: tmp-crun.YVHv1x.mount: Deactivated successfully. Dec 6 05:26:06 localhost systemd[1]: var-lib-containers-storage-overlay-3e29c4b8b319f701c2ac41df75023d8833b7a4449110fb6666f52ce64f20f161-merged.mount: Deactivated successfully. Dec 6 05:26:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-47905d95b90f66e7f27b41335bcf522c7e65481fd4710cfaedbee9ded65ffd92-userdata-shm.mount: Deactivated successfully. Dec 6 05:26:07 localhost nova_compute[237281]: 2025-12-06 10:26:07.025 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:26:07 localhost nova_compute[237281]: 2025-12-06 10:26:07.027 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12224MB free_disk=387.2660827636719GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:26:07 localhost nova_compute[237281]: 2025-12-06 10:26:07.027 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:26:07 localhost nova_compute[237281]: 2025-12-06 10:26:07.028 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:26:07 localhost nova_compute[237281]: 2025-12-06 10:26:07.120 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:26:07 localhost nova_compute[237281]: 2025-12-06 10:26:07.121 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:26:07 localhost nova_compute[237281]: 2025-12-06 10:26:07.121 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:26:07 localhost nova_compute[237281]: 2025-12-06 10:26:07.210 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:26:07 localhost nova_compute[237281]: 2025-12-06 10:26:07.233 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:26:07 localhost 
nova_compute[237281]: 2025-12-06 10:26:07.236 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:26:07 localhost nova_compute[237281]: 2025-12-06 10:26:07.236 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:26:07 localhost sshd[262369]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:26:07 localhost podman[262392]: Dec 6 05:26:07 localhost podman[262392]: 2025-12-06 10:26:07.610306305 +0000 UTC m=+0.086657090 container create 13f09730768bd78ddb350b2ad64c6f7ae2d67f717a0e3028ace85c1c7a48c870 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:26:07 localhost systemd[1]: Started libpod-conmon-13f09730768bd78ddb350b2ad64c6f7ae2d67f717a0e3028ace85c1c7a48c870.scope. Dec 6 05:26:07 localhost systemd[1]: Started libcrun container. 
Dec 6 05:26:07 localhost podman[262392]: 2025-12-06 10:26:07.568785366 +0000 UTC m=+0.045136191 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:26:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d343b0484d64f53b9ea2d32c08405b9e363e4dab4e8e4f703914be87f364e43/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:26:07 localhost podman[262392]: 2025-12-06 10:26:07.677612228 +0000 UTC m=+0.153963043 container init 13f09730768bd78ddb350b2ad64c6f7ae2d67f717a0e3028ace85c1c7a48c870 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:26:07 localhost podman[262392]: 2025-12-06 10:26:07.68804787 +0000 UTC m=+0.164398685 container start 13f09730768bd78ddb350b2ad64c6f7ae2d67f717a0e3028ace85c1c7a48c870 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125) Dec 6 05:26:07 localhost dnsmasq[262411]: started, version 2.85 cachesize 150 Dec 6 05:26:07 localhost dnsmasq[262411]: DNS service limited to local subnets Dec 6 05:26:07 localhost dnsmasq[262411]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n 
IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:26:07 localhost dnsmasq[262411]: warning: no upstream servers configured Dec 6 05:26:07 localhost dnsmasq-dhcp[262411]: DHCP, static leases only on 10.100.0.16, lease time 1d Dec 6 05:26:07 localhost dnsmasq[262411]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/addn_hosts - 0 addresses Dec 6 05:26:07 localhost dnsmasq-dhcp[262411]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/host Dec 6 05:26:07 localhost dnsmasq-dhcp[262411]: read /var/lib/neutron/dhcp/9a9f536a-4201-4d67-a433-6077de86991e/opts Dec 6 05:26:08 localhost podman[262429]: 2025-12-06 10:26:08.612347676 +0000 UTC m=+0.062104283 container kill 13f09730768bd78ddb350b2ad64c6f7ae2d67f717a0e3028ace85c1c7a48c870 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:26:08 localhost dnsmasq[262411]: exiting on receipt of SIGTERM Dec 6 05:26:08 localhost systemd[1]: libpod-13f09730768bd78ddb350b2ad64c6f7ae2d67f717a0e3028ace85c1c7a48c870.scope: Deactivated successfully. 
Dec 6 05:26:08 localhost podman[262442]: 2025-12-06 10:26:08.689165863 +0000 UTC m=+0.057860624 container died 13f09730768bd78ddb350b2ad64c6f7ae2d67f717a0e3028ace85c1c7a48c870 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 6 05:26:08 localhost systemd[1]: tmp-crun.EvoTPN.mount: Deactivated successfully. Dec 6 05:26:08 localhost podman[262442]: 2025-12-06 10:26:08.730323181 +0000 UTC m=+0.099017922 container cleanup 13f09730768bd78ddb350b2ad64c6f7ae2d67f717a0e3028ace85c1c7a48c870 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:26:08 localhost systemd[1]: libpod-conmon-13f09730768bd78ddb350b2ad64c6f7ae2d67f717a0e3028ace85c1c7a48c870.scope: Deactivated successfully. 
Dec 6 05:26:08 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:08.766 219384 INFO neutron.agent.dhcp.agent [None req-690f279b-6b0b-43c7-95d3-0d8d38099df2 - - - - - -] DHCP configuration for ports {'f586d69b-90ba-43e8-b402-069c222110c3', '26d09c5d-5602-417f-baf9-28f7a2e5060c', '7c5159c1-bdcd-437d-bee5-b12f4e51cd5f'} is completed#033[00m Dec 6 05:26:08 localhost podman[262443]: 2025-12-06 10:26:08.769114725 +0000 UTC m=+0.134190563 container remove 13f09730768bd78ddb350b2ad64c6f7ae2d67f717a0e3028ace85c1c7a48c870 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9a9f536a-4201-4d67-a433-6077de86991e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:26:08 localhost ovn_controller[131684]: 2025-12-06T10:26:08Z|00409|binding|INFO|Releasing lport 7c5159c1-bdcd-437d-bee5-b12f4e51cd5f from this chassis (sb_readonly=0) Dec 6 05:26:08 localhost ovn_controller[131684]: 2025-12-06T10:26:08Z|00410|binding|INFO|Setting lport 7c5159c1-bdcd-437d-bee5-b12f4e51cd5f down in Southbound Dec 6 05:26:08 localhost kernel: device tap7c5159c1-bd left promiscuous mode Dec 6 05:26:08 localhost nova_compute[237281]: 2025-12-06 10:26:08.835 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:08.845 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], 
options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28 10.100.0.35/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a9f536a-4201-4d67-a433-6077de86991e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f8ef38a4bec46d18248142804d6d2a3', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f6d270b4-86bf-4d3c-9534-fc16b2336e09, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=7c5159c1-bdcd-437d-bee5-b12f4e51cd5f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:26:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:08.847 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 7c5159c1-bdcd-437d-bee5-b12f4e51cd5f in datapath 9a9f536a-4201-4d67-a433-6077de86991e unbound from our chassis#033[00m Dec 6 05:26:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:08.850 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a9f536a-4201-4d67-a433-6077de86991e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:26:08 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:08.852 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[4f263d72-8475-4106-a7e8-1b2bd2c0d6c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m 
Dec 6 05:26:08 localhost nova_compute[237281]: 2025-12-06 10:26:08.854 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:08 localhost systemd[1]: var-lib-containers-storage-overlay-3d343b0484d64f53b9ea2d32c08405b9e363e4dab4e8e4f703914be87f364e43-merged.mount: Deactivated successfully. Dec 6 05:26:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-13f09730768bd78ddb350b2ad64c6f7ae2d67f717a0e3028ace85c1c7a48c870-userdata-shm.mount: Deactivated successfully. Dec 6 05:26:09 localhost nova_compute[237281]: 2025-12-06 10:26:09.272 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44544 DF PROTO=TCP SPT=35296 DPT=9102 SEQ=3744744917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE0D3870000000001030307) Dec 6 05:26:10 localhost systemd[1]: run-netns-qdhcp\x2d9a9f536a\x2d4201\x2d4d67\x2da433\x2d6077de86991e.mount: Deactivated successfully. Dec 6 05:26:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:10.041 219384 INFO neutron.agent.dhcp.agent [None req-05af263e-c8a4-42a1-af65-7b8e9fc46a14 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:26:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:26:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:26:10 localhost podman[262473]: 2025-12-06 10:26:10.155791232 +0000 UTC m=+0.092277593 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:26:10 localhost podman[262473]: 2025-12-06 10:26:10.166374149 +0000 UTC m=+0.102860470 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:26:10 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:26:10 localhost podman[262474]: 2025-12-06 10:26:10.257057272 +0000 UTC m=+0.186717892 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 05:26:10 localhost podman[262474]: 2025-12-06 10:26:10.299273022 +0000 UTC m=+0.228933632 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd) Dec 6 05:26:10 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:26:11 localhost neutron_sriov_agent[212548]: 2025-12-06 10:26:11.298 2 INFO neutron.agent.securitygroups_rpc [None req-2305586d-6496-4d25-a773-8f3befb5af81 79502b46c5234e6f991696408907262b 3f8ef38a4bec46d18248142804d6d2a3 - - default default] Security group member updated ['0c85205b-8f3f-4681-b537-0f7810642ab2']#033[00m Dec 6 05:26:11 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:11.355 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:26:11 localhost nova_compute[237281]: 2025-12-06 10:26:11.451 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:11 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:11.991 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:26:12 localhost ovn_controller[131684]: 2025-12-06T10:26:12Z|00411|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:26:12 localhost nova_compute[237281]: 2025-12-06 10:26:12.357 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:13 localhost nova_compute[237281]: 2025-12-06 10:26:12.999 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:13.000 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:26:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:13.001 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:26:13 localhost nova_compute[237281]: 2025-12-06 10:26:13.790 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:14 localhost nova_compute[237281]: 2025-12-06 10:26:14.273 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:26:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:26:14 localhost podman[262516]: 2025-12-06 10:26:14.560624078 +0000 UTC m=+0.089316232 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:26:14 localhost podman[262516]: 2025-12-06 10:26:14.594496371 +0000 UTC m=+0.123188575 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Dec 6 05:26:14 localhost systemd[1]: tmp-crun.mgQKu1.mount: Deactivated successfully. Dec 6 05:26:14 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:26:14 localhost podman[262517]: 2025-12-06 10:26:14.626391573 +0000 UTC m=+0.152762305 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251125, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:26:14 localhost podman[262517]: 2025-12-06 10:26:14.639543558 +0000 UTC m=+0.165914300 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 6 05:26:14 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:26:14 localhost sshd[262554]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:26:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:16.003 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:26:16 localhost openstack_network_exporter[199751]: ERROR 10:26:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:26:16 localhost openstack_network_exporter[199751]: ERROR 10:26:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:26:16 localhost openstack_network_exporter[199751]: ERROR 10:26:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for 
the ovs db server Dec 6 05:26:16 localhost openstack_network_exporter[199751]: ERROR 10:26:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:26:16 localhost openstack_network_exporter[199751]: Dec 6 05:26:16 localhost openstack_network_exporter[199751]: ERROR 10:26:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:26:16 localhost openstack_network_exporter[199751]: Dec 6 05:26:16 localhost nova_compute[237281]: 2025-12-06 10:26:16.455 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:16 localhost podman[262573]: 2025-12-06 10:26:16.996915702 +0000 UTC m=+0.052984792 container kill f9e16b64f5e49cc1a00e1a3fe53c36ba05030def6139f6d0528e99af8f31c660 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 6 05:26:16 localhost dnsmasq[260513]: exiting on receipt of SIGTERM Dec 6 05:26:16 localhost systemd[1]: libpod-f9e16b64f5e49cc1a00e1a3fe53c36ba05030def6139f6d0528e99af8f31c660.scope: Deactivated successfully. 
Dec 6 05:26:17 localhost podman[262586]: 2025-12-06 10:26:17.07251174 +0000 UTC m=+0.060768132 container died f9e16b64f5e49cc1a00e1a3fe53c36ba05030def6139f6d0528e99af8f31c660 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:26:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f9e16b64f5e49cc1a00e1a3fe53c36ba05030def6139f6d0528e99af8f31c660-userdata-shm.mount: Deactivated successfully. Dec 6 05:26:17 localhost podman[262586]: 2025-12-06 10:26:17.106106865 +0000 UTC m=+0.094363207 container cleanup f9e16b64f5e49cc1a00e1a3fe53c36ba05030def6139f6d0528e99af8f31c660 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 6 05:26:17 localhost systemd[1]: libpod-conmon-f9e16b64f5e49cc1a00e1a3fe53c36ba05030def6139f6d0528e99af8f31c660.scope: Deactivated successfully. 
Dec 6 05:26:17 localhost podman[262588]: 2025-12-06 10:26:17.157719215 +0000 UTC m=+0.134708540 container remove f9e16b64f5e49cc1a00e1a3fe53c36ba05030def6139f6d0528e99af8f31c660 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:26:17 localhost systemd[1]: var-lib-containers-storage-overlay-8c6294039cb6d6d67f95e65a251717bbbae50ef46b22e9ec6fb2da97270a688d-merged.mount: Deactivated successfully. Dec 6 05:26:18 localhost podman[262664]: Dec 6 05:26:18 localhost podman[262664]: 2025-12-06 10:26:18.154073542 +0000 UTC m=+0.108328348 container create ee2e383d738a8cd88c11a3011bf2dbde8c17fa553bdf7c750637292fff86141e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:26:18 localhost systemd[1]: Started libpod-conmon-ee2e383d738a8cd88c11a3011bf2dbde8c17fa553bdf7c750637292fff86141e.scope. Dec 6 05:26:18 localhost podman[262664]: 2025-12-06 10:26:18.105601458 +0000 UTC m=+0.059856284 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:26:18 localhost systemd[1]: Started libcrun container. 
Dec 6 05:26:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc7138826322603f0d8d2fe1bf7362ee38ef52b0b66b8d5cf7ffd602e212ae95/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:26:18 localhost podman[262664]: 2025-12-06 10:26:18.227413301 +0000 UTC m=+0.181668117 container init ee2e383d738a8cd88c11a3011bf2dbde8c17fa553bdf7c750637292fff86141e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:26:18 localhost nova_compute[237281]: 2025-12-06 10:26:18.232 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:26:18 localhost podman[262664]: 2025-12-06 10:26:18.236228762 +0000 UTC m=+0.190483568 container start ee2e383d738a8cd88c11a3011bf2dbde8c17fa553bdf7c750637292fff86141e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:26:18 localhost dnsmasq[262683]: started, version 2.85 cachesize 150 Dec 6 05:26:18 localhost 
dnsmasq[262683]: DNS service limited to local subnets Dec 6 05:26:18 localhost dnsmasq[262683]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:26:18 localhost dnsmasq[262683]: warning: no upstream servers configured Dec 6 05:26:18 localhost dnsmasq-dhcp[262683]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:26:18 localhost dnsmasq[262683]: read /var/lib/neutron/dhcp/f44bdb12-81c1-44f8-a5db-842c2a85fd29/addn_hosts - 0 addresses Dec 6 05:26:18 localhost dnsmasq-dhcp[262683]: read /var/lib/neutron/dhcp/f44bdb12-81c1-44f8-a5db-842c2a85fd29/host Dec 6 05:26:18 localhost dnsmasq-dhcp[262683]: read /var/lib/neutron/dhcp/f44bdb12-81c1-44f8-a5db-842c2a85fd29/opts Dec 6 05:26:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:18.487 219384 INFO neutron.agent.dhcp.agent [None req-a022a72e-e98e-41e4-ac09-87a28a3ca317 - - - - - -] DHCP configuration for ports {'cbf4fe17-83e2-4dfe-b81b-2ba1b56bd6de', '4bd798ac-b853-49a9-b1c3-d20c04e752f1'} is completed#033[00m Dec 6 05:26:18 localhost dnsmasq[262683]: exiting on receipt of SIGTERM Dec 6 05:26:18 localhost podman[262701]: 2025-12-06 10:26:18.719395013 +0000 UTC m=+0.061666651 container kill ee2e383d738a8cd88c11a3011bf2dbde8c17fa553bdf7c750637292fff86141e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:26:18 localhost systemd[1]: libpod-ee2e383d738a8cd88c11a3011bf2dbde8c17fa553bdf7c750637292fff86141e.scope: Deactivated successfully. 
Dec 6 05:26:18 localhost podman[262717]: 2025-12-06 10:26:18.797153778 +0000 UTC m=+0.054440298 container died ee2e383d738a8cd88c11a3011bf2dbde8c17fa553bdf7c750637292fff86141e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3) Dec 6 05:26:18 localhost podman[262717]: 2025-12-06 10:26:18.888286825 +0000 UTC m=+0.145573285 container remove ee2e383d738a8cd88c11a3011bf2dbde8c17fa553bdf7c750637292fff86141e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f44bdb12-81c1-44f8-a5db-842c2a85fd29, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 05:26:18 localhost systemd[1]: libpod-conmon-ee2e383d738a8cd88c11a3011bf2dbde8c17fa553bdf7c750637292fff86141e.scope: Deactivated successfully. 
Dec 6 05:26:18 localhost nova_compute[237281]: 2025-12-06 10:26:18.899 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:18 localhost kernel: device tapcbf4fe17-83 left promiscuous mode Dec 6 05:26:18 localhost ovn_controller[131684]: 2025-12-06T10:26:18Z|00412|binding|INFO|Releasing lport cbf4fe17-83e2-4dfe-b81b-2ba1b56bd6de from this chassis (sb_readonly=0) Dec 6 05:26:18 localhost ovn_controller[131684]: 2025-12-06T10:26:18Z|00413|binding|INFO|Setting lport cbf4fe17-83e2-4dfe-b81b-2ba1b56bd6de down in Southbound Dec 6 05:26:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:18.910 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f44bdb12-81c1-44f8-a5db-842c2a85fd29', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f44bdb12-81c1-44f8-a5db-842c2a85fd29', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f53d29120b44434860c4dafb30d2afc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d351122d-d4c7-44a8-9afa-63d8054d673b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cbf4fe17-83e2-4dfe-b81b-2ba1b56bd6de) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:26:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:18.912 137259 INFO neutron.agent.ovn.metadata.agent [-] Port cbf4fe17-83e2-4dfe-b81b-2ba1b56bd6de in datapath f44bdb12-81c1-44f8-a5db-842c2a85fd29 unbound from our chassis#033[00m Dec 6 05:26:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:18.915 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f44bdb12-81c1-44f8-a5db-842c2a85fd29, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:26:18 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:18.915 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[6f231b2b-5e93-4d6f-84bb-33f137bc8349]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:26:18 localhost nova_compute[237281]: 2025-12-06 10:26:18.919 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:18 localhost systemd[1]: var-lib-containers-storage-overlay-fc7138826322603f0d8d2fe1bf7362ee38ef52b0b66b8d5cf7ffd602e212ae95-merged.mount: Deactivated successfully. Dec 6 05:26:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee2e383d738a8cd88c11a3011bf2dbde8c17fa553bdf7c750637292fff86141e-userdata-shm.mount: Deactivated successfully. Dec 6 05:26:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:19.177 219384 INFO neutron.agent.dhcp.agent [None req-15b8cefe-8abb-4ba5-9124-89f42c667961 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:26:19 localhost systemd[1]: run-netns-qdhcp\x2df44bdb12\x2d81c1\x2d44f8\x2da5db\x2d842c2a85fd29.mount: Deactivated successfully. 
Dec 6 05:26:19 localhost nova_compute[237281]: 2025-12-06 10:26:19.274 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:21 localhost nova_compute[237281]: 2025-12-06 10:26:21.475 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:21 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:21.733 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:26:22 localhost neutron_sriov_agent[212548]: 2025-12-06 10:26:22.522 2 INFO neutron.agent.securitygroups_rpc [None req-ebc16e54-b622-4273-a4c2-0e4c8c372ba2 2e2429a207314407a8b8572df7927778 da648b133d3c45bcadbe795093c0fa10 - - default default] Security group member updated ['3b5cc3fc-91fe-4c82-a4a4-0b99a3df845f']#033[00m Dec 6 05:26:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:22.996 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:26:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:22.996 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 6 
05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.020 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 20660000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36d38c52-e1d2-4a06-859e-7deb14c4a127', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20660000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:26:22.997122', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '03401964-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.230067876, 'message_signature': 'e3dc2da52c63c0e56204c53b46b48a76d9c90a5398748c38af12e4bc22e20087'}]}, 'timestamp': '2025-12-06 10:26:23.021023', '_unique_id': 
'1aeaa9f8abe0410e81e60037c89b36a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) 
as conn: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.022 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.023 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.027 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84719c8f-c6b1-4778-9249-32f49b865758', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:26:23.024130', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0341264c-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.234240505, 'message_signature': '2c17689021392e531c1bd05262d375c250db2121f63da968c061caf9eb9d2ac2'}]}, 'timestamp': '2025-12-06 10:26:23.027837', '_unique_id': '4854e13bd07e4ad28d9a54178abd3dab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 
6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging 
self.gen.throw(typ, value, traceback) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.028 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.030 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.075 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.076 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '51c310ce-c90d-45b1-ace0-ac7c68de0ed7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:26:23.030205', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '034882e8-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.240332283, 'message_signature': '9758d44702703128834a503a5c4affb12e2499ace78e3aa4476c7635d9932a3c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:26:23.030205', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03489684-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.240332283, 'message_signature': '6302c44318bde2224ed2c7bef6f31412d2adf7dc195b88bf89ebfc2159d05bd0'}]}, 'timestamp': '2025-12-06 10:26:23.076552', '_unique_id': 'e5f62122a5ae49d295bcf739a50d28c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.077 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.079 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.096 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.097 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45eb7561-8732-401c-b42d-14058cf5ec0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:26:23.079357', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '034bbe68-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.289472007, 'message_signature': '1f03d60b79ccc84abfff83b46de77597fbbff03c0f7b227759a285d82db07404'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:26:23.079357', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '034bd07e-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.289472007, 'message_signature': 'ff7e832744106195488e1afed1ab7e8f0e637befad49c25fe49046ea8f9777b2'}]}, 'timestamp': '2025-12-06 10:26:23.097665', '_unique_id': '108c9861175b4d2c856caf1f20a99132'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.098 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.099 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0509b1f3-03e5-41de-938e-8b6c463a8536', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:26:23.100104', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '034c4144-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.234240505, 'message_signature': '91fb91cef074a170b46ec44de6988cb6692337c751e405ad238181787c02d615'}]}, 'timestamp': '2025-12-06 10:26:23.100609', '_unique_id': 'eb47567a93a64025811eb6e82392bf60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.101 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.102 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.103 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc183172-97d3-47f6-a975-edcd9343f788', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:26:23.102897', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '034cae36-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.240332283, 'message_signature': '05066b5c4295cd15b5c2eea85b5cf3d45f3d5abc52815cdaa2d1ea40d2e9d346'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:26:23.102897', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '034cbf8e-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.240332283, 'message_signature': '9510e24f3aa37203d617347396ff4178ad65d4f7a8aec603b394c0fa1a98570d'}]}, 'timestamp': '2025-12-06 10:26:23.103790', '_unique_id': '483ab21a96924d788442c4e1f1da2861'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.104 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:26:23.105 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.106 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a62908ed-c12a-4053-a7c1-da6339d0de44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:26:23.106099', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '034d2b2c-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.230067876, 'message_signature': 
'cfcf3a7f1ecb77f65fce8122f654ed2d3619f0575e7fcf0cb5ac671b76dc2285'}]}, 'timestamp': '2025-12-06 10:26:23.106555', '_unique_id': 'b2fcb058d60e4340ba1e8b4d43d8d4f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR 
oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.107 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2eb27ab1-1b6c-4f49-9e81-742847199270', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:26:23.108975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '034d9b52-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.289472007, 'message_signature': 'c60f03fed4c1bd6e6d0b5a2c811f3baa172f6d0a0b4b232d895d768ffd1d1a78'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:26:23.108975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 
'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '034dad0e-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.289472007, 'message_signature': 'd29e3871f8eb4815d6e5ca069dd2464c515dc3c7a95f1d2f75064577e14e9c2f'}]}, 'timestamp': '2025-12-06 10:26:23.109924', '_unique_id': '17d269d4ce264983805e26fc0ff19967'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.111 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.112 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.112 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '27f47462-b2bd-446c-bf28-0cf506da7a94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:26:23.112324', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '034e1e1a-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.240332283, 'message_signature': '327aa35070e90ad1150c9a6f1f064a881bee31b433d736b5a7440d8a4f9fc5b7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:26:23.112324', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '034e30a8-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.240332283, 'message_signature': '8b84ae968b4e28e41d024eee11d40eef690a12bc33d85b443059b1706c90d8c6'}]}, 'timestamp': '2025-12-06 10:26:23.113239', '_unique_id': '63e9a76fabba4831b4c1e63ce75a0d67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.115 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0d7a7cce-ea05-4d1b-b83e-87101f0a21e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:26:23.115516', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '034e9ade-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.234240505, 'message_signature': 'e0cf4e2f2e88977719d11a9f3f6e687dbf971b413208fb37ae1d1d32e54c6c13'}]}, 'timestamp': '2025-12-06 10:26:23.116042', '_unique_id': 'f494209b81b545a5817f754caf500875'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.116 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.118 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.118 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25fa2b08-0367-4a0c-a07e-790728a013bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:26:23.118497', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '034f0faa-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.234240505, 'message_signature': 'fb29e8e744c8f046248d51633c59235b899caab2c08bd3cedf492fd932e016cc'}]}, 'timestamp': '2025-12-06 10:26:23.119039', '_unique_id': 'd6b6cc04abcd4123ab35db15536ace77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.120 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.121 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.121 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.122 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf0b5de5-4271-4f29-a5ff-59264bf70e32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:26:23.121485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '034f8476-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.289472007, 'message_signature': '72d98a1c541d18a98a15ae0aadaf9f7b7b54571e98a9f7916afb3e18bd113293'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:26:23.121485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '034f9844-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.289472007, 'message_signature': '790c39fba0759cfe87191efcc3d245956bec2c5ac89fc0bce1b81d6dce1b8aa7'}]}, 'timestamp': '2025-12-06 10:26:23.122448', '_unique_id': '546575865a33470d87f54577813121aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.123 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.124 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a3e3dad-e93b-4453-b9cc-2fc364402aba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:26:23.124771', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '03500608-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.234240505, 'message_signature': '717c40e18cb44420cc68c15798b9bde5bef5773ad6fa116baa51edaec48e1dd1'}]}, 'timestamp': '2025-12-06 10:26:23.125306', '_unique_id': 'a0eb7df066364394afb6a918b57c29f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.126 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:26:23.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.127 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e23ad29-30fb-4656-8618-9688b5e2f636', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:26:23.127634', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0350757a-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.234240505, 'message_signature': '3453415945ab268ea7c674601132bc5c0539994f285ba71239fcbfa0b256dd00'}]}, 'timestamp': '2025-12-06 10:26:23.128138', '_unique_id': '60e514149c84488c8ae34a965ffe90cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.129 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.130 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd0e445f2-9e24-49d6-ba8f-64086643abfb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:26:23.130636', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0350eaf0-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.234240505, 'message_signature': '3b53d63f0bc973807b7df530940ec1852b1eeaa354b63f21bfe1128a3a65f1b8'}]}, 'timestamp': '2025-12-06 10:26:23.131145', '_unique_id': '3594e9612a284a40b5d8149ff6358bdc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:26:23.133 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.133 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61da9d38-ada5-4b7d-9f00-ac47019c1c3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:26:23.133425', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '035158f0-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.234240505, 'message_signature': '1fc22c2e7b5bf56a41e38e129ceae036f8ec1bb1e2bdc671c89a6205ea294b88'}]}, 'timestamp': '2025-12-06 10:26:23.133993', '_unique_id': '1e20d18fc9044aa8a99704acf4338d0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.134 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.136 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.136 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.137 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1bb3378f-b577-45ca-bed0-7a44b17adb83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:26:23.136478', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0351cf92-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.240332283, 'message_signature': '764bfe7b811f382038f1b9342a59d27b638de3caf06def31ddaf6b15960bf26c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:26:23.136478', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0351e1e4-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.240332283, 'message_signature': '7c9c2c7d9c708906a35f5e04b320a7483957f67b82cfcb6a421572781045e907'}]}, 'timestamp': '2025-12-06 10:26:23.137468', '_unique_id': '6e9ab82660b6475ca360b0926ccc37e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:26:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.138 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.139 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.139 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.140 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '9068cd3f-cb4e-46f1-918a-2a02b1e4b47c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:26:23.139825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03525408-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.240332283, 'message_signature': 'e3dfbcce379413b7433c2044e0f31fc562cade48ef97150f05a73d7daec8e7df'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:26:23.139825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0352648e-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.240332283, 'message_signature': '5509db7280a9d5a17ac613052ea73440d11af82cd31c08eeff6c12b08bcb7ef9'}]}, 'timestamp': '2025-12-06 10:26:23.140786', '_unique_id': '0bee14330bcc4c3b872423f16364d2d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:26:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.141 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.142 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5af5e772-1783-4ac2-981a-50c46801b373', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:26:23.143083', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '0352cc80-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.234240505, 'message_signature': 'e477558bf8d3abd85e9be94674854fe456c122b558268cec24ae05ff70091d41'}]}, 'timestamp': '2025-12-06 10:26:23.143381', '_unique_id': '3a7eb18fd666445bb9c3d4eb90ef7bb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.143 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.144 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.144 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57f36d92-da34-4938-ba22-6bd7e10d566d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:26:23.144698', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '03530c2c-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.234240505, 'message_signature': 'cb1b0325fb1be144572497261f6eea11e30991fce5c523f7287b04fa5b6c2bb9'}]}, 'timestamp': '2025-12-06 10:26:23.145012', '_unique_id': 'b9be83fe76c24afba3ef1ef2e7c127b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.145 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.146 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.146 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16395ee3-5974-4f7e-9a40-983bfdeaf4eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:26:23.146379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '035351aa-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.240332283, 'message_signature': '0ca67d088f7e9175da53282d7575ec2990b0f9250c80b62e400184c84536b212'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:26:23.146379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03535d1c-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13170.240332283, 'message_signature': 'ac04454c6b1876376e97606b2283252d0ed8092d193d0db499ef87b4713006d8'}]}, 'timestamp': '2025-12-06 10:26:23.147063', '_unique_id': 'cd4a0286c7d0407a9fae5a2b50521c20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.147 12 ERROR oslo_messaging.notify.messaging Dec 6 05:26:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:26:23.148 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:26:23 localhost podman[197801]: time="2025-12-06T10:26:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:26:23 localhost podman[197801]: @ - - [06/Dec/2025:10:26:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149619 "" "Go-http-client/1.1" Dec 6 05:26:23 localhost podman[197801]: @ - - [06/Dec/2025:10:26:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17409 "" "Go-http-client/1.1" Dec 6 05:26:23 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:23.453 
219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:26:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21504 DF PROTO=TCP SPT=56394 DPT=9102 SEQ=1756271886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE10C890000000001030307) Dec 6 05:26:24 localhost nova_compute[237281]: 2025-12-06 10:26:24.275 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:24 localhost ovn_controller[131684]: 2025-12-06T10:26:24Z|00414|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:26:24 localhost nova_compute[237281]: 2025-12-06 10:26:24.399 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:24 localhost sshd[262742]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:26:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21505 DF PROTO=TCP SPT=56394 DPT=9102 SEQ=1756271886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE110870000000001030307) Dec 6 05:26:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:26:25 localhost podman[262743]: 2025-12-06 10:26:25.170689856 +0000 UTC m=+0.094374047 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 6 05:26:25 localhost podman[262743]: 2025-12-06 10:26:25.181218901 +0000 UTC m=+0.104903112 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.) Dec 6 05:26:25 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:26:25 localhost nova_compute[237281]: 2025-12-06 10:26:25.707 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44545 DF PROTO=TCP SPT=35296 DPT=9102 SEQ=3744744917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE113870000000001030307) Dec 6 05:26:26 localhost nova_compute[237281]: 2025-12-06 10:26:26.478 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21506 DF PROTO=TCP SPT=56394 DPT=9102 SEQ=1756271886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE118880000000001030307) Dec 6 05:26:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42672 DF PROTO=TCP SPT=54696 DPT=9102 SEQ=1458137816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE11B880000000001030307) Dec 6 05:26:29 localhost nova_compute[237281]: 2025-12-06 10:26:29.276 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:26:29 localhost podman[262764]: 2025-12-06 10:26:29.547643002 +0000 UTC m=+0.084254725 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:26:29 localhost podman[262764]: 2025-12-06 10:26:29.555207025 +0000 UTC m=+0.091818698 container exec_died 
979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:26:29 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:26:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21507 DF PROTO=TCP SPT=56394 DPT=9102 SEQ=1756271886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE128470000000001030307) Dec 6 05:26:31 localhost dnsmasq[259619]: exiting on receipt of SIGTERM Dec 6 05:26:31 localhost podman[262804]: 2025-12-06 10:26:31.457966719 +0000 UTC m=+0.071088572 container kill 44f1529042ccfabe9bb5e1efb4aed30f81aa1758143902cc07b40494976678dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e4cbe437-288a-4b4c-96af-5e8fd0701fca, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:26:31 localhost systemd[1]: libpod-44f1529042ccfabe9bb5e1efb4aed30f81aa1758143902cc07b40494976678dd.scope: Deactivated successfully. 
Dec 6 05:26:31 localhost ovn_controller[131684]: 2025-12-06T10:26:31Z|00415|binding|INFO|Removing iface tapeee76c74-b6 ovn-installed in OVS Dec 6 05:26:31 localhost ovn_controller[131684]: 2025-12-06T10:26:31Z|00416|binding|INFO|Removing lport eee76c74-b6d4-4e9f-a0de-94e513d37252 ovn-installed in OVS Dec 6 05:26:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:31.479 137259 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 651e99d3-83ee-439e-8c5e-562cad43139d with type ""#033[00m Dec 6 05:26:31 localhost nova_compute[237281]: 2025-12-06 10:26:31.480 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:31 localhost nova_compute[237281]: 2025-12-06 10:26:31.483 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:31.484 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-e4cbe437-288a-4b4c-96af-5e8fd0701fca', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e4cbe437-288a-4b4c-96af-5e8fd0701fca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f53d29120b44434860c4dafb30d2afc', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 
'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=84ef3307-1b37-4f06-b442-7886961b9f5a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=eee76c74-b6d4-4e9f-a0de-94e513d37252) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:26:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:31.487 137259 INFO neutron.agent.ovn.metadata.agent [-] Port eee76c74-b6d4-4e9f-a0de-94e513d37252 in datapath e4cbe437-288a-4b4c-96af-5e8fd0701fca unbound from our chassis#033[00m Dec 6 05:26:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:31.488 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e4cbe437-288a-4b4c-96af-5e8fd0701fca or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:26:31 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:31.488 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1fd17c-1047-4850-9d96-4bf9b3293605]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:26:31 localhost podman[262820]: 2025-12-06 10:26:31.520098742 +0000 UTC m=+0.043736458 container died 44f1529042ccfabe9bb5e1efb4aed30f81aa1758143902cc07b40494976678dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e4cbe437-288a-4b4c-96af-5e8fd0701fca, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Dec 6 05:26:31 localhost systemd[1]: 
tmp-crun.6BCGm1.mount: Deactivated successfully. Dec 6 05:26:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44f1529042ccfabe9bb5e1efb4aed30f81aa1758143902cc07b40494976678dd-userdata-shm.mount: Deactivated successfully. Dec 6 05:26:31 localhost podman[262820]: 2025-12-06 10:26:31.570097182 +0000 UTC m=+0.093734828 container remove 44f1529042ccfabe9bb5e1efb4aed30f81aa1758143902cc07b40494976678dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e4cbe437-288a-4b4c-96af-5e8fd0701fca, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:26:31 localhost nova_compute[237281]: 2025-12-06 10:26:31.581 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:31 localhost kernel: device tapeee76c74-b6 left promiscuous mode Dec 6 05:26:31 localhost nova_compute[237281]: 2025-12-06 10:26:31.603 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:31 localhost systemd[1]: libpod-conmon-44f1529042ccfabe9bb5e1efb4aed30f81aa1758143902cc07b40494976678dd.scope: Deactivated successfully. 
Dec 6 05:26:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:31.629 219384 INFO neutron.agent.dhcp.agent [None req-1c0de00c-e129-490c-aa39-fb13bd252939 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:26:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:31.889 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:26:32 localhost systemd[1]: var-lib-containers-storage-overlay-e6d5183432df8ebd12795f2559dae0862b1ba50f4e0f5c399b1843fda3f4f86f-merged.mount: Deactivated successfully. Dec 6 05:26:32 localhost systemd[1]: run-netns-qdhcp\x2de4cbe437\x2d288a\x2d4b4c\x2d96af\x2d5e8fd0701fca.mount: Deactivated successfully. Dec 6 05:26:32 localhost ovn_controller[131684]: 2025-12-06T10:26:32Z|00417|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:26:32 localhost nova_compute[237281]: 2025-12-06 10:26:32.560 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:34 localhost nova_compute[237281]: 2025-12-06 10:26:34.278 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:34 localhost dnsmasq[259243]: exiting on receipt of SIGTERM Dec 6 05:26:34 localhost podman[262863]: 2025-12-06 10:26:34.663262287 +0000 UTC m=+0.061339410 container kill f4e5c3a29b410638213904a5bd0ec020ec895a21c4a759f3bf687fb85cb2a28f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74c85386-9889-4f05-8ee1-7bc7c702326e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:26:34 localhost systemd[1]: libpod-f4e5c3a29b410638213904a5bd0ec020ec895a21c4a759f3bf687fb85cb2a28f.scope: Deactivated successfully. Dec 6 05:26:34 localhost podman[262877]: 2025-12-06 10:26:34.729942801 +0000 UTC m=+0.045301187 container died f4e5c3a29b410638213904a5bd0ec020ec895a21c4a759f3bf687fb85cb2a28f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74c85386-9889-4f05-8ee1-7bc7c702326e, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:26:34 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f4e5c3a29b410638213904a5bd0ec020ec895a21c4a759f3bf687fb85cb2a28f-userdata-shm.mount: Deactivated successfully. Dec 6 05:26:34 localhost systemd[1]: var-lib-containers-storage-overlay-39304fa0c7f6a0227c4e5f6ec3b2f8d33274bf967834989817b323cb7c1ef7d8-merged.mount: Deactivated successfully. 
Dec 6 05:26:34 localhost podman[262877]: 2025-12-06 10:26:34.776110022 +0000 UTC m=+0.091468388 container remove f4e5c3a29b410638213904a5bd0ec020ec895a21c4a759f3bf687fb85cb2a28f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74c85386-9889-4f05-8ee1-7bc7c702326e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 6 05:26:34 localhost systemd[1]: libpod-conmon-f4e5c3a29b410638213904a5bd0ec020ec895a21c4a759f3bf687fb85cb2a28f.scope: Deactivated successfully.
Dec 6 05:26:34 localhost nova_compute[237281]: 2025-12-06 10:26:34.789 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:34 localhost kernel: device tap86e7098a-89 left promiscuous mode
Dec 6 05:26:34 localhost ovn_controller[131684]: 2025-12-06T10:26:34Z|00418|binding|INFO|Releasing lport 86e7098a-8913-455b-92b1-8787e15b4614 from this chassis (sb_readonly=0)
Dec 6 05:26:34 localhost ovn_controller[131684]: 2025-12-06T10:26:34Z|00419|binding|INFO|Setting lport 86e7098a-8913-455b-92b1-8787e15b4614 down in Southbound
Dec 6 05:26:34 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:34.804 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-74c85386-9889-4f05-8ee1-7bc7c702326e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74c85386-9889-4f05-8ee1-7bc7c702326e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f53d29120b44434860c4dafb30d2afc', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=838dad69-b153-4634-a903-eaada7d7209e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=86e7098a-8913-455b-92b1-8787e15b4614) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 6 05:26:34 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:34.806 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 86e7098a-8913-455b-92b1-8787e15b4614 in datapath 74c85386-9889-4f05-8ee1-7bc7c702326e unbound from our chassis#033[00m
Dec 6 05:26:34 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:34.807 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 74c85386-9889-4f05-8ee1-7bc7c702326e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Dec 6 05:26:34 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:34.808 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb86d5c-7cbd-48c6-a8c2-9203f3336091]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 6 05:26:34 localhost nova_compute[237281]: 2025-12-06 10:26:34.810 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:35.185 219384 INFO neutron.agent.dhcp.agent [None req-ab714611-64cf-478b-9a2f-c308d96d8a05 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Dec 6 05:26:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:35.186 219384 INFO neutron.agent.dhcp.agent [None req-ab714611-64cf-478b-9a2f-c308d96d8a05 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Dec 6 05:26:35 localhost systemd[1]: run-netns-qdhcp\x2d74c85386\x2d9889\x2d4f05\x2d8ee1\x2d7bc7c702326e.mount: Deactivated successfully.
Dec 6 05:26:35 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:35.569 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Dec 6 05:26:35 localhost ovn_controller[131684]: 2025-12-06T10:26:35Z|00420|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0)
Dec 6 05:26:35 localhost nova_compute[237281]: 2025-12-06 10:26:35.936 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.
Dec 6 05:26:36 localhost nova_compute[237281]: 2025-12-06 10:26:36.486 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:36 localhost podman[262902]: 2025-12-06 10:26:36.554540867 +0000 UTC m=+0.090126607 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 6 05:26:36 localhost podman[262902]: 2025-12-06 10:26:36.597779848 +0000 UTC m=+0.133365548 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 6 05:26:36 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully.
Dec 6 05:26:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:37.857 219384 INFO neutron.agent.linux.ip_lib [None req-d1ce633e-3519-4cd9-81d8-736d5e5233c8 - - - - - -] Device tapa82d853b-2c cannot be used as it has no MAC address#033[00m
Dec 6 05:26:37 localhost nova_compute[237281]: 2025-12-06 10:26:37.924 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:37 localhost kernel: device tapa82d853b-2c entered promiscuous mode
Dec 6 05:26:37 localhost NetworkManager[5965]: [1765016797.9354] manager: (tapa82d853b-2c): new Generic device (/org/freedesktop/NetworkManager/Devices/70)
Dec 6 05:26:37 localhost nova_compute[237281]: 2025-12-06 10:26:37.936 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:37 localhost ovn_controller[131684]: 2025-12-06T10:26:37Z|00421|binding|INFO|Claiming lport a82d853b-2c31-440a-be76-caee0ec53be8 for this chassis.
Dec 6 05:26:37 localhost ovn_controller[131684]: 2025-12-06T10:26:37Z|00422|binding|INFO|a82d853b-2c31-440a-be76-caee0ec53be8: Claiming unknown
Dec 6 05:26:37 localhost systemd-udevd[262937]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 05:26:37 localhost journal[186952]: ethtool ioctl error on tapa82d853b-2c: No such device
Dec 6 05:26:37 localhost journal[186952]: ethtool ioctl error on tapa82d853b-2c: No such device
Dec 6 05:26:37 localhost ovn_controller[131684]: 2025-12-06T10:26:37Z|00423|binding|INFO|Setting lport a82d853b-2c31-440a-be76-caee0ec53be8 ovn-installed in OVS
Dec 6 05:26:37 localhost ovn_controller[131684]: 2025-12-06T10:26:37Z|00424|binding|INFO|Setting lport a82d853b-2c31-440a-be76-caee0ec53be8 up in Southbound
Dec 6 05:26:37 localhost nova_compute[237281]: 2025-12-06 10:26:37.975 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:37 localhost journal[186952]: ethtool ioctl error on tapa82d853b-2c: No such device
Dec 6 05:26:37 localhost journal[186952]: ethtool ioctl error on tapa82d853b-2c: No such device
Dec 6 05:26:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:37.988 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-50f2b3b5-9097-4253-81de-7262f333580b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50f2b3b5-9097-4253-81de-7262f333580b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6e5f2aeaf52490d9822161edabfbbe5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=98bba69e-8154-4db6-b163-b0e314faaf12, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a82d853b-2c31-440a-be76-caee0ec53be8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 6 05:26:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:37.990 137259 INFO neutron.agent.ovn.metadata.agent [-] Port a82d853b-2c31-440a-be76-caee0ec53be8 in datapath 50f2b3b5-9097-4253-81de-7262f333580b bound to our chassis#033[00m
Dec 6 05:26:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:37.992 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 50f2b3b5-9097-4253-81de-7262f333580b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Dec 6 05:26:37 localhost journal[186952]: ethtool ioctl error on tapa82d853b-2c: No such device
Dec 6 05:26:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:37.994 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[4b7e26af-1c32-46f2-a4d2-a7a2eafadbeb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 6 05:26:37 localhost journal[186952]: ethtool ioctl error on tapa82d853b-2c: No such device
Dec 6 05:26:38 localhost journal[186952]: ethtool ioctl error on tapa82d853b-2c: No such device
Dec 6 05:26:38 localhost journal[186952]: ethtool ioctl error on tapa82d853b-2c: No such device
Dec 6 05:26:38 localhost nova_compute[237281]: 2025-12-06 10:26:38.033 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:38 localhost nova_compute[237281]: 2025-12-06 10:26:38.072 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:39 localhost nova_compute[237281]: 2025-12-06 10:26:39.280 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21508 DF PROTO=TCP SPT=56394 DPT=9102 SEQ=1756271886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE149880000000001030307)
Dec 6 05:26:39 localhost podman[263008]:
Dec 6 05:26:39 localhost podman[263008]: 2025-12-06 10:26:39.802750267 +0000 UTC m=+0.076364542 container create 73dc1335d00aa1204374dad6bdfcd53332b158451eb2595d9afb228a120ed9ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f2b3b5-9097-4253-81de-7262f333580b, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:26:39 localhost podman[263008]: 2025-12-06 10:26:39.762327503 +0000 UTC m=+0.035941768 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 6 05:26:39 localhost systemd[1]: Started libpod-conmon-73dc1335d00aa1204374dad6bdfcd53332b158451eb2595d9afb228a120ed9ec.scope.
Dec 6 05:26:39 localhost systemd[1]: tmp-crun.oH9zug.mount: Deactivated successfully.
Dec 6 05:26:39 localhost systemd[1]: Started libcrun container.
Dec 6 05:26:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ea643acb18264fc307add765ffa90b9e5bee6524aad94a3e24553599d927f47/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 6 05:26:39 localhost podman[263008]: 2025-12-06 10:26:39.89732519 +0000 UTC m=+0.170939465 container init 73dc1335d00aa1204374dad6bdfcd53332b158451eb2595d9afb228a120ed9ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f2b3b5-9097-4253-81de-7262f333580b, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 6 05:26:39 localhost podman[263008]: 2025-12-06 10:26:39.907985159 +0000 UTC m=+0.181599424 container start 73dc1335d00aa1204374dad6bdfcd53332b158451eb2595d9afb228a120ed9ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f2b3b5-9097-4253-81de-7262f333580b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:26:39 localhost dnsmasq[263026]: started, version 2.85 cachesize 150
Dec 6 05:26:39 localhost dnsmasq[263026]: DNS service limited to local subnets
Dec 6 05:26:39 localhost dnsmasq[263026]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 6 05:26:39 localhost dnsmasq[263026]: warning: no upstream servers configured
Dec 6 05:26:39 localhost dnsmasq-dhcp[263026]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 6 05:26:39 localhost dnsmasq[263026]: read /var/lib/neutron/dhcp/50f2b3b5-9097-4253-81de-7262f333580b/addn_hosts - 0 addresses
Dec 6 05:26:39 localhost dnsmasq-dhcp[263026]: read /var/lib/neutron/dhcp/50f2b3b5-9097-4253-81de-7262f333580b/host
Dec 6 05:26:39 localhost dnsmasq-dhcp[263026]: read /var/lib/neutron/dhcp/50f2b3b5-9097-4253-81de-7262f333580b/opts
Dec 6 05:26:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.
Dec 6 05:26:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.
Dec 6 05:26:40 localhost podman[263027]: 2025-12-06 10:26:40.558099342 +0000 UTC m=+0.083878325 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 6 05:26:40 localhost podman[263027]: 2025-12-06 10:26:40.572291479 +0000 UTC m=+0.098070452 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 6 05:26:40 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully.
Dec 6 05:26:40 localhost podman[263028]: 2025-12-06 10:26:40.661059953 +0000 UTC m=+0.184353459 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:26:40 localhost podman[263028]: 2025-12-06 10:26:40.676410665 +0000 UTC m=+0.199704151 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:26:40 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully.
Dec 6 05:26:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:40.857 219384 INFO neutron.agent.dhcp.agent [None req-698e7654-7356-4378-90e1-b6402e914514 - - - - - -] DHCP configuration for ports {'d8531e19-e82a-49f2-8e4b-726bbee9bfb3'} is completed#033[00m
Dec 6 05:26:41 localhost nova_compute[237281]: 2025-12-06 10:26:41.489 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:42 localhost dnsmasq[263026]: read /var/lib/neutron/dhcp/50f2b3b5-9097-4253-81de-7262f333580b/addn_hosts - 0 addresses
Dec 6 05:26:42 localhost dnsmasq-dhcp[263026]: read /var/lib/neutron/dhcp/50f2b3b5-9097-4253-81de-7262f333580b/host
Dec 6 05:26:42 localhost podman[263083]: 2025-12-06 10:26:42.176299761 +0000 UTC m=+0.050867198 container kill 73dc1335d00aa1204374dad6bdfcd53332b158451eb2595d9afb228a120ed9ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f2b3b5-9097-4253-81de-7262f333580b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 6 05:26:42 localhost dnsmasq-dhcp[263026]: read /var/lib/neutron/dhcp/50f2b3b5-9097-4253-81de-7262f333580b/opts
Dec 6 05:26:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:42.532 219384 INFO neutron.agent.dhcp.agent [None req-19e9a073-d93f-40d9-850a-ce705311a0b4 - - - - - -] DHCP configuration for ports {'a82d853b-2c31-440a-be76-caee0ec53be8', 'd8531e19-e82a-49f2-8e4b-726bbee9bfb3'} is completed#033[00m
Dec 6 05:26:42 localhost ovn_controller[131684]: 2025-12-06T10:26:42Z|00425|binding|INFO|Removing iface tapa82d853b-2c ovn-installed in OVS
Dec 6 05:26:42 localhost ovn_controller[131684]: 2025-12-06T10:26:42Z|00426|binding|INFO|Removing lport a82d853b-2c31-440a-be76-caee0ec53be8 ovn-installed in OVS
Dec 6 05:26:42 localhost nova_compute[237281]: 2025-12-06 10:26:42.919 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:42 localhost nova_compute[237281]: 2025-12-06 10:26:42.923 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:26:42 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:42.924 137259 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port dc05bd40-3177-4704-bc50-8b1077a59ebd with type ""#033[00m
Dec 6 05:26:42 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:42.926 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-50f2b3b5-9097-4253-81de-7262f333580b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50f2b3b5-9097-4253-81de-7262f333580b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6e5f2aeaf52490d9822161edabfbbe5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=98bba69e-8154-4db6-b163-b0e314faaf12, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a82d853b-2c31-440a-be76-caee0ec53be8) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 6 05:26:42 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:42.927 137259 INFO neutron.agent.ovn.metadata.agent [-] Port a82d853b-2c31-440a-be76-caee0ec53be8 in datapath 50f2b3b5-9097-4253-81de-7262f333580b unbound from our chassis#033[00m
Dec 6 05:26:42 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:42.930 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 50f2b3b5-9097-4253-81de-7262f333580b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 6 05:26:42 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:42.931 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[b59af48a-eb75-45d0-b4e4-af50b00c0dd8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 6 05:26:43 localhost dnsmasq[263026]: exiting on receipt of SIGTERM
Dec 6 05:26:43 localhost podman[263123]: 2025-12-06 10:26:43.061169314 +0000 UTC m=+0.066221270 container kill 73dc1335d00aa1204374dad6bdfcd53332b158451eb2595d9afb228a120ed9ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f2b3b5-9097-4253-81de-7262f333580b, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 6 05:26:43 localhost systemd[1]: libpod-73dc1335d00aa1204374dad6bdfcd53332b158451eb2595d9afb228a120ed9ec.scope: Deactivated successfully.
Dec 6 05:26:43 localhost podman[263136]: 2025-12-06 10:26:43.134106021 +0000 UTC m=+0.054044806 container died 73dc1335d00aa1204374dad6bdfcd53332b158451eb2595d9afb228a120ed9ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f2b3b5-9097-4253-81de-7262f333580b, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 6 05:26:43 localhost podman[263136]: 2025-12-06 10:26:43.166342913 +0000 UTC m=+0.086281678 container cleanup 73dc1335d00aa1204374dad6bdfcd53332b158451eb2595d9afb228a120ed9ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f2b3b5-9097-4253-81de-7262f333580b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:26:43 localhost systemd[1]: libpod-conmon-73dc1335d00aa1204374dad6bdfcd53332b158451eb2595d9afb228a120ed9ec.scope: Deactivated successfully.
Dec 6 05:26:43 localhost systemd[1]: var-lib-containers-storage-overlay-6ea643acb18264fc307add765ffa90b9e5bee6524aad94a3e24553599d927f47-merged.mount: Deactivated successfully.
Dec 6 05:26:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-73dc1335d00aa1204374dad6bdfcd53332b158451eb2595d9afb228a120ed9ec-userdata-shm.mount: Deactivated successfully.
Dec 6 05:26:43 localhost podman[263138]: 2025-12-06 10:26:43.21914063 +0000 UTC m=+0.128770977 container remove 73dc1335d00aa1204374dad6bdfcd53332b158451eb2595d9afb228a120ed9ec (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f2b3b5-9097-4253-81de-7262f333580b, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:26:43 localhost nova_compute[237281]: 2025-12-06 10:26:43.232 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:43 localhost kernel: device tapa82d853b-2c left promiscuous mode Dec 6 05:26:43 localhost nova_compute[237281]: 2025-12-06 10:26:43.250 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:43 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:43.273 219384 INFO neutron.agent.dhcp.agent [None req-39e490fd-d7cb-4139-8665-223b725de614 - - - - - -] Synchronizing state#033[00m Dec 6 05:26:43 localhost systemd[1]: run-netns-qdhcp\x2d50f2b3b5\x2d9097\x2d4253\x2d81de\x2d7262f333580b.mount: Deactivated successfully. 
Dec 6 05:26:43 localhost ovn_controller[131684]: 2025-12-06T10:26:43Z|00427|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:26:43 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:43.522 219384 INFO neutron.agent.dhcp.agent [None req-7ac8dc4e-0073-4b03-ba49-f59d1a486de0 - - - - - -] All active networks have been fetched through RPC.#033[00m Dec 6 05:26:43 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:43.525 219384 INFO neutron.agent.dhcp.agent [-] Starting network 50f2b3b5-9097-4253-81de-7262f333580b dhcp configuration#033[00m Dec 6 05:26:43 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:43.526 219384 INFO neutron.agent.dhcp.agent [-] Finished network 50f2b3b5-9097-4253-81de-7262f333580b dhcp configuration#033[00m Dec 6 05:26:43 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:43.527 219384 INFO neutron.agent.dhcp.agent [None req-7ac8dc4e-0073-4b03-ba49-f59d1a486de0 - - - - - -] Synchronizing state complete#033[00m Dec 6 05:26:43 localhost nova_compute[237281]: 2025-12-06 10:26:43.538 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:43 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:43.556 219384 INFO neutron.agent.dhcp.agent [None req-dfcd22e7-69e6-408a-acbd-2f5e1f4fec30 - - - - - -] DHCP configuration for ports {'d8531e19-e82a-49f2-8e4b-726bbee9bfb3'} is completed#033[00m Dec 6 05:26:44 localhost nova_compute[237281]: 2025-12-06 10:26:44.282 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:26:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:26:45 localhost podman[263168]: 2025-12-06 10:26:45.560304755 +0000 UTC m=+0.088122916 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3) Dec 6 05:26:45 localhost podman[263168]: 2025-12-06 10:26:45.596157138 +0000 UTC m=+0.123975319 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:26:45 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:26:45 localhost podman[263169]: 2025-12-06 10:26:45.62283049 +0000 UTC m=+0.144391378 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:26:45 localhost podman[263169]: 2025-12-06 10:26:45.664324019 +0000 UTC m=+0.185884887 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 05:26:45 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:26:46 localhost openstack_network_exporter[199751]: ERROR 10:26:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:26:46 localhost openstack_network_exporter[199751]: ERROR 10:26:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:26:46 localhost openstack_network_exporter[199751]: ERROR 10:26:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:26:46 localhost openstack_network_exporter[199751]: ERROR 10:26:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:26:46 localhost openstack_network_exporter[199751]: Dec 6 05:26:46 localhost openstack_network_exporter[199751]: ERROR 10:26:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:26:46 localhost openstack_network_exporter[199751]: Dec 6 05:26:46 localhost nova_compute[237281]: 2025-12-06 10:26:46.492 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:48 localhost neutron_sriov_agent[212548]: 2025-12-06 10:26:48.287 2 INFO neutron.agent.securitygroups_rpc [None req-1cd5be6e-088c-4bcf-8c4a-a55568830f32 c0e85d36c6044d9697993a40206d7832 78c73d786bd44858b8138f7bd26dbb60 - - default default] Security group member updated ['5ab7045d-e25f-45c1-856b-35518613c6cf']#033[00m Dec 6 05:26:48 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:48.548 219384 INFO neutron.agent.linux.ip_lib [None req-56a0efc2-c857-4c17-912b-44575ef04638 - - - - - -] Device tap461ccb2c-55 cannot be used as it has no MAC address#033[00m Dec 6 05:26:48 localhost nova_compute[237281]: 2025-12-06 10:26:48.609 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:48 localhost kernel: device tap461ccb2c-55 entered promiscuous mode Dec 6 05:26:48 localhost NetworkManager[5965]: [1765016808.6185] manager: (tap461ccb2c-55): new Generic device (/org/freedesktop/NetworkManager/Devices/71) Dec 6 05:26:48 localhost nova_compute[237281]: 2025-12-06 10:26:48.619 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:48 localhost ovn_controller[131684]: 2025-12-06T10:26:48Z|00428|binding|INFO|Claiming lport 461ccb2c-55a7-4297-af5b-1819809939e6 for this chassis. Dec 6 05:26:48 localhost ovn_controller[131684]: 2025-12-06T10:26:48Z|00429|binding|INFO|461ccb2c-55a7-4297-af5b-1819809939e6: Claiming unknown Dec 6 05:26:48 localhost systemd-udevd[263215]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:26:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:48.634 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feff:674d/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-845aa200-4a3e-4edd-8098-51f7b3728ba4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-845aa200-4a3e-4edd-8098-51f7b3728ba4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78c73d786bd44858b8138f7bd26dbb60', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab66d599-b489-427d-bb94-6ab2aee6b1cc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=461ccb2c-55a7-4297-af5b-1819809939e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:26:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:48.636 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 461ccb2c-55a7-4297-af5b-1819809939e6 in datapath 845aa200-4a3e-4edd-8098-51f7b3728ba4 bound to our chassis#033[00m Dec 6 05:26:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:48.639 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port d08c1aa8-c361-49ba-8a78-0f5026e863a1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:26:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:48.639 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 845aa200-4a3e-4edd-8098-51f7b3728ba4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:26:48 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:48.641 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c4331e-4413-4535-abec-410c809d520e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:26:48 localhost journal[186952]: ethtool ioctl error on tap461ccb2c-55: No such device Dec 6 05:26:48 localhost journal[186952]: ethtool ioctl error on tap461ccb2c-55: No such device Dec 6 05:26:48 localhost nova_compute[237281]: 2025-12-06 10:26:48.668 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:48 localhost journal[186952]: ethtool ioctl error on tap461ccb2c-55: No such device Dec 6 05:26:48 localhost ovn_controller[131684]: 2025-12-06T10:26:48Z|00430|binding|INFO|Setting lport 461ccb2c-55a7-4297-af5b-1819809939e6 ovn-installed in OVS Dec 6 05:26:48 localhost ovn_controller[131684]: 2025-12-06T10:26:48Z|00431|binding|INFO|Setting lport 461ccb2c-55a7-4297-af5b-1819809939e6 up in Southbound Dec 6 05:26:48 localhost nova_compute[237281]: 2025-12-06 10:26:48.673 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:48 localhost journal[186952]: ethtool ioctl error on tap461ccb2c-55: No such device Dec 6 05:26:48 localhost journal[186952]: ethtool ioctl error on tap461ccb2c-55: No such device Dec 6 05:26:48 localhost journal[186952]: ethtool ioctl error on tap461ccb2c-55: No such device 
Dec 6 05:26:48 localhost journal[186952]: ethtool ioctl error on tap461ccb2c-55: No such device Dec 6 05:26:48 localhost journal[186952]: ethtool ioctl error on tap461ccb2c-55: No such device Dec 6 05:26:48 localhost nova_compute[237281]: 2025-12-06 10:26:48.709 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:48 localhost nova_compute[237281]: 2025-12-06 10:26:48.739 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:49 localhost nova_compute[237281]: 2025-12-06 10:26:49.284 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:49 localhost podman[263286]: Dec 6 05:26:49 localhost podman[263286]: 2025-12-06 10:26:49.550827528 +0000 UTC m=+0.097577236 container create c70283982ef7db5c7c277d3991d5693b94410518182f2c3a61762dad583b2460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-845aa200-4a3e-4edd-8098-51f7b3728ba4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true) Dec 6 05:26:49 localhost systemd[1]: Started libpod-conmon-c70283982ef7db5c7c277d3991d5693b94410518182f2c3a61762dad583b2460.scope. Dec 6 05:26:49 localhost podman[263286]: 2025-12-06 10:26:49.504017236 +0000 UTC m=+0.050766984 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:26:49 localhost systemd[1]: Started libcrun container. 
Dec 6 05:26:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e78952d508f0c87d117d8c8819658a9acad0e7d242f6121960c7adb3c71996d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:26:49 localhost podman[263286]: 2025-12-06 10:26:49.630811762 +0000 UTC m=+0.177561500 container init c70283982ef7db5c7c277d3991d5693b94410518182f2c3a61762dad583b2460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-845aa200-4a3e-4edd-8098-51f7b3728ba4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 6 05:26:49 localhost podman[263286]: 2025-12-06 10:26:49.637319992 +0000 UTC m=+0.184069720 container start c70283982ef7db5c7c277d3991d5693b94410518182f2c3a61762dad583b2460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-845aa200-4a3e-4edd-8098-51f7b3728ba4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:26:49 localhost dnsmasq[263305]: started, version 2.85 cachesize 150 Dec 6 05:26:49 localhost dnsmasq[263305]: DNS service limited to local subnets Dec 6 05:26:49 localhost dnsmasq[263305]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:26:49 localhost dnsmasq[263305]: warning: no upstream servers configured Dec 
6 05:26:49 localhost dnsmasq[263305]: read /var/lib/neutron/dhcp/845aa200-4a3e-4edd-8098-51f7b3728ba4/addn_hosts - 0 addresses Dec 6 05:26:49 localhost ovn_controller[131684]: 2025-12-06T10:26:49Z|00432|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:26:49 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:49.707 219384 INFO neutron.agent.dhcp.agent [None req-56a0efc2-c857-4c17-912b-44575ef04638 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:26:48Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8717f785-85f3-497d-9a50-bdb0ceb3d281, ip_allocation=immediate, mac_address=fa:16:3e:b7:25:b3, name=tempest-NetworksIpV6TestAttrs-609518020, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:26:43Z, description=, dns_domain=, id=845aa200-4a3e-4edd-8098-51f7b3728ba4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksIpV6TestAttrs-test-network-938623466, port_security_enabled=True, project_id=78c73d786bd44858b8138f7bd26dbb60, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7786, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2353, status=ACTIVE, subnets=['8f756617-f2d0-4a18-a9d2-54270b119d9c'], tags=[], tenant_id=78c73d786bd44858b8138f7bd26dbb60, updated_at=2025-12-06T10:26:45Z, vlan_transparent=None, network_id=845aa200-4a3e-4edd-8098-51f7b3728ba4, port_security_enabled=True, project_id=78c73d786bd44858b8138f7bd26dbb60, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5ab7045d-e25f-45c1-856b-35518613c6cf'], 
standard_attr_id=2366, status=DOWN, tags=[], tenant_id=78c73d786bd44858b8138f7bd26dbb60, updated_at=2025-12-06T10:26:48Z on network 845aa200-4a3e-4edd-8098-51f7b3728ba4#033[00m Dec 6 05:26:49 localhost nova_compute[237281]: 2025-12-06 10:26:49.773 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:49 localhost dnsmasq[263305]: read /var/lib/neutron/dhcp/845aa200-4a3e-4edd-8098-51f7b3728ba4/addn_hosts - 1 addresses Dec 6 05:26:49 localhost podman[263324]: 2025-12-06 10:26:49.895099402 +0000 UTC m=+0.048974030 container kill c70283982ef7db5c7c277d3991d5693b94410518182f2c3a61762dad583b2460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-845aa200-4a3e-4edd-8098-51f7b3728ba4, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:26:49 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:49.907 219384 INFO neutron.agent.dhcp.agent [None req-6ab01411-aabf-475e-9390-d5798e0e423f - - - - - -] DHCP configuration for ports {'c26c91bb-ce6c-4761-82f2-c500b5aacc31'} is completed#033[00m Dec 6 05:26:50 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:50.195 219384 INFO neutron.agent.dhcp.agent [None req-3d292381-4fb4-4c58-9a1b-57ae514173ee - - - - - -] DHCP configuration for ports {'8717f785-85f3-497d-9a50-bdb0ceb3d281'} is completed#033[00m Dec 6 05:26:50 localhost dnsmasq[263305]: exiting on receipt of SIGTERM Dec 6 05:26:50 localhost podman[263364]: 2025-12-06 10:26:50.379949475 +0000 UTC m=+0.059314748 container kill c70283982ef7db5c7c277d3991d5693b94410518182f2c3a61762dad583b2460 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-845aa200-4a3e-4edd-8098-51f7b3728ba4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:26:50 localhost systemd[1]: libpod-c70283982ef7db5c7c277d3991d5693b94410518182f2c3a61762dad583b2460.scope: Deactivated successfully. Dec 6 05:26:50 localhost podman[263376]: 2025-12-06 10:26:50.447101583 +0000 UTC m=+0.050590439 container died c70283982ef7db5c7c277d3991d5693b94410518182f2c3a61762dad583b2460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-845aa200-4a3e-4edd-8098-51f7b3728ba4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:26:50 localhost podman[263376]: 2025-12-06 10:26:50.480028637 +0000 UTC m=+0.083517493 container cleanup c70283982ef7db5c7c277d3991d5693b94410518182f2c3a61762dad583b2460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-845aa200-4a3e-4edd-8098-51f7b3728ba4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Dec 6 05:26:50 localhost systemd[1]: 
libpod-conmon-c70283982ef7db5c7c277d3991d5693b94410518182f2c3a61762dad583b2460.scope: Deactivated successfully. Dec 6 05:26:50 localhost podman[263378]: 2025-12-06 10:26:50.533181844 +0000 UTC m=+0.130750678 container remove c70283982ef7db5c7c277d3991d5693b94410518182f2c3a61762dad583b2460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-845aa200-4a3e-4edd-8098-51f7b3728ba4, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 6 05:26:50 localhost kernel: device tap461ccb2c-55 left promiscuous mode Dec 6 05:26:50 localhost ovn_controller[131684]: 2025-12-06T10:26:50Z|00433|binding|INFO|Releasing lport 461ccb2c-55a7-4297-af5b-1819809939e6 from this chassis (sb_readonly=0) Dec 6 05:26:50 localhost ovn_controller[131684]: 2025-12-06T10:26:50Z|00434|binding|INFO|Setting lport 461ccb2c-55a7-4297-af5b-1819809939e6 down in Southbound Dec 6 05:26:50 localhost nova_compute[237281]: 2025-12-06 10:26:50.546 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:50 localhost systemd[1]: var-lib-containers-storage-overlay-4e78952d508f0c87d117d8c8819658a9acad0e7d242f6121960c7adb3c71996d-merged.mount: Deactivated successfully. Dec 6 05:26:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c70283982ef7db5c7c277d3991d5693b94410518182f2c3a61762dad583b2460-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:26:50 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:50.563 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feff:674d/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-845aa200-4a3e-4edd-8098-51f7b3728ba4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-845aa200-4a3e-4edd-8098-51f7b3728ba4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78c73d786bd44858b8138f7bd26dbb60', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab66d599-b489-427d-bb94-6ab2aee6b1cc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=461ccb2c-55a7-4297-af5b-1819809939e6) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:26:50 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:50.565 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 461ccb2c-55a7-4297-af5b-1819809939e6 in datapath 845aa200-4a3e-4edd-8098-51f7b3728ba4 unbound from our chassis#033[00m Dec 6 05:26:50 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:50.568 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 845aa200-4a3e-4edd-8098-51f7b3728ba4, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:26:50 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:50.568 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[ee41a9c1-b4bc-4738-bef8-75f24989db20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:26:50 localhost nova_compute[237281]: 2025-12-06 10:26:50.570 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:50 localhost nova_compute[237281]: 2025-12-06 10:26:50.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:26:51 localhost nova_compute[237281]: 2025-12-06 10:26:51.495 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:51 localhost nova_compute[237281]: 2025-12-06 10:26:51.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:26:52 localhost systemd[1]: run-netns-qdhcp\x2d845aa200\x2d4a3e\x2d4edd\x2d8098\x2d51f7b3728ba4.mount: Deactivated successfully. 
Dec 6 05:26:52 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:52.818 219384 INFO neutron.agent.dhcp.agent [None req-2b5d1eed-7c99-4911-b37b-1560ac21a402 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:26:53 localhost podman[197801]: time="2025-12-06T10:26:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:26:53 localhost podman[197801]: @ - - [06/Dec/2025:10:26:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 145978 "" "Go-http-client/1.1" Dec 6 05:26:53 localhost podman[197801]: @ - - [06/Dec/2025:10:26:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16449 "" "Go-http-client/1.1" Dec 6 05:26:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58530 DF PROTO=TCP SPT=32936 DPT=9102 SEQ=3636148708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE181B90000000001030307) Dec 6 05:26:54 localhost nova_compute[237281]: 2025-12-06 10:26:54.123 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:26:54 localhost nova_compute[237281]: 2025-12-06 10:26:54.124 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:26:54 localhost nova_compute[237281]: 2025-12-06 10:26:54.288 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58531 DF PROTO=TCP SPT=32936 DPT=9102 SEQ=3636148708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE185C70000000001030307) Dec 6 05:26:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:26:55 localhost podman[263408]: 2025-12-06 10:26:55.579112315 +0000 UTC m=+0.106949945 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64) Dec 6 05:26:55 localhost podman[263408]: 2025-12-06 10:26:55.597300304 +0000 UTC m=+0.125137934 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 6 05:26:55 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:26:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21509 DF PROTO=TCP SPT=56394 DPT=9102 SEQ=1756271886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE189870000000001030307) Dec 6 05:26:56 localhost nova_compute[237281]: 2025-12-06 10:26:56.498 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:56 localhost dnsmasq[259765]: exiting on receipt of SIGTERM Dec 6 05:26:56 localhost podman[263446]: 2025-12-06 10:26:56.828505386 +0000 UTC m=+0.091282613 container kill b89e5a4349cc10ede08ab5b467086334712d8a19a99071d60dcb9c36c84ca571 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7e32f0-c159-422c-a63a-fa11068940d1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 6 05:26:56 localhost systemd[1]: libpod-b89e5a4349cc10ede08ab5b467086334712d8a19a99071d60dcb9c36c84ca571.scope: Deactivated successfully. 
Dec 6 05:26:56 localhost nova_compute[237281]: 2025-12-06 10:26:56.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:26:56 localhost podman[263459]: 2025-12-06 10:26:56.905797377 +0000 UTC m=+0.061999692 container died b89e5a4349cc10ede08ab5b467086334712d8a19a99071d60dcb9c36c84ca571 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7e32f0-c159-422c-a63a-fa11068940d1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 05:26:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b89e5a4349cc10ede08ab5b467086334712d8a19a99071d60dcb9c36c84ca571-userdata-shm.mount: Deactivated successfully. Dec 6 05:26:56 localhost podman[263459]: 2025-12-06 10:26:56.943013293 +0000 UTC m=+0.099215568 container cleanup b89e5a4349cc10ede08ab5b467086334712d8a19a99071d60dcb9c36c84ca571 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7e32f0-c159-422c-a63a-fa11068940d1, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:26:56 localhost systemd[1]: libpod-conmon-b89e5a4349cc10ede08ab5b467086334712d8a19a99071d60dcb9c36c84ca571.scope: Deactivated successfully. 
Dec 6 05:26:56 localhost podman[263463]: 2025-12-06 10:26:56.997128219 +0000 UTC m=+0.140550240 container remove b89e5a4349cc10ede08ab5b467086334712d8a19a99071d60dcb9c36c84ca571 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7e32f0-c159-422c-a63a-fa11068940d1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:26:57 localhost kernel: device tapea684f74-23 left promiscuous mode Dec 6 05:26:57 localhost ovn_controller[131684]: 2025-12-06T10:26:57Z|00435|binding|INFO|Releasing lport ea684f74-2312-45ee-baba-757d986db059 from this chassis (sb_readonly=0) Dec 6 05:26:57 localhost ovn_controller[131684]: 2025-12-06T10:26:57Z|00436|binding|INFO|Setting lport ea684f74-2312-45ee-baba-757d986db059 down in Southbound Dec 6 05:26:57 localhost nova_compute[237281]: 2025-12-06 10:26:57.011 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:57 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:57.027 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-2c7e32f0-c159-422c-a63a-fa11068940d1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-2c7e32f0-c159-422c-a63a-fa11068940d1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6e5f2aeaf52490d9822161edabfbbe5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=09f56fbf-c4d1-4cbd-8dbf-48a2c2fccc3f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ea684f74-2312-45ee-baba-757d986db059) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:26:57 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:57.030 137259 INFO neutron.agent.ovn.metadata.agent [-] Port ea684f74-2312-45ee-baba-757d986db059 in datapath 2c7e32f0-c159-422c-a63a-fa11068940d1 unbound from our chassis#033[00m Dec 6 05:26:57 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:57.036 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2c7e32f0-c159-422c-a63a-fa11068940d1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:26:57 localhost nova_compute[237281]: 2025-12-06 10:26:57.039 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:57 localhost ovn_metadata_agent[137254]: 2025-12-06 10:26:57.038 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[3a713806-3f28-4daf-a1a3-1437388a9afb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:26:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58532 DF PROTO=TCP SPT=32936 DPT=9102 SEQ=3636148708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE18DC80000000001030307) Dec 6 05:26:57 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:57.631 219384 INFO neutron.agent.dhcp.agent [None req-3eb771b6-68c2-43da-a603-71f87bf9d192 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:26:57 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:57.632 219384 INFO neutron.agent.dhcp.agent [None req-3eb771b6-68c2-43da-a603-71f87bf9d192 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:26:57 localhost systemd[1]: var-lib-containers-storage-overlay-bc4632441900b912bf5493acfd6fff09d8e9e876d50dae2187461bc19ba120f3-merged.mount: Deactivated successfully. Dec 6 05:26:57 localhost systemd[1]: run-netns-qdhcp\x2d2c7e32f0\x2dc159\x2d422c\x2da63a\x2dfa11068940d1.mount: Deactivated successfully. Dec 6 05:26:57 localhost nova_compute[237281]: 2025-12-06 10:26:57.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:26:57 localhost nova_compute[237281]: 2025-12-06 10:26:57.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:26:57 localhost nova_compute[237281]: 2025-12-06 10:26:57.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 6 05:26:58 localhost 
kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44546 DF PROTO=TCP SPT=35296 DPT=9102 SEQ=3744744917 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE191870000000001030307) Dec 6 05:26:58 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:26:58.293 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:26:58 localhost nova_compute[237281]: 2025-12-06 10:26:58.961 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:26:58 localhost nova_compute[237281]: 2025-12-06 10:26:58.962 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:26:59 localhost ovn_controller[131684]: 2025-12-06T10:26:59Z|00437|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:26:59 localhost nova_compute[237281]: 2025-12-06 10:26:59.081 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:26:59 localhost nova_compute[237281]: 2025-12-06 10:26:59.290 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:27:00 localhost podman[263489]: 2025-12-06 10:27:00.578602713 +0000 UTC m=+0.105209992 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:27:00 localhost podman[263489]: 2025-12-06 10:27:00.612286231 +0000 UTC m=+0.138893490 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:27:00 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:27:00 localhost nova_compute[237281]: 2025-12-06 10:27:00.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:27:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58533 DF PROTO=TCP SPT=32936 DPT=9102 SEQ=3636148708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE19D880000000001030307) Dec 6 05:27:01 localhost nova_compute[237281]: 2025-12-06 10:27:01.530 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:01 localhost nova_compute[237281]: 2025-12-06 10:27:01.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:27:01 localhost nova_compute[237281]: 2025-12-06 10:27:01.888 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:27:01 localhost nova_compute[237281]: 2025-12-06 10:27:01.888 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:27:02 localhost nova_compute[237281]: 2025-12-06 10:27:02.997 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock 
"refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:27:02 localhost nova_compute[237281]: 2025-12-06 10:27:02.998 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:27:02 localhost nova_compute[237281]: 2025-12-06 10:27:02.998 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:27:02 localhost nova_compute[237281]: 2025-12-06 10:27:02.998 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:27:04 localhost neutron_sriov_agent[212548]: 2025-12-06 10:27:04.034 2 INFO neutron.agent.securitygroups_rpc [None req-1ffc2002-cf79-4cf1-aa4a-be01f0decbb4 c0e85d36c6044d9697993a40206d7832 78c73d786bd44858b8138f7bd26dbb60 - - default default] Security group member updated ['5ab7045d-e25f-45c1-856b-35518613c6cf']#033[00m Dec 6 05:27:04 localhost nova_compute[237281]: 2025-12-06 10:27:04.293 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:06 localhost nova_compute[237281]: 2025-12-06 10:27:06.534 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:06.711 137259 DEBUG 
oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:27:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:06.712 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:27:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:06.712 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:27:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:27:07 localhost podman[263511]: 2025-12-06 10:27:07.538062859 +0000 UTC m=+0.068005295 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 05:27:07 localhost podman[263511]: 2025-12-06 10:27:07.649503892 +0000 UTC m=+0.179446288 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 6 05:27:07 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:27:09 localhost nova_compute[237281]: 2025-12-06 10:27:09.297 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58534 DF PROTO=TCP SPT=32936 DPT=9102 SEQ=3636148708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE1BD870000000001030307) Dec 6 05:27:09 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:09.783 219384 INFO neutron.agent.linux.ip_lib [None req-f3fbac3b-39f4-4ef8-8c6c-8609e1f411b7 - - - - - -] Device tap0bf76b92-a0 cannot be used as it has no MAC address#033[00m Dec 6 05:27:09 localhost nova_compute[237281]: 2025-12-06 10:27:09.813 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:09 localhost kernel: device tap0bf76b92-a0 entered promiscuous mode Dec 6 05:27:09 localhost NetworkManager[5965]: [1765016829.8211] manager: (tap0bf76b92-a0): new Generic device (/org/freedesktop/NetworkManager/Devices/72) Dec 6 05:27:09 localhost nova_compute[237281]: 2025-12-06 10:27:09.821 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:09 localhost ovn_controller[131684]: 2025-12-06T10:27:09Z|00438|binding|INFO|Claiming lport 0bf76b92-a0bd-422d-af31-91d15aa2cd54 for this chassis. Dec 6 05:27:09 localhost ovn_controller[131684]: 2025-12-06T10:27:09Z|00439|binding|INFO|0bf76b92-a0bd-422d-af31-91d15aa2cd54: Claiming unknown Dec 6 05:27:09 localhost systemd-udevd[263546]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:27:09 localhost journal[186952]: ethtool ioctl error on tap0bf76b92-a0: No such device Dec 6 05:27:09 localhost journal[186952]: ethtool ioctl error on tap0bf76b92-a0: No such device Dec 6 05:27:09 localhost ovn_controller[131684]: 2025-12-06T10:27:09Z|00440|binding|INFO|Setting lport 0bf76b92-a0bd-422d-af31-91d15aa2cd54 ovn-installed in OVS Dec 6 05:27:09 localhost nova_compute[237281]: 2025-12-06 10:27:09.866 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:09 localhost journal[186952]: ethtool ioctl error on tap0bf76b92-a0: No such device Dec 6 05:27:09 localhost journal[186952]: ethtool ioctl error on tap0bf76b92-a0: No such device Dec 6 05:27:09 localhost journal[186952]: ethtool ioctl error on tap0bf76b92-a0: No such device Dec 6 05:27:09 localhost journal[186952]: ethtool ioctl error on tap0bf76b92-a0: No such device Dec 6 05:27:09 localhost journal[186952]: ethtool ioctl error on tap0bf76b92-a0: No such device Dec 6 05:27:09 localhost journal[186952]: ethtool ioctl error on tap0bf76b92-a0: No such device Dec 6 05:27:09 localhost nova_compute[237281]: 2025-12-06 10:27:09.904 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:09 localhost nova_compute[237281]: 2025-12-06 10:27:09.970 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:10 localhost podman[263617]: Dec 6 05:27:10 localhost podman[263617]: 2025-12-06 10:27:10.889404488 +0000 UTC m=+0.086359261 container create 4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-21e92d95-c8a1-4ccf-8e3f-294606fc7261, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:27:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:27:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:27:10 localhost systemd[1]: Started libpod-conmon-4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e.scope. Dec 6 05:27:10 localhost podman[263617]: 2025-12-06 10:27:10.85019521 +0000 UTC m=+0.047149913 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:27:10 localhost systemd[1]: Started libcrun container. Dec 6 05:27:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95067c65d9505423cf514e60a06ac59857bd1f2f1e79c08bcde3525cf96a6569/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:27:10 localhost podman[263617]: 2025-12-06 10:27:10.969729542 +0000 UTC m=+0.166684245 container init 4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-21e92d95-c8a1-4ccf-8e3f-294606fc7261, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:27:10 localhost dnsmasq[263657]: started, version 2.85 cachesize 150 Dec 6 05:27:10 localhost dnsmasq[263657]: DNS service limited to local 
subnets Dec 6 05:27:10 localhost dnsmasq[263657]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:27:10 localhost dnsmasq[263657]: warning: no upstream servers configured Dec 6 05:27:10 localhost dnsmasq-dhcp[263657]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 6 05:27:11 localhost podman[263617]: 2025-12-06 10:27:11.01871429 +0000 UTC m=+0.215668994 container start 4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-21e92d95-c8a1-4ccf-8e3f-294606fc7261, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true) Dec 6 05:27:11 localhost dnsmasq[263657]: read /var/lib/neutron/dhcp/21e92d95-c8a1-4ccf-8e3f-294606fc7261/addn_hosts - 0 addresses Dec 6 05:27:11 localhost dnsmasq-dhcp[263657]: read /var/lib/neutron/dhcp/21e92d95-c8a1-4ccf-8e3f-294606fc7261/host Dec 6 05:27:11 localhost dnsmasq-dhcp[263657]: read /var/lib/neutron/dhcp/21e92d95-c8a1-4ccf-8e3f-294606fc7261/opts Dec 6 05:27:11 localhost podman[263632]: 2025-12-06 10:27:11.096142046 +0000 UTC m=+0.151178928 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': 
{'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:27:11 localhost podman[263632]: 2025-12-06 10:27:11.106966419 +0000 UTC m=+0.162003271 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:27:11 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:27:11 localhost podman[263631]: 2025-12-06 10:27:11.20831848 +0000 UTC m=+0.273763322 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:27:11 localhost podman[263631]: 2025-12-06 10:27:11.220380712 +0000 UTC m=+0.285825634 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:27:11 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:27:11 localhost nova_compute[237281]: 2025-12-06 10:27:11.536 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:13 localhost ovn_controller[131684]: 2025-12-06T10:27:13Z|00441|binding|INFO|Setting lport 0bf76b92-a0bd-422d-af31-91d15aa2cd54 up in Southbound Dec 6 05:27:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:13.924 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-21e92d95-c8a1-4ccf-8e3f-294606fc7261', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-21e92d95-c8a1-4ccf-8e3f-294606fc7261', 'neutron:port_capabilities': '', 'neutron:port_name': '', 
'neutron:project_id': '077494e5b56a49e2a7c273a073b20032', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c8a331d-c7fd-4791-939c-e2225299c794, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0bf76b92-a0bd-422d-af31-91d15aa2cd54) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:27:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:13.928 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 0bf76b92-a0bd-422d-af31-91d15aa2cd54 in datapath 21e92d95-c8a1-4ccf-8e3f-294606fc7261 bound to our chassis#033[00m Dec 6 05:27:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:13.930 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 21e92d95-c8a1-4ccf-8e3f-294606fc7261 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:27:13 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:13.931 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[7f468c1d-3c60-42cd-9547-13145539c760]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:27:14 localhost nova_compute[237281]: 2025-12-06 10:27:14.301 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:14 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:14.587 219384 INFO neutron.agent.dhcp.agent [None req-4abceb0e-8e5f-4b47-8d08-ecc3d9304a5c - - - - - -] DHCP configuration for ports {'f1a43f40-ec5b-4bc2-a1ba-4f6b33b0a59b'} is completed#033[00m Dec 6 05:27:14 localhost 
neutron_dhcp_agent[219380]: 2025-12-06 10:27:14.642 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:27:09Z, description=, device_id=00956df1-b6b8-4bd0-a930-d2499479c932, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7b05b40c-fc39-4764-87e4-02c0b403c23f, ip_allocation=immediate, mac_address=fa:16:3e:2f:73:cb, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:27:01Z, description=, dns_domain=, id=21e92d95-c8a1-4ccf-8e3f-294606fc7261, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-759297711, port_security_enabled=True, project_id=077494e5b56a49e2a7c273a073b20032, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43642, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2375, status=ACTIVE, subnets=['55778863-ac8c-461f-8908-942003296999'], tags=[], tenant_id=077494e5b56a49e2a7c273a073b20032, updated_at=2025-12-06T10:27:04Z, vlan_transparent=None, network_id=21e92d95-c8a1-4ccf-8e3f-294606fc7261, port_security_enabled=False, project_id=077494e5b56a49e2a7c273a073b20032, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2395, status=DOWN, tags=[], tenant_id=077494e5b56a49e2a7c273a073b20032, updated_at=2025-12-06T10:27:09Z on network 21e92d95-c8a1-4ccf-8e3f-294606fc7261#033[00m Dec 6 05:27:14 localhost nova_compute[237281]: 2025-12-06 10:27:14.784 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:14 localhost 
ovn_metadata_agent[137254]: 2025-12-06 10:27:14.787 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:27:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:14.789 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:27:14 localhost dnsmasq[263657]: read /var/lib/neutron/dhcp/21e92d95-c8a1-4ccf-8e3f-294606fc7261/addn_hosts - 1 addresses Dec 6 05:27:14 localhost podman[263694]: 2025-12-06 10:27:14.871656359 +0000 UTC m=+0.064490638 container kill 4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-21e92d95-c8a1-4ccf-8e3f-294606fc7261, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Dec 6 05:27:14 localhost dnsmasq-dhcp[263657]: read /var/lib/neutron/dhcp/21e92d95-c8a1-4ccf-8e3f-294606fc7261/host Dec 6 05:27:14 localhost dnsmasq-dhcp[263657]: read /var/lib/neutron/dhcp/21e92d95-c8a1-4ccf-8e3f-294606fc7261/opts Dec 6 05:27:15 localhost 
neutron_dhcp_agent[219380]: 2025-12-06 10:27:15.188 219384 INFO neutron.agent.dhcp.agent [None req-2517cef8-cdaf-4026-8374-161e980c9e91 - - - - - -] DHCP configuration for ports {'7b05b40c-fc39-4764-87e4-02c0b403c23f'} is completed#033[00m Dec 6 05:27:16 localhost openstack_network_exporter[199751]: ERROR 10:27:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:27:16 localhost openstack_network_exporter[199751]: ERROR 10:27:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:27:16 localhost openstack_network_exporter[199751]: ERROR 10:27:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:27:16 localhost openstack_network_exporter[199751]: ERROR 10:27:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:27:16 localhost openstack_network_exporter[199751]: Dec 6 05:27:16 localhost openstack_network_exporter[199751]: ERROR 10:27:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:27:16 localhost openstack_network_exporter[199751]: Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.339 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.378 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.379 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.380 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.436 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.436 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.437 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.437 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:27:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:27:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.529 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.549 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:16 localhost podman[263716]: 2025-12-06 10:27:16.582874763 +0000 UTC m=+0.103287323 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible) Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.607 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.608 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:27:16 localhost podman[263716]: 2025-12-06 10:27:16.61461748 +0000 UTC 
m=+0.135030070 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:27:16 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.666 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.667 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:27:16 localhost podman[263717]: 2025-12-06 10:27:16.683173271 +0000 UTC m=+0.198087371 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 
'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 6 05:27:16 localhost podman[263717]: 2025-12-06 10:27:16.722472611 +0000 UTC m=+0.237386711 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute) Dec 6 05:27:16 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.761 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.763 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:27:16 localhost nova_compute[237281]: 2025-12-06 10:27:16.854 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:27:16 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:16.876 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:27:09Z, description=, device_id=00956df1-b6b8-4bd0-a930-d2499479c932, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], 
fixed_ips=[], id=7b05b40c-fc39-4764-87e4-02c0b403c23f, ip_allocation=immediate, mac_address=fa:16:3e:2f:73:cb, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:27:01Z, description=, dns_domain=, id=21e92d95-c8a1-4ccf-8e3f-294606fc7261, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-759297711, port_security_enabled=True, project_id=077494e5b56a49e2a7c273a073b20032, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43642, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2375, status=ACTIVE, subnets=['55778863-ac8c-461f-8908-942003296999'], tags=[], tenant_id=077494e5b56a49e2a7c273a073b20032, updated_at=2025-12-06T10:27:04Z, vlan_transparent=None, network_id=21e92d95-c8a1-4ccf-8e3f-294606fc7261, port_security_enabled=False, project_id=077494e5b56a49e2a7c273a073b20032, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2395, status=DOWN, tags=[], tenant_id=077494e5b56a49e2a7c273a073b20032, updated_at=2025-12-06T10:27:09Z on network 21e92d95-c8a1-4ccf-8e3f-294606fc7261#033[00m Dec 6 05:27:17 localhost nova_compute[237281]: 2025-12-06 10:27:17.014 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:27:17 localhost nova_compute[237281]: 2025-12-06 10:27:17.016 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12254MB free_disk=387.2635726928711GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:27:17 localhost nova_compute[237281]: 2025-12-06 10:27:17.016 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:27:17 localhost nova_compute[237281]: 2025-12-06 10:27:17.017 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:27:17 localhost dnsmasq[263657]: read /var/lib/neutron/dhcp/21e92d95-c8a1-4ccf-8e3f-294606fc7261/addn_hosts - 1 addresses Dec 6 05:27:17 localhost dnsmasq-dhcp[263657]: read /var/lib/neutron/dhcp/21e92d95-c8a1-4ccf-8e3f-294606fc7261/host Dec 6 05:27:17 localhost dnsmasq-dhcp[263657]: read /var/lib/neutron/dhcp/21e92d95-c8a1-4ccf-8e3f-294606fc7261/opts Dec 6 05:27:17 localhost podman[263781]: 2025-12-06 10:27:17.055904791 +0000 UTC m=+0.050952331 container kill 4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-21e92d95-c8a1-4ccf-8e3f-294606fc7261, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:27:17 localhost neutron_sriov_agent[212548]: 2025-12-06 10:27:17.079 2 INFO neutron.agent.securitygroups_rpc [None req-ff0de34e-711e-4876-8a81-0a9fa5840d47 f255b59cfdc54c808094207c037d327f 1d62107f03194685a0d4a3a8f59ce292 - - default default] Security group member updated ['79b7e845-f7b6-4058-9d43-847a6176d91c']#033[00m Dec 6 05:27:17 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:17.434 219384 INFO neutron.agent.dhcp.agent [None req-fb371bff-4eca-4b08-91f0-232706fd274f - - - - - -] DHCP configuration for ports {'7b05b40c-fc39-4764-87e4-02c0b403c23f'} is completed#033[00m Dec 6 05:27:17 localhost nova_compute[237281]: 2025-12-06 10:27:17.518 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:27:17 localhost nova_compute[237281]: 2025-12-06 10:27:17.520 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:27:17 localhost nova_compute[237281]: 2025-12-06 10:27:17.521 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:27:17 localhost nova_compute[237281]: 2025-12-06 10:27:17.677 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing inventories for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 6 05:27:18 localhost nova_compute[237281]: 2025-12-06 10:27:18.595 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:18 localhost nova_compute[237281]: 2025-12-06 10:27:18.684 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating ProviderTree inventory for provider db8b39ad-af52-43e3-99e2-f3c431f03241 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 
'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 6 05:27:18 localhost nova_compute[237281]: 2025-12-06 10:27:18.685 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Updating inventory in ProviderTree for provider db8b39ad-af52-43e3-99e2-f3c431f03241 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 6 05:27:18 localhost nova_compute[237281]: 2025-12-06 10:27:18.716 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing aggregate associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 6 05:27:18 localhost nova_compute[237281]: 2025-12-06 10:27:18.765 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Refreshing trait associations for resource provider db8b39ad-af52-43e3-99e2-f3c431f03241, traits: 
COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_ABM,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_BMI,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 6 05:27:18 localhost nova_compute[237281]: 2025-12-06 10:27:18.837 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:27:18 localhost nova_compute[237281]: 2025-12-06 10:27:18.873 237285 DEBUG nova.scheduler.client.report [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:27:18 localhost nova_compute[237281]: 2025-12-06 10:27:18.877 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:27:18 localhost nova_compute[237281]: 2025-12-06 10:27:18.877 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.861s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:27:18 localhost nova_compute[237281]: 2025-12-06 10:27:18.878 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:27:18 localhost nova_compute[237281]: 2025-12-06 10:27:18.878 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 6 05:27:18 localhost 
nova_compute[237281]: 2025-12-06 10:27:18.905 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 6 05:27:19 localhost nova_compute[237281]: 2025-12-06 10:27:19.303 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:20.193 219384 INFO neutron.agent.linux.ip_lib [None req-e044f505-6b62-4ce4-95d5-362856a4c52f - - - - - -] Device tap59992555-4b cannot be used as it has no MAC address#033[00m Dec 6 05:27:20 localhost nova_compute[237281]: 2025-12-06 10:27:20.224 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:20 localhost kernel: device tap59992555-4b entered promiscuous mode Dec 6 05:27:20 localhost NetworkManager[5965]: [1765016840.2349] manager: (tap59992555-4b): new Generic device (/org/freedesktop/NetworkManager/Devices/73) Dec 6 05:27:20 localhost ovn_controller[131684]: 2025-12-06T10:27:20Z|00442|binding|INFO|Claiming lport 59992555-4bbe-435e-9eaf-0d00fa326c19 for this chassis. Dec 6 05:27:20 localhost ovn_controller[131684]: 2025-12-06T10:27:20Z|00443|binding|INFO|59992555-4bbe-435e-9eaf-0d00fa326c19: Claiming unknown Dec 6 05:27:20 localhost nova_compute[237281]: 2025-12-06 10:27:20.236 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:20 localhost systemd-udevd[263813]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:27:20 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:20.249 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-d2b21c6f-0ea5-46e8-8536-0b3264253aa0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2b21c6f-0ea5-46e8-8536-0b3264253aa0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08bd06af2dd148eaa3f0a5c4a8d7f98c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4d87aa46-7e01-49f3-931b-b8bb01e76d39, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=59992555-4bbe-435e-9eaf-0d00fa326c19) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:27:20 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:20.252 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 59992555-4bbe-435e-9eaf-0d00fa326c19 in datapath d2b21c6f-0ea5-46e8-8536-0b3264253aa0 bound to our chassis#033[00m Dec 6 05:27:20 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:20.254 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d2b21c6f-0ea5-46e8-8536-0b3264253aa0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:27:20 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:20.255 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[6a96d8e2-56cc-478e-88c3-e035a26b40ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:27:20 localhost journal[186952]: ethtool ioctl error on tap59992555-4b: No such device Dec 6 05:27:20 localhost ovn_controller[131684]: 2025-12-06T10:27:20Z|00444|binding|INFO|Setting lport 59992555-4bbe-435e-9eaf-0d00fa326c19 ovn-installed in OVS Dec 6 05:27:20 localhost ovn_controller[131684]: 2025-12-06T10:27:20Z|00445|binding|INFO|Setting lport 59992555-4bbe-435e-9eaf-0d00fa326c19 up in Southbound Dec 6 05:27:20 localhost nova_compute[237281]: 2025-12-06 10:27:20.280 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:20 localhost journal[186952]: ethtool ioctl error on tap59992555-4b: No such device Dec 6 05:27:20 localhost journal[186952]: ethtool ioctl error on tap59992555-4b: No such device Dec 6 05:27:20 localhost journal[186952]: ethtool ioctl error on tap59992555-4b: No such device Dec 6 05:27:20 localhost journal[186952]: ethtool ioctl error on tap59992555-4b: No such device Dec 6 05:27:20 localhost journal[186952]: ethtool ioctl error on tap59992555-4b: No such device Dec 6 05:27:20 localhost journal[186952]: ethtool ioctl error on tap59992555-4b: No such device Dec 6 05:27:20 localhost journal[186952]: ethtool ioctl error on tap59992555-4b: No such device Dec 6 05:27:20 localhost nova_compute[237281]: 2025-12-06 10:27:20.338 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:20 localhost nova_compute[237281]: 2025-12-06 10:27:20.376 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:20 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:20.791 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:27:21 localhost nova_compute[237281]: 2025-12-06 10:27:21.542 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:21 localhost podman[263885]: Dec 6 05:27:21 localhost podman[263885]: 2025-12-06 10:27:21.806530697 +0000 UTC m=+0.089299932 container create 7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d2b21c6f-0ea5-46e8-8536-0b3264253aa0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 6 05:27:21 localhost podman[263885]: 2025-12-06 10:27:21.754704151 +0000 UTC m=+0.037473426 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:27:21 localhost systemd[1]: Started libpod-conmon-7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57.scope. Dec 6 05:27:21 localhost systemd[1]: Started libcrun container. 
Dec 6 05:27:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/639dfb595c9229cd85f7b1f45e73ee24ac09c1f89fbd8fa49f97cbad0c18523b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 6 05:27:21 localhost podman[263885]: 2025-12-06 10:27:21.891502374 +0000 UTC m=+0.174271579 container init 7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d2b21c6f-0ea5-46e8-8536-0b3264253aa0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 6 05:27:21 localhost podman[263885]: 2025-12-06 10:27:21.899532431 +0000 UTC m=+0.182301636 container start 7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d2b21c6f-0ea5-46e8-8536-0b3264253aa0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 6 05:27:21 localhost dnsmasq[263903]: started, version 2.85 cachesize 150
Dec 6 05:27:21 localhost dnsmasq[263903]: DNS service limited to local subnets
Dec 6 05:27:21 localhost dnsmasq[263903]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 6 05:27:21 localhost dnsmasq[263903]: warning: no upstream servers configured
Dec 6 05:27:21 localhost dnsmasq-dhcp[263903]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 6 05:27:21 localhost dnsmasq[263903]: read /var/lib/neutron/dhcp/d2b21c6f-0ea5-46e8-8536-0b3264253aa0/addn_hosts - 0 addresses
Dec 6 05:27:21 localhost dnsmasq-dhcp[263903]: read /var/lib/neutron/dhcp/d2b21c6f-0ea5-46e8-8536-0b3264253aa0/host
Dec 6 05:27:21 localhost dnsmasq-dhcp[263903]: read /var/lib/neutron/dhcp/d2b21c6f-0ea5-46e8-8536-0b3264253aa0/opts
Dec 6 05:27:22 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:22.615 219384 INFO neutron.agent.dhcp.agent [None req-aa6318c3-8cbf-4756-b5ea-8b0c94fc8482 - - - - - -] DHCP configuration for ports {'98b63cba-4377-44e6-a3bb-d87e71df44e3'} is completed#033[00m
Dec 6 05:27:23 localhost podman[197801]: time="2025-12-06T10:27:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 6 05:27:23 localhost podman[197801]: @ - - [06/Dec/2025:10:27:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147790 "" "Go-http-client/1.1"
Dec 6 05:27:23 localhost podman[197801]: @ - - [06/Dec/2025:10:27:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16921 "" "Go-http-client/1.1"
Dec 6 05:27:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3970 DF PROTO=TCP SPT=59696 DPT=9102 SEQ=1362246866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE1F6E90000000001030307)
Dec 6 05:27:24 localhost nova_compute[237281]: 2025-12-06 10:27:24.308 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3971 DF PROTO=TCP SPT=59696 DPT=9102 SEQ=1362246866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE1FB070000000001030307)
Dec 6 05:27:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.
Dec 6 05:27:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58535 DF PROTO=TCP SPT=32936 DPT=9102 SEQ=3636148708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE1FD870000000001030307)
Dec 6 05:27:25 localhost podman[263906]: 2025-12-06 10:27:25.764286322 +0000 UTC m=+0.102867350 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 6 05:27:25 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:25.764 219384 INFO neutron.agent.linux.ip_lib [None req-21ccac83-5fe5-45c4-bf11-3345787a3fcf - - - - - -] Device tap1f25a9ec-ae cannot be used as it has no MAC address#033[00m
Dec 6 05:27:25 localhost podman[263906]: 2025-12-06 10:27:25.78209554 +0000 UTC m=+0.120676568 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Dec 6 05:27:25 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully.
Dec 6 05:27:25 localhost nova_compute[237281]: 2025-12-06 10:27:25.808 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:25 localhost kernel: device tap1f25a9ec-ae entered promiscuous mode
Dec 6 05:27:25 localhost ovn_controller[131684]: 2025-12-06T10:27:25Z|00446|binding|INFO|Claiming lport 1f25a9ec-aeb6-44de-982b-5f035947bcf9 for this chassis.
Dec 6 05:27:25 localhost ovn_controller[131684]: 2025-12-06T10:27:25Z|00447|binding|INFO|1f25a9ec-aeb6-44de-982b-5f035947bcf9: Claiming unknown
Dec 6 05:27:25 localhost NetworkManager[5965]: [1765016845.8197] manager: (tap1f25a9ec-ae): new Generic device (/org/freedesktop/NetworkManager/Devices/74)
Dec 6 05:27:25 localhost nova_compute[237281]: 2025-12-06 10:27:25.817 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:25 localhost systemd-udevd[263934]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 05:27:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:25.842 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb5:55cf/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f0406981-d284-4d73-b531-593dfcba6b7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0406981-d284-4d73-b531-593dfcba6b7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78c73d786bd44858b8138f7bd26dbb60', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a0db9ba-2d6b-4e2b-ac43-15bb0eb4810c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1f25a9ec-aeb6-44de-982b-5f035947bcf9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 6 05:27:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:25.845 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 1f25a9ec-aeb6-44de-982b-5f035947bcf9 in datapath f0406981-d284-4d73-b531-593dfcba6b7c bound to our chassis#033[00m
Dec 6 05:27:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:25.848 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port a9ba3bd7-99b9-4581-9d4e-6f5c81447cb9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Dec 6 05:27:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:25.849 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f0406981-d284-4d73-b531-593dfcba6b7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 6 05:27:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:25.850 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[f8490e8a-b161-4225-835b-a1997cdb4669]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 6 05:27:25 localhost ovn_controller[131684]: 2025-12-06T10:27:25Z|00448|binding|INFO|Setting lport 1f25a9ec-aeb6-44de-982b-5f035947bcf9 ovn-installed in OVS
Dec 6 05:27:25 localhost ovn_controller[131684]: 2025-12-06T10:27:25Z|00449|binding|INFO|Setting lport 1f25a9ec-aeb6-44de-982b-5f035947bcf9 up in Southbound
Dec 6 05:27:25 localhost nova_compute[237281]: 2025-12-06 10:27:25.866 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:25 localhost nova_compute[237281]: 2025-12-06 10:27:25.912 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:25 localhost nova_compute[237281]: 2025-12-06 10:27:25.947 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:26 localhost ovn_controller[131684]: 2025-12-06T10:27:26Z|00450|binding|INFO|Removing iface tap1f25a9ec-ae ovn-installed in OVS
Dec 6 05:27:26 localhost ovn_controller[131684]: 2025-12-06T10:27:26Z|00451|binding|INFO|Removing lport 1f25a9ec-aeb6-44de-982b-5f035947bcf9 ovn-installed in OVS
Dec 6 05:27:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:26.257 137259 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port a9ba3bd7-99b9-4581-9d4e-6f5c81447cb9 with type ""#033[00m
Dec 6 05:27:26 localhost nova_compute[237281]: 2025-12-06 10:27:26.258 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:26.259 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb5:55cf/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-f0406981-d284-4d73-b531-593dfcba6b7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0406981-d284-4d73-b531-593dfcba6b7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78c73d786bd44858b8138f7bd26dbb60', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a0db9ba-2d6b-4e2b-ac43-15bb0eb4810c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1f25a9ec-aeb6-44de-982b-5f035947bcf9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 6 05:27:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:26.262 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 1f25a9ec-aeb6-44de-982b-5f035947bcf9 in datapath f0406981-d284-4d73-b531-593dfcba6b7c unbound from our chassis#033[00m
Dec 6 05:27:26 localhost nova_compute[237281]: 2025-12-06 10:27:26.264 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:26.265 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f0406981-d284-4d73-b531-593dfcba6b7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 6 05:27:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:26.267 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[65c247ec-e1ff-40b9-9764-00ad3fa4d2a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 6 05:27:26 localhost nova_compute[237281]: 2025-12-06 10:27:26.545 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:26 localhost podman[263987]:
Dec 6 05:27:26 localhost podman[263987]: 2025-12-06 10:27:26.841586502 +0000 UTC m=+0.096433301 container create 3f800ab5da0de0310cabcd9b3ace072c00abcdbf804a784b3029f44890d0be0f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0406981-d284-4d73-b531-593dfcba6b7c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 6 05:27:26 localhost systemd[1]: Started libpod-conmon-3f800ab5da0de0310cabcd9b3ace072c00abcdbf804a784b3029f44890d0be0f.scope.
Dec 6 05:27:26 localhost podman[263987]: 2025-12-06 10:27:26.796573225 +0000 UTC m=+0.051420054 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 6 05:27:26 localhost systemd[1]: Started libcrun container.
Dec 6 05:27:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cd8ff46ca5e7b86cb9023d8f40c94697ecc27b023968721fd5b1044e55290f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 6 05:27:26 localhost podman[263987]: 2025-12-06 10:27:26.928117286 +0000 UTC m=+0.182964105 container init 3f800ab5da0de0310cabcd9b3ace072c00abcdbf804a784b3029f44890d0be0f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0406981-d284-4d73-b531-593dfcba6b7c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 6 05:27:26 localhost podman[263987]: 2025-12-06 10:27:26.93861083 +0000 UTC m=+0.193457649 container start 3f800ab5da0de0310cabcd9b3ace072c00abcdbf804a784b3029f44890d0be0f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0406981-d284-4d73-b531-593dfcba6b7c, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 6 05:27:26 localhost dnsmasq[264006]: started, version 2.85 cachesize 150
Dec 6 05:27:26 localhost dnsmasq[264006]: DNS service limited to local subnets
Dec 6 05:27:26 localhost dnsmasq[264006]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 6 05:27:26 localhost dnsmasq[264006]: warning: no upstream servers configured
Dec 6 05:27:26 localhost dnsmasq[264006]: read /var/lib/neutron/dhcp/f0406981-d284-4d73-b531-593dfcba6b7c/addn_hosts - 0 addresses
Dec 6 05:27:27 localhost kernel: device tap1f25a9ec-ae left promiscuous mode
Dec 6 05:27:27 localhost nova_compute[237281]: 2025-12-06 10:27:27.067 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:27 localhost nova_compute[237281]: 2025-12-06 10:27:27.084 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3972 DF PROTO=TCP SPT=59696 DPT=9102 SEQ=1362246866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE203070000000001030307)
Dec 6 05:27:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21510 DF PROTO=TCP SPT=56394 DPT=9102 SEQ=1756271886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE207870000000001030307)
Dec 6 05:27:28 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:28.463 219384 INFO neutron.agent.dhcp.agent [None req-898ae099-0b38-49bb-8a62-5009ad4a1eff - - - - - -] DHCP configuration for ports {'6d4262e7-4af2-424a-9e2b-b1902e1f1059'} is completed#033[00m
Dec 6 05:27:29 localhost nova_compute[237281]: 2025-12-06 10:27:29.314 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3973 DF PROTO=TCP SPT=59696 DPT=9102 SEQ=1362246866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE212C70000000001030307)
Dec 6 05:27:31 localhost dnsmasq[264006]: read /var/lib/neutron/dhcp/f0406981-d284-4d73-b531-593dfcba6b7c/addn_hosts - 0 addresses
Dec 6 05:27:31 localhost podman[264026]: 2025-12-06 10:27:31.377965727 +0000 UTC m=+0.071597596 container kill 3f800ab5da0de0310cabcd9b3ace072c00abcdbf804a784b3029f44890d0be0f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0406981-d284-4d73-b531-593dfcba6b7c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 6 05:27:31 localhost systemd[1]: tmp-crun.s8i3OU.mount: Deactivated successfully.
Dec 6 05:27:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent [-] Unable to reload_allocations dhcp for f0406981-d284-4d73-b531-593dfcba6b7c.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap1f25a9ec-ae not found in namespace qdhcp-f0406981-d284-4d73-b531-593dfcba6b7c.
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     return fut.result()
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     raise self._exception
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap1f25a9ec-ae not found in namespace qdhcp-f0406981-d284-4d73-b531-593dfcba6b7c.
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.420 219384 ERROR neutron.agent.dhcp.agent #033[00m
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.423 219384 INFO neutron.agent.dhcp.agent [None req-7ac8dc4e-0073-4b03-ba49-f59d1a486de0 - - - - - -] Synchronizing state#033[00m
Dec 6 05:27:31 localhost podman[264040]: 2025-12-06 10:27:31.508929501 +0000 UTC m=+0.099851737 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 6 05:27:31 localhost podman[264040]: 2025-12-06 10:27:31.541115602 +0000 UTC m=+0.132037878 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 6 05:27:31 localhost nova_compute[237281]: 2025-12-06 10:27:31.548 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:27:31 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully.
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.815 219384 INFO neutron.agent.dhcp.agent [None req-2771f3bc-ed31-4e04-a240-dcbb6e1d365b - - - - - -] All active networks have been fetched through RPC.#033[00m
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.816 219384 INFO neutron.agent.dhcp.agent [-] Starting network 845aa200-4a3e-4edd-8098-51f7b3728ba4 dhcp configuration#033[00m
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.816 219384 INFO neutron.agent.dhcp.agent [-] Finished network 845aa200-4a3e-4edd-8098-51f7b3728ba4 dhcp configuration#033[00m
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.816 219384 INFO neutron.agent.dhcp.agent [-] Starting network f0406981-d284-4d73-b531-593dfcba6b7c dhcp configuration#033[00m
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.817 219384 INFO neutron.agent.dhcp.agent [-] Finished network f0406981-d284-4d73-b531-593dfcba6b7c dhcp configuration#033[00m
Dec 6 05:27:31 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:31.817 219384 INFO neutron.agent.dhcp.agent [None req-2771f3bc-ed31-4e04-a240-dcbb6e1d365b - - - - - -] Synchronizing state complete#033[00m
Dec 6 05:27:32 localhost sshd[264088]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 05:27:32 localhost dnsmasq[264006]: exiting on receipt of SIGTERM
Dec 6 05:27:32 localhost podman[264082]: 2025-12-06 10:27:32.085945213 +0000 UTC m=+0.069816571 container kill 3f800ab5da0de0310cabcd9b3ace072c00abcdbf804a784b3029f44890d0be0f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0406981-d284-4d73-b531-593dfcba6b7c, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 6 05:27:32 localhost systemd[1]: libpod-3f800ab5da0de0310cabcd9b3ace072c00abcdbf804a784b3029f44890d0be0f.scope: Deactivated successfully.
Dec 6 05:27:32 localhost podman[264096]: 2025-12-06 10:27:32.157274839 +0000 UTC m=+0.054986485 container died 3f800ab5da0de0310cabcd9b3ace072c00abcdbf804a784b3029f44890d0be0f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0406981-d284-4d73-b531-593dfcba6b7c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 6 05:27:32 localhost podman[264096]: 2025-12-06 10:27:32.189022527 +0000 UTC m=+0.086734153 container cleanup 3f800ab5da0de0310cabcd9b3ace072c00abcdbf804a784b3029f44890d0be0f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0406981-d284-4d73-b531-593dfcba6b7c, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 6 05:27:32 localhost systemd[1]: libpod-conmon-3f800ab5da0de0310cabcd9b3ace072c00abcdbf804a784b3029f44890d0be0f.scope: Deactivated successfully.
Dec 6 05:27:32 localhost podman[264098]: 2025-12-06 10:27:32.244893638 +0000 UTC m=+0.133511873 container remove 3f800ab5da0de0310cabcd9b3ace072c00abcdbf804a784b3029f44890d0be0f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f0406981-d284-4d73-b531-593dfcba6b7c, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:27:32 localhost systemd[1]: var-lib-containers-storage-overlay-1cd8ff46ca5e7b86cb9023d8f40c94697ecc27b023968721fd5b1044e55290f2-merged.mount: Deactivated successfully.
Dec 6 05:27:32 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f800ab5da0de0310cabcd9b3ace072c00abcdbf804a784b3029f44890d0be0f-userdata-shm.mount: Deactivated successfully.
Dec 6 05:27:32 localhost systemd[1]: run-netns-qdhcp\x2df0406981\x2dd284\x2d4d73\x2db531\x2d593dfcba6b7c.mount: Deactivated successfully.
Dec 6 05:27:32 localhost ovn_controller[131684]: 2025-12-06T10:27:32Z|00452|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0)
Dec 6 05:27:32 localhost nova_compute[237281]: 2025-12-06 10:27:32.796 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:34 localhost nova_compute[237281]: 2025-12-06 10:27:34.112 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:34 localhost nova_compute[237281]: 2025-12-06 10:27:34.315 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:35 localhost nova_compute[237281]: 2025-12-06 10:27:35.452 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:36 localhost nova_compute[237281]: 2025-12-06 10:27:36.594 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:36 localhost neutron_sriov_agent[212548]: 2025-12-06 10:27:36.596 2 INFO neutron.agent.securitygroups_rpc [None req-a25fd47d-ebe6-4e39-aeb3-1d23d33055bf f255b59cfdc54c808094207c037d327f 1d62107f03194685a0d4a3a8f59ce292 - - default default] Security group member updated ['79b7e845-f7b6-4058-9d43-847a6176d91c']
Dec 6 05:27:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.
Dec 6 05:27:38 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:38.557 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:27:36Z, description=, device_id=85533ed3-4a77-463e-9c98-9991240fbd6a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=31cc7aa3-6b9f-4f92-8702-89c02a173f1a, ip_allocation=immediate, mac_address=fa:16:3e:e6:be:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:27:09Z, description=, dns_domain=, id=d2b21c6f-0ea5-46e8-8536-0b3264253aa0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingNegativeTest-1384243426-network, port_security_enabled=True, project_id=08bd06af2dd148eaa3f0a5c4a8d7f98c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33715, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2394, status=ACTIVE, subnets=['d5bde475-cd32-4579-a5f3-9fedb4bab426'], tags=[], tenant_id=08bd06af2dd148eaa3f0a5c4a8d7f98c, updated_at=2025-12-06T10:27:17Z, vlan_transparent=None, network_id=d2b21c6f-0ea5-46e8-8536-0b3264253aa0, port_security_enabled=False, project_id=08bd06af2dd148eaa3f0a5c4a8d7f98c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2446, status=DOWN, tags=[], tenant_id=08bd06af2dd148eaa3f0a5c4a8d7f98c, updated_at=2025-12-06T10:27:37Z on network d2b21c6f-0ea5-46e8-8536-0b3264253aa0
Dec 6 05:27:38 localhost podman[264128]: 2025-12-06 10:27:38.570788159 +0000 UTC m=+0.098369680 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:27:38 localhost podman[264128]: 2025-12-06 10:27:38.656283893 +0000 UTC m=+0.183865354 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:27:38 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully.
Dec 6 05:27:38 localhost dnsmasq[263903]: read /var/lib/neutron/dhcp/d2b21c6f-0ea5-46e8-8536-0b3264253aa0/addn_hosts - 1 addresses
Dec 6 05:27:38 localhost podman[264171]: 2025-12-06 10:27:38.846958305 +0000 UTC m=+0.064337802 container kill 7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d2b21c6f-0ea5-46e8-8536-0b3264253aa0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 6 05:27:38 localhost dnsmasq-dhcp[263903]: read /var/lib/neutron/dhcp/d2b21c6f-0ea5-46e8-8536-0b3264253aa0/host
Dec 6 05:27:38 localhost dnsmasq-dhcp[263903]: read /var/lib/neutron/dhcp/d2b21c6f-0ea5-46e8-8536-0b3264253aa0/opts
Dec 6 05:27:39 localhost nova_compute[237281]: 2025-12-06 10:27:39.317 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3974 DF PROTO=TCP SPT=59696 DPT=9102 SEQ=1362246866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE233870000000001030307)
Dec 6 05:27:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.
Dec 6 05:27:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.
Dec 6 05:27:41 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:41.557 219384 INFO neutron.agent.dhcp.agent [None req-bed7a3bc-c7d9-45d8-8900-5c0703fe4ab3 - - - - - -] DHCP configuration for ports {'31cc7aa3-6b9f-4f92-8702-89c02a173f1a'} is completed
Dec 6 05:27:41 localhost podman[264191]: 2025-12-06 10:27:41.571344275 +0000 UTC m=+0.089900971 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 6 05:27:41 localhost podman[264191]: 2025-12-06 10:27:41.581414464 +0000 UTC m=+0.099971160 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Dec 6 05:27:41 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully.
Dec 6 05:27:41 localhost nova_compute[237281]: 2025-12-06 10:27:41.599 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:41 localhost podman[264192]: 2025-12-06 10:27:41.679257288 +0000 UTC m=+0.196027219 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 6 05:27:41 localhost podman[264192]: 2025-12-06 10:27:41.694260989 +0000 UTC m=+0.211030950 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true)
Dec 6 05:27:41 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully.
Dec 6 05:27:41 localhost neutron_sriov_agent[212548]: 2025-12-06 10:27:41.725 2 INFO neutron.agent.securitygroups_rpc [None req-a23ef962-effb-4815-b0aa-49326e13e3d9 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['c244e805-2851-4e0d-b369-e9badcfb0028']
Dec 6 05:27:44 localhost nova_compute[237281]: 2025-12-06 10:27:44.320 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:44 localhost neutron_sriov_agent[212548]: 2025-12-06 10:27:44.424 2 INFO neutron.agent.securitygroups_rpc [None req-5be7f6bc-ece9-416f-b366-6009b0cc096d 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['c244e805-2851-4e0d-b369-e9badcfb0028']
Dec 6 05:27:44 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:44.635 219384 INFO neutron.agent.linux.ip_lib [None req-98a1dd1c-1769-4d6c-88e9-466eea2357df - - - - - -] Device tap1b48f4e5-98 cannot be used as it has no MAC address
Dec 6 05:27:44 localhost nova_compute[237281]: 2025-12-06 10:27:44.669 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:44 localhost kernel: device tap1b48f4e5-98 entered promiscuous mode
Dec 6 05:27:44 localhost ovn_controller[131684]: 2025-12-06T10:27:44Z|00453|binding|INFO|Claiming lport 1b48f4e5-985c-4b80-a240-4e134742d5e5 for this chassis.
Dec 6 05:27:44 localhost ovn_controller[131684]: 2025-12-06T10:27:44Z|00454|binding|INFO|1b48f4e5-985c-4b80-a240-4e134742d5e5: Claiming unknown
Dec 6 05:27:44 localhost nova_compute[237281]: 2025-12-06 10:27:44.681 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:44 localhost NetworkManager[5965]: [1765016864.6835] manager: (tap1b48f4e5-98): new Generic device (/org/freedesktop/NetworkManager/Devices/75)
Dec 6 05:27:44 localhost systemd-udevd[264242]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 05:27:44 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:44.696 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-554f8bfc-4948-4da6-b83f-b76a7fcdb636', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-554f8bfc-4948-4da6-b83f-b76a7fcdb636', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '077494e5b56a49e2a7c273a073b20032', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2df9c0e2-a153-40e2-8e68-ef3878ced8bb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1b48f4e5-985c-4b80-a240-4e134742d5e5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 6 05:27:44 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:44.698 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 1b48f4e5-985c-4b80-a240-4e134742d5e5 in datapath 554f8bfc-4948-4da6-b83f-b76a7fcdb636 bound to our chassis
Dec 6 05:27:44 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:44.700 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 554f8bfc-4948-4da6-b83f-b76a7fcdb636 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 6 05:27:44 localhost nova_compute[237281]: 2025-12-06 10:27:44.704 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:44 localhost ovn_metadata_agent[137254]: 2025-12-06 10:27:44.702 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae69ce8-1dd7-4986-895d-57abac60cd20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 6 05:27:44 localhost ovn_controller[131684]: 2025-12-06T10:27:44Z|00455|binding|INFO|Setting lport 1b48f4e5-985c-4b80-a240-4e134742d5e5 ovn-installed in OVS
Dec 6 05:27:44 localhost ovn_controller[131684]: 2025-12-06T10:27:44Z|00456|binding|INFO|Setting lport 1b48f4e5-985c-4b80-a240-4e134742d5e5 up in Southbound
Dec 6 05:27:44 localhost nova_compute[237281]: 2025-12-06 10:27:44.725 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:44 localhost nova_compute[237281]: 2025-12-06 10:27:44.778 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:44 localhost nova_compute[237281]: 2025-12-06 10:27:44.814 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:45 localhost podman[264299]:
Dec 6 05:27:45 localhost podman[264299]: 2025-12-06 10:27:45.809156085 +0000 UTC m=+0.096663597 container create 456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-554f8bfc-4948-4da6-b83f-b76a7fcdb636, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 6 05:27:45 localhost systemd[1]: Started libpod-conmon-456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63.scope.
Dec 6 05:27:45 localhost podman[264299]: 2025-12-06 10:27:45.762713196 +0000 UTC m=+0.050220738 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 6 05:27:45 localhost systemd[1]: Started libcrun container.
Dec 6 05:27:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c23eee3e52cdd787824232e8d45875c6a97192fab7ad5256fc432da3a710dd2a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 6 05:27:45 localhost podman[264299]: 2025-12-06 10:27:45.889359246 +0000 UTC m=+0.176866758 container init 456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-554f8bfc-4948-4da6-b83f-b76a7fcdb636, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 6 05:27:45 localhost podman[264299]: 2025-12-06 10:27:45.899014753 +0000 UTC m=+0.186522295 container start 456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-554f8bfc-4948-4da6-b83f-b76a7fcdb636, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 6 05:27:45 localhost dnsmasq[264317]: started, version 2.85 cachesize 150
Dec 6 05:27:45 localhost dnsmasq[264317]: DNS service limited to local subnets
Dec 6 05:27:45 localhost dnsmasq[264317]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 6 05:27:45 localhost dnsmasq[264317]: warning: no upstream servers configured
Dec 6 05:27:45 localhost dnsmasq-dhcp[264317]: DHCPv6, static leases only on 2001:db8:2::, lease time 1d
Dec 6 05:27:45 localhost dnsmasq[264317]: read /var/lib/neutron/dhcp/554f8bfc-4948-4da6-b83f-b76a7fcdb636/addn_hosts - 0 addresses
Dec 6 05:27:45 localhost dnsmasq-dhcp[264317]: read /var/lib/neutron/dhcp/554f8bfc-4948-4da6-b83f-b76a7fcdb636/host
Dec 6 05:27:45 localhost dnsmasq-dhcp[264317]: read /var/lib/neutron/dhcp/554f8bfc-4948-4da6-b83f-b76a7fcdb636/opts
Dec 6 05:27:45 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:45.975 219384 INFO neutron.agent.dhcp.agent [None req-98a1dd1c-1769-4d6c-88e9-466eea2357df - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:27:44Z, description=, device_id=00956df1-b6b8-4bd0-a930-d2499479c932, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4fe0c3bd-1610-4d53-86ec-930150d5a0f8, ip_allocation=immediate, mac_address=fa:16:3e:bd:b9:aa, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:27:36Z, description=, dns_domain=, id=554f8bfc-4948-4da6-b83f-b76a7fcdb636, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-281576504, port_security_enabled=True, project_id=077494e5b56a49e2a7c273a073b20032, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40215, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2444, status=ACTIVE, subnets=['a6621ec1-5a36-4512-bf14-2b00a5b4dfea'], tags=[], tenant_id=077494e5b56a49e2a7c273a073b20032, updated_at=2025-12-06T10:27:43Z, vlan_transparent=None, network_id=554f8bfc-4948-4da6-b83f-b76a7fcdb636, port_security_enabled=False, project_id=077494e5b56a49e2a7c273a073b20032, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2453, status=DOWN, tags=[], tenant_id=077494e5b56a49e2a7c273a073b20032, updated_at=2025-12-06T10:27:45Z on network 554f8bfc-4948-4da6-b83f-b76a7fcdb636
Dec 6 05:27:46 localhost openstack_network_exporter[199751]: ERROR 10:27:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 6 05:27:46 localhost openstack_network_exporter[199751]: ERROR 10:27:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 6 05:27:46 localhost openstack_network_exporter[199751]: ERROR 10:27:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 6 05:27:46 localhost openstack_network_exporter[199751]: ERROR 10:27:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 6 05:27:46 localhost openstack_network_exporter[199751]:
Dec 6 05:27:46 localhost openstack_network_exporter[199751]: ERROR 10:27:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 6 05:27:46 localhost openstack_network_exporter[199751]:
Dec 6 05:27:46 localhost dnsmasq[264317]: read /var/lib/neutron/dhcp/554f8bfc-4948-4da6-b83f-b76a7fcdb636/addn_hosts - 1 addresses
Dec 6 05:27:46 localhost dnsmasq-dhcp[264317]: read /var/lib/neutron/dhcp/554f8bfc-4948-4da6-b83f-b76a7fcdb636/host
Dec 6 05:27:46 localhost podman[264336]: 2025-12-06 10:27:46.198954961 +0000 UTC m=+0.077599451 container kill 456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-554f8bfc-4948-4da6-b83f-b76a7fcdb636, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 6 05:27:46 localhost dnsmasq-dhcp[264317]: read /var/lib/neutron/dhcp/554f8bfc-4948-4da6-b83f-b76a7fcdb636/opts
Dec 6 05:27:46 localhost nova_compute[237281]: 2025-12-06 10:27:46.622 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:27:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.
Dec 6 05:27:46 localhost podman[264357]: 2025-12-06 10:27:46.807548816 +0000 UTC m=+0.085587788 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec 6 05:27:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.
Dec 6 05:27:46 localhost podman[264357]: 2025-12-06 10:27:46.842426829 +0000 UTC m=+0.120465821 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0) Dec 6 05:27:46 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:27:46 localhost podman[264376]: 2025-12-06 10:27:46.936106435 +0000 UTC m=+0.085024250 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=edpm, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 05:27:46 localhost podman[264376]: 2025-12-06 10:27:46.957296537 +0000 UTC m=+0.106214352 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3) Dec 6 05:27:46 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:27:47 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:47.302 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:27:36Z, description=, device_id=85533ed3-4a77-463e-9c98-9991240fbd6a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=31cc7aa3-6b9f-4f92-8702-89c02a173f1a, ip_allocation=immediate, mac_address=fa:16:3e:e6:be:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:27:09Z, description=, dns_domain=, id=d2b21c6f-0ea5-46e8-8536-0b3264253aa0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingNegativeTest-1384243426-network, port_security_enabled=True, project_id=08bd06af2dd148eaa3f0a5c4a8d7f98c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33715, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2394, status=ACTIVE, subnets=['d5bde475-cd32-4579-a5f3-9fedb4bab426'], tags=[], 
tenant_id=08bd06af2dd148eaa3f0a5c4a8d7f98c, updated_at=2025-12-06T10:27:17Z, vlan_transparent=None, network_id=d2b21c6f-0ea5-46e8-8536-0b3264253aa0, port_security_enabled=False, project_id=08bd06af2dd148eaa3f0a5c4a8d7f98c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2446, status=DOWN, tags=[], tenant_id=08bd06af2dd148eaa3f0a5c4a8d7f98c, updated_at=2025-12-06T10:27:37Z on network d2b21c6f-0ea5-46e8-8536-0b3264253aa0#033[00m Dec 6 05:27:47 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:47.567 219384 INFO neutron.agent.dhcp.agent [None req-a95d92ca-92f6-42a1-8d8a-e293218810c3 - - - - - -] DHCP configuration for ports {'cb96b1f6-c475-44aa-8d18-a06420a76643'} is completed#033[00m Dec 6 05:27:47 localhost podman[264414]: 2025-12-06 10:27:47.728951424 +0000 UTC m=+0.064965842 container kill 7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d2b21c6f-0ea5-46e8-8536-0b3264253aa0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:27:47 localhost dnsmasq[263903]: read /var/lib/neutron/dhcp/d2b21c6f-0ea5-46e8-8536-0b3264253aa0/addn_hosts - 1 addresses Dec 6 05:27:47 localhost dnsmasq-dhcp[263903]: read /var/lib/neutron/dhcp/d2b21c6f-0ea5-46e8-8536-0b3264253aa0/host Dec 6 05:27:47 localhost dnsmasq-dhcp[263903]: read /var/lib/neutron/dhcp/d2b21c6f-0ea5-46e8-8536-0b3264253aa0/opts Dec 6 05:27:47 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:47.763 219384 INFO neutron.agent.dhcp.agent [None req-31bad2ea-6008-4865-866d-68d6e23696d6 - - - - - -] DHCP configuration for 
ports {'4fe0c3bd-1610-4d53-86ec-930150d5a0f8'} is completed#033[00m Dec 6 05:27:49 localhost nova_compute[237281]: 2025-12-06 10:27:49.323 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:49 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:49.528 219384 INFO neutron.agent.dhcp.agent [None req-313771e6-2570-4c1d-ae49-21218ec1c42e - - - - - -] DHCP configuration for ports {'31cc7aa3-6b9f-4f92-8702-89c02a173f1a'} is completed#033[00m Dec 6 05:27:50 localhost neutron_sriov_agent[212548]: 2025-12-06 10:27:50.943 2 INFO neutron.agent.securitygroups_rpc [None req-739332fc-990b-47ac-b745-38fee9e879db 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['99a3b4be-dcaf-47c6-9a48-00a7eb6a06f4']#033[00m Dec 6 05:27:51 localhost nova_compute[237281]: 2025-12-06 10:27:51.637 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:52 localhost neutron_sriov_agent[212548]: 2025-12-06 10:27:52.270 2 INFO neutron.agent.securitygroups_rpc [None req-071380e2-26df-4864-be6f-020b13166ab6 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['99a3b4be-dcaf-47c6-9a48-00a7eb6a06f4']#033[00m Dec 6 05:27:52 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:52.291 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:27:44Z, description=, device_id=00956df1-b6b8-4bd0-a930-d2499479c932, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4fe0c3bd-1610-4d53-86ec-930150d5a0f8, ip_allocation=immediate, 
mac_address=fa:16:3e:bd:b9:aa, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:27:36Z, description=, dns_domain=, id=554f8bfc-4948-4da6-b83f-b76a7fcdb636, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-281576504, port_security_enabled=True, project_id=077494e5b56a49e2a7c273a073b20032, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40215, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2444, status=ACTIVE, subnets=['a6621ec1-5a36-4512-bf14-2b00a5b4dfea'], tags=[], tenant_id=077494e5b56a49e2a7c273a073b20032, updated_at=2025-12-06T10:27:43Z, vlan_transparent=None, network_id=554f8bfc-4948-4da6-b83f-b76a7fcdb636, port_security_enabled=False, project_id=077494e5b56a49e2a7c273a073b20032, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2453, status=DOWN, tags=[], tenant_id=077494e5b56a49e2a7c273a073b20032, updated_at=2025-12-06T10:27:45Z on network 554f8bfc-4948-4da6-b83f-b76a7fcdb636#033[00m Dec 6 05:27:52 localhost dnsmasq[264317]: read /var/lib/neutron/dhcp/554f8bfc-4948-4da6-b83f-b76a7fcdb636/addn_hosts - 1 addresses Dec 6 05:27:52 localhost dnsmasq-dhcp[264317]: read /var/lib/neutron/dhcp/554f8bfc-4948-4da6-b83f-b76a7fcdb636/host Dec 6 05:27:52 localhost dnsmasq-dhcp[264317]: read /var/lib/neutron/dhcp/554f8bfc-4948-4da6-b83f-b76a7fcdb636/opts Dec 6 05:27:52 localhost podman[264456]: 2025-12-06 10:27:52.489230197 +0000 UTC m=+0.048189185 container kill 456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-554f8bfc-4948-4da6-b83f-b76a7fcdb636, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:27:53 localhost podman[197801]: time="2025-12-06T10:27:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:27:53 localhost podman[197801]: @ - - [06/Dec/2025:10:27:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149609 "" "Go-http-client/1.1" Dec 6 05:27:53 localhost podman[197801]: @ - - [06/Dec/2025:10:27:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17400 "" "Go-http-client/1.1" Dec 6 05:27:53 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:27:53.535 219384 INFO neutron.agent.dhcp.agent [None req-2dc06953-ebd0-4274-a211-dc7126bbc9dc - - - - - -] DHCP configuration for ports {'4fe0c3bd-1610-4d53-86ec-930150d5a0f8'} is completed#033[00m Dec 6 05:27:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44659 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=761886991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE26C280000000001030307) Dec 6 05:27:54 localhost nova_compute[237281]: 2025-12-06 10:27:54.326 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44660 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=761886991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE270470000000001030307) Dec 6 05:27:55 localhost nova_compute[237281]: 2025-12-06 
10:27:55.410 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:27:55 localhost nova_compute[237281]: 2025-12-06 10:27:55.411 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:27:55 localhost nova_compute[237281]: 2025-12-06 10:27:55.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:27:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3975 DF PROTO=TCP SPT=59696 DPT=9102 SEQ=1362246866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE273870000000001030307) Dec 6 05:27:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:27:56 localhost podman[264477]: 2025-12-06 10:27:56.544859438 +0000 UTC m=+0.080528441 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41) Dec 6 05:27:56 localhost podman[264477]: 2025-12-06 10:27:56.556403914 +0000 UTC m=+0.092072947 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.) Dec 6 05:27:56 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:27:56 localhost nova_compute[237281]: 2025-12-06 10:27:56.657 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44661 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=761886991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE278480000000001030307) Dec 6 05:27:57 localhost neutron_sriov_agent[212548]: 2025-12-06 10:27:57.692 2 INFO neutron.agent.securitygroups_rpc [None req-39a272a3-1bf3-4308-a59c-a33ac340f8ba 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['99a3b4be-dcaf-47c6-9a48-00a7eb6a06f4']#033[00m Dec 6 05:27:57 localhost nova_compute[237281]: 2025-12-06 10:27:57.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:27:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58536 DF PROTO=TCP SPT=32936 DPT=9102 SEQ=3636148708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE27B870000000001030307) Dec 6 05:27:59 localhost nova_compute[237281]: 2025-12-06 10:27:59.330 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:27:59 localhost nova_compute[237281]: 2025-12-06 10:27:59.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:27:59 localhost nova_compute[237281]: 2025-12-06 10:27:59.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:27:59 localhost nova_compute[237281]: 2025-12-06 10:27:59.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:28:00 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:00.050 2 INFO neutron.agent.securitygroups_rpc [None req-5e9ab218-d3ea-4bee-a558-0ad37af2f8d2 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['99a3b4be-dcaf-47c6-9a48-00a7eb6a06f4']#033[00m Dec 6 05:28:00 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:00.862 2 INFO neutron.agent.securitygroups_rpc [None req-6982f5a9-ebbc-4c00-9f27-3f370407f85d 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['99a3b4be-dcaf-47c6-9a48-00a7eb6a06f4']#033[00m Dec 6 05:28:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44662 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=761886991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE288070000000001030307) Dec 6 05:28:01 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:01.477 2 INFO neutron.agent.securitygroups_rpc [None req-de162a29-d1b9-4afd-b16a-afea8cd50b17 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated 
['99a3b4be-dcaf-47c6-9a48-00a7eb6a06f4']#033[00m Dec 6 05:28:01 localhost dnsmasq[263903]: read /var/lib/neutron/dhcp/d2b21c6f-0ea5-46e8-8536-0b3264253aa0/addn_hosts - 0 addresses Dec 6 05:28:01 localhost dnsmasq-dhcp[263903]: read /var/lib/neutron/dhcp/d2b21c6f-0ea5-46e8-8536-0b3264253aa0/host Dec 6 05:28:01 localhost podman[264513]: 2025-12-06 10:28:01.542899422 +0000 UTC m=+0.062309541 container kill 7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d2b21c6f-0ea5-46e8-8536-0b3264253aa0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:28:01 localhost dnsmasq-dhcp[263903]: read /var/lib/neutron/dhcp/d2b21c6f-0ea5-46e8-8536-0b3264253aa0/opts Dec 6 05:28:01 localhost nova_compute[237281]: 2025-12-06 10:28:01.702 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:01 localhost ovn_controller[131684]: 2025-12-06T10:28:01Z|00457|binding|INFO|Releasing lport 59992555-4bbe-435e-9eaf-0d00fa326c19 from this chassis (sb_readonly=0) Dec 6 05:28:01 localhost kernel: device tap59992555-4b left promiscuous mode Dec 6 05:28:01 localhost ovn_controller[131684]: 2025-12-06T10:28:01Z|00458|binding|INFO|Setting lport 59992555-4bbe-435e-9eaf-0d00fa326c19 down in Southbound Dec 6 05:28:01 localhost nova_compute[237281]: 2025-12-06 10:28:01.800 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:01.809 137259 DEBUG 
ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-d2b21c6f-0ea5-46e8-8536-0b3264253aa0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2b21c6f-0ea5-46e8-8536-0b3264253aa0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '08bd06af2dd148eaa3f0a5c4a8d7f98c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4d87aa46-7e01-49f3-931b-b8bb01e76d39, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=59992555-4bbe-435e-9eaf-0d00fa326c19) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:28:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:01.811 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 59992555-4bbe-435e-9eaf-0d00fa326c19 in datapath d2b21c6f-0ea5-46e8-8536-0b3264253aa0 unbound from our chassis#033[00m Dec 6 05:28:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:01.814 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d2b21c6f-0ea5-46e8-8536-0b3264253aa0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 
05:28:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:01.815 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[35bcd51d-4831-41c6-ad00-256ca498adfe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:28:01 localhost nova_compute[237281]: 2025-12-06 10:28:01.817 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:28:01 localhost nova_compute[237281]: 2025-12-06 10:28:01.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:28:01 localhost systemd[1]: tmp-crun.lTAqUW.mount: Deactivated successfully. 
Dec 6 05:28:01 localhost podman[264537]: 2025-12-06 10:28:01.921251575 +0000 UTC m=+0.088755425 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:28:01 localhost podman[264537]: 2025-12-06 10:28:01.95551183 +0000 UTC m=+0.123015710 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:28:01 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:28:02 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:02.316 219384 INFO neutron.agent.linux.ip_lib [None req-e01836ef-78ff-4d08-b100-11c08d8c49d8 - - - - - -] Device tapcc4d5c33-1e cannot be used as it has no MAC address#033[00m Dec 6 05:28:02 localhost nova_compute[237281]: 2025-12-06 10:28:02.345 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:02 localhost kernel: device tapcc4d5c33-1e entered promiscuous mode Dec 6 05:28:02 localhost ovn_controller[131684]: 2025-12-06T10:28:02Z|00459|binding|INFO|Claiming lport cc4d5c33-1e0f-4787-a0b9-ec2e8d5af721 for this chassis. Dec 6 05:28:02 localhost ovn_controller[131684]: 2025-12-06T10:28:02Z|00460|binding|INFO|cc4d5c33-1e0f-4787-a0b9-ec2e8d5af721: Claiming unknown Dec 6 05:28:02 localhost NetworkManager[5965]: [1765016882.3543] manager: (tapcc4d5c33-1e): new Generic device (/org/freedesktop/NetworkManager/Devices/76) Dec 6 05:28:02 localhost nova_compute[237281]: 2025-12-06 10:28:02.356 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:02 localhost systemd-udevd[264569]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:28:02 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:02.372 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-873d6d35-acd8-44ae-bfa2-18285967f797', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-873d6d35-acd8-44ae-bfa2-18285967f797', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d62107f03194685a0d4a3a8f59ce292', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab98f9d3-c4c1-47f7-b206-862e9571ef2d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cc4d5c33-1e0f-4787-a0b9-ec2e8d5af721) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:28:02 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:02.374 137259 INFO neutron.agent.ovn.metadata.agent [-] Port cc4d5c33-1e0f-4787-a0b9-ec2e8d5af721 in datapath 873d6d35-acd8-44ae-bfa2-18285967f797 bound to our chassis#033[00m Dec 6 05:28:02 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:02.375 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 873d6d35-acd8-44ae-bfa2-18285967f797 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:28:02 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:02.376 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[a895a4b1-6b8a-4a42-8a1d-a106fb8c9481]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:28:02 localhost ovn_controller[131684]: 2025-12-06T10:28:02Z|00461|binding|INFO|Setting lport cc4d5c33-1e0f-4787-a0b9-ec2e8d5af721 ovn-installed in OVS Dec 6 05:28:02 localhost ovn_controller[131684]: 2025-12-06T10:28:02Z|00462|binding|INFO|Setting lport cc4d5c33-1e0f-4787-a0b9-ec2e8d5af721 up in Southbound Dec 6 05:28:02 localhost journal[186952]: ethtool ioctl error on tapcc4d5c33-1e: No such device Dec 6 05:28:02 localhost nova_compute[237281]: 2025-12-06 10:28:02.392 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:02 localhost journal[186952]: ethtool ioctl error on tapcc4d5c33-1e: No such device Dec 6 05:28:02 localhost journal[186952]: ethtool ioctl error on tapcc4d5c33-1e: No such device Dec 6 05:28:02 localhost journal[186952]: ethtool ioctl error on tapcc4d5c33-1e: No such device Dec 6 05:28:02 localhost journal[186952]: ethtool ioctl error on tapcc4d5c33-1e: No such device Dec 6 05:28:02 localhost journal[186952]: ethtool ioctl error on tapcc4d5c33-1e: No such device Dec 6 05:28:02 localhost journal[186952]: ethtool ioctl error on tapcc4d5c33-1e: No such device Dec 6 05:28:02 localhost journal[186952]: ethtool ioctl error on tapcc4d5c33-1e: No such device Dec 6 05:28:02 localhost nova_compute[237281]: 2025-12-06 10:28:02.432 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:02 localhost nova_compute[237281]: 2025-12-06 10:28:02.458 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:02 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:02.875 2 INFO neutron.agent.securitygroups_rpc [None req-5b0f5db1-48f6-4c18-b56b-80e7e27893df 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['99a3b4be-dcaf-47c6-9a48-00a7eb6a06f4']#033[00m Dec 6 05:28:02 localhost nova_compute[237281]: 2025-12-06 10:28:02.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:28:02 localhost nova_compute[237281]: 2025-12-06 10:28:02.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:28:02 localhost nova_compute[237281]: 2025-12-06 10:28:02.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:28:03 localhost podman[264640]: Dec 6 05:28:03 localhost podman[264640]: 2025-12-06 10:28:03.410492622 +0000 UTC m=+0.105328565 container create d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-873d6d35-acd8-44ae-bfa2-18285967f797, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
tcib_managed=true) Dec 6 05:28:03 localhost systemd[1]: Started libpod-conmon-d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f.scope. Dec 6 05:28:03 localhost podman[264640]: 2025-12-06 10:28:03.360594666 +0000 UTC m=+0.055430619 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:28:03 localhost systemd[1]: tmp-crun.70tCYK.mount: Deactivated successfully. Dec 6 05:28:03 localhost systemd[1]: Started libcrun container. Dec 6 05:28:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8b860880b2ccff72e53885ed639178c53299f6f77e9737429d9e1e87dbdbbbc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:28:03 localhost nova_compute[237281]: 2025-12-06 10:28:03.500 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:28:03 localhost nova_compute[237281]: 2025-12-06 10:28:03.500 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:28:03 localhost nova_compute[237281]: 2025-12-06 10:28:03.501 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:28:03 localhost nova_compute[237281]: 2025-12-06 10:28:03.501 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:28:03 localhost podman[264640]: 2025-12-06 10:28:03.512225856 +0000 UTC m=+0.207061799 container init d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-873d6d35-acd8-44ae-bfa2-18285967f797, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:28:03 localhost podman[264640]: 2025-12-06 10:28:03.521550532 +0000 UTC m=+0.216386485 container start d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-873d6d35-acd8-44ae-bfa2-18285967f797, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:28:03 localhost dnsmasq[264658]: started, version 2.85 cachesize 150 Dec 6 05:28:03 localhost dnsmasq[264658]: DNS service limited to local subnets Dec 6 05:28:03 localhost dnsmasq[264658]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:28:03 localhost dnsmasq[264658]: warning: no upstream servers configured Dec 6 05:28:03 localhost dnsmasq-dhcp[264658]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:28:03 localhost dnsmasq[264658]: read 
/var/lib/neutron/dhcp/873d6d35-acd8-44ae-bfa2-18285967f797/addn_hosts - 0 addresses Dec 6 05:28:03 localhost dnsmasq-dhcp[264658]: read /var/lib/neutron/dhcp/873d6d35-acd8-44ae-bfa2-18285967f797/host Dec 6 05:28:03 localhost dnsmasq-dhcp[264658]: read /var/lib/neutron/dhcp/873d6d35-acd8-44ae-bfa2-18285967f797/opts Dec 6 05:28:03 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:03.976 219384 INFO neutron.agent.dhcp.agent [None req-41eb46ea-0d83-41bb-a3f5-3eeda9a7b099 - - - - - -] DHCP configuration for ports {'d94997bc-a72f-41b4-b179-b4eac4402f8f'} is completed#033[00m Dec 6 05:28:04 localhost nova_compute[237281]: 2025-12-06 10:28:04.333 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:04 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:04.555 2 INFO neutron.agent.securitygroups_rpc [None req-94b2963c-16a8-4097-9094-81ef4a8d2e55 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['99a3b4be-dcaf-47c6-9a48-00a7eb6a06f4']#033[00m Dec 6 05:28:06 localhost nova_compute[237281]: 2025-12-06 10:28:06.106 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:06.711 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:28:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:06.713 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 
05:28:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:06.714 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:28:06 localhost nova_compute[237281]: 2025-12-06 10:28:06.730 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:07 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:07.045 2 INFO neutron.agent.securitygroups_rpc [None req-ae4cc6e2-bda1-4062-8e39-f90cd113b9a4 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['99a3b4be-dcaf-47c6-9a48-00a7eb6a06f4']#033[00m Dec 6 05:28:07 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:07.673 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:28:07Z, description=, device_id=ee8a63ca-324a-46ce-9f96-402eff4ee1d0, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ed8eaf13-d859-4af5-ae30-5f784f408806, ip_allocation=immediate, mac_address=fa:16:3e:0d:5c:9c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:27:51Z, description=, dns_domain=, id=873d6d35-acd8-44ae-bfa2-18285967f797, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-752450543, port_security_enabled=True, project_id=1d62107f03194685a0d4a3a8f59ce292, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=5256, qos_policy_id=None, revision_number=2, 
router:external=False, shared=False, standard_attr_id=2461, status=ACTIVE, subnets=['ee7655a1-531c-4609-a5fe-cafd36b003a7'], tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:27:59Z, vlan_transparent=None, network_id=873d6d35-acd8-44ae-bfa2-18285967f797, port_security_enabled=False, project_id=1d62107f03194685a0d4a3a8f59ce292, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2479, status=DOWN, tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:28:07Z on network 873d6d35-acd8-44ae-bfa2-18285967f797#033[00m Dec 6 05:28:07 localhost podman[264676]: 2025-12-06 10:28:07.885947142 +0000 UTC m=+0.055980205 container kill d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-873d6d35-acd8-44ae-bfa2-18285967f797, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 6 05:28:07 localhost dnsmasq[264658]: read /var/lib/neutron/dhcp/873d6d35-acd8-44ae-bfa2-18285967f797/addn_hosts - 1 addresses Dec 6 05:28:07 localhost dnsmasq-dhcp[264658]: read /var/lib/neutron/dhcp/873d6d35-acd8-44ae-bfa2-18285967f797/host Dec 6 05:28:07 localhost dnsmasq-dhcp[264658]: read /var/lib/neutron/dhcp/873d6d35-acd8-44ae-bfa2-18285967f797/opts Dec 6 05:28:07 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:07.910 2 INFO neutron.agent.securitygroups_rpc [None req-f1632f5e-10df-47cb-b147-4a166e02f3db c0e85d36c6044d9697993a40206d7832 78c73d786bd44858b8138f7bd26dbb60 - - default default] Security group member updated ['5ab7045d-e25f-45c1-856b-35518613c6cf']#033[00m 
Dec 6 05:28:08 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:08.472 219384 INFO neutron.agent.dhcp.agent [None req-1a7238f6-88c5-47d8-8938-9a1faafa81f4 - - - - - -] DHCP configuration for ports {'ed8eaf13-d859-4af5-ae30-5f784f408806'} is completed#033[00m Dec 6 05:28:08 localhost ovn_controller[131684]: 2025-12-06T10:28:08Z|00463|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.508 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.543 237285 DEBUG 
oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.544 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.545 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:28:08 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:08.564 2 INFO neutron.agent.securitygroups_rpc [None req-033e00fc-a98f-4fa6-addc-752860df173a 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['99a3b4be-dcaf-47c6-9a48-00a7eb6a06f4']#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.573 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.596 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.597 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock 
"compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.597 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.598 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.691 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.765 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 
10:28:08.769 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.831 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.832 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.915 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.916 237285 DEBUG 
oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:28:08 localhost nova_compute[237281]: 2025-12-06 10:28:08.991 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:28:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44663 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=761886991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE2A7870000000001030307) Dec 6 05:28:09 localhost nova_compute[237281]: 2025-12-06 10:28:09.258 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:28:09 localhost nova_compute[237281]: 2025-12-06 10:28:09.261 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12261MB free_disk=387.26342010498047GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": 
"1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:28:09 localhost nova_compute[237281]: 2025-12-06 10:28:09.261 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:28:09 localhost nova_compute[237281]: 2025-12-06 10:28:09.262 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:28:09 localhost dnsmasq[263903]: exiting on receipt of SIGTERM Dec 6 05:28:09 localhost podman[264727]: 2025-12-06 10:28:09.332008639 +0000 UTC m=+0.054969574 container kill 7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d2b21c6f-0ea5-46e8-8536-0b3264253aa0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 6 05:28:09 localhost nova_compute[237281]: 2025-12-06 10:28:09.335 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:28:09 localhost systemd[1]: libpod-7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57.scope: Deactivated successfully. Dec 6 05:28:09 localhost nova_compute[237281]: 2025-12-06 10:28:09.370 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:28:09 localhost nova_compute[237281]: 2025-12-06 10:28:09.371 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:28:09 localhost nova_compute[237281]: 2025-12-06 10:28:09.371 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:28:09 localhost podman[264743]: 2025-12-06 10:28:09.409192166 +0000 UTC m=+0.056131030 container died 7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d2b21c6f-0ea5-46e8-8536-0b3264253aa0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:28:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57-userdata-shm.mount: Deactivated successfully. Dec 6 05:28:09 localhost systemd[1]: var-lib-containers-storage-overlay-639dfb595c9229cd85f7b1f45e73ee24ac09c1f89fbd8fa49f97cbad0c18523b-merged.mount: Deactivated successfully. Dec 6 05:28:09 localhost nova_compute[237281]: 2025-12-06 10:28:09.458 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:28:09 localhost nova_compute[237281]: 2025-12-06 10:28:09.482 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:28:09 localhost nova_compute[237281]: 2025-12-06 10:28:09.485 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:28:09 localhost nova_compute[237281]: 2025-12-06 10:28:09.486 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:28:09 localhost podman[264743]: 2025-12-06 10:28:09.515543092 +0000 UTC m=+0.162481876 container remove 7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d2b21c6f-0ea5-46e8-8536-0b3264253aa0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:28:09 localhost systemd[1]: libpod-conmon-7fb768737b3f3301c64834c3e8207699ec115a264dce01bf2be57cd91383cc57.scope: Deactivated successfully. 
Dec 6 05:28:09 localhost podman[264744]: 2025-12-06 10:28:09.598703812 +0000 UTC m=+0.239635601 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3) Dec 6 05:28:09 localhost podman[264744]: 2025-12-06 10:28:09.68272888 +0000 UTC m=+0.323660629 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:28:09 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:09.684 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:28:09 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:28:09 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:09.928 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:28:10 localhost systemd[1]: run-netns-qdhcp\x2dd2b21c6f\x2d0ea5\x2d46e8\x2d8536\x2d0b3264253aa0.mount: Deactivated successfully. Dec 6 05:28:10 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:10.897 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:28:07Z, description=, device_id=ee8a63ca-324a-46ce-9f96-402eff4ee1d0, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ed8eaf13-d859-4af5-ae30-5f784f408806, ip_allocation=immediate, mac_address=fa:16:3e:0d:5c:9c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:27:51Z, description=, dns_domain=, id=873d6d35-acd8-44ae-bfa2-18285967f797, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-752450543, port_security_enabled=True, project_id=1d62107f03194685a0d4a3a8f59ce292, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=5256, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2461, status=ACTIVE, subnets=['ee7655a1-531c-4609-a5fe-cafd36b003a7'], tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:27:59Z, vlan_transparent=None, network_id=873d6d35-acd8-44ae-bfa2-18285967f797, port_security_enabled=False, project_id=1d62107f03194685a0d4a3a8f59ce292, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2479, status=DOWN, tags=[], 
tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:28:07Z on network 873d6d35-acd8-44ae-bfa2-18285967f797#033[00m Dec 6 05:28:11 localhost podman[264805]: 2025-12-06 10:28:11.134698971 +0000 UTC m=+0.061084572 container kill d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-873d6d35-acd8-44ae-bfa2-18285967f797, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:28:11 localhost dnsmasq[264658]: read /var/lib/neutron/dhcp/873d6d35-acd8-44ae-bfa2-18285967f797/addn_hosts - 1 addresses Dec 6 05:28:11 localhost dnsmasq-dhcp[264658]: read /var/lib/neutron/dhcp/873d6d35-acd8-44ae-bfa2-18285967f797/host Dec 6 05:28:11 localhost dnsmasq-dhcp[264658]: read /var/lib/neutron/dhcp/873d6d35-acd8-44ae-bfa2-18285967f797/opts Dec 6 05:28:11 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:11.155 2 INFO neutron.agent.securitygroups_rpc [None req-68bf4f59-5ab6-4c69-99ac-93d58de4e00e c0e85d36c6044d9697993a40206d7832 78c73d786bd44858b8138f7bd26dbb60 - - default default] Security group member updated ['5ab7045d-e25f-45c1-856b-35518613c6cf']#033[00m Dec 6 05:28:11 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:11.185 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:28:11 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:11.501 219384 INFO neutron.agent.dhcp.agent [None req-9adce2db-1769-4b74-a34b-8505a013e83e - - - - - -] DHCP configuration for ports {'ed8eaf13-d859-4af5-ae30-5f784f408806'} is completed#033[00m Dec 6 05:28:11 localhost nova_compute[237281]: 2025-12-06 
10:28:11.734 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:11 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:11.779 2 INFO neutron.agent.securitygroups_rpc [None req-7db0df5f-5e39-4e53-9f9c-4c8874f51ca4 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['ceb6e7e1-fe01-4236-9a0f-2ff4945d30a2']#033[00m Dec 6 05:28:12 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:12.424 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:28:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:28:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:28:12 localhost podman[264826]: 2025-12-06 10:28:12.571514793 +0000 UTC m=+0.097972819 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:28:12 localhost podman[264826]: 2025-12-06 10:28:12.581366046 +0000 UTC m=+0.107824072 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:28:12 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:28:12 localhost podman[264827]: 2025-12-06 10:28:12.670328416 +0000 UTC m=+0.194540723 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd) Dec 6 05:28:12 localhost podman[264827]: 2025-12-06 10:28:12.682341986 +0000 UTC m=+0.206554283 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:28:12 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:28:14 localhost dnsmasq[264658]: read /var/lib/neutron/dhcp/873d6d35-acd8-44ae-bfa2-18285967f797/addn_hosts - 0 addresses Dec 6 05:28:14 localhost dnsmasq-dhcp[264658]: read /var/lib/neutron/dhcp/873d6d35-acd8-44ae-bfa2-18285967f797/host Dec 6 05:28:14 localhost dnsmasq-dhcp[264658]: read /var/lib/neutron/dhcp/873d6d35-acd8-44ae-bfa2-18285967f797/opts Dec 6 05:28:14 localhost podman[264883]: 2025-12-06 10:28:14.074470092 +0000 UTC m=+0.065950773 container kill d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-873d6d35-acd8-44ae-bfa2-18285967f797, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Dec 6 05:28:14 localhost nova_compute[237281]: 2025-12-06 10:28:14.338 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:14 localhost kernel: device tapcc4d5c33-1e left promiscuous mode Dec 6 05:28:14 localhost ovn_controller[131684]: 2025-12-06T10:28:14Z|00464|binding|INFO|Releasing lport cc4d5c33-1e0f-4787-a0b9-ec2e8d5af721 from this chassis (sb_readonly=0) Dec 6 05:28:14 localhost ovn_controller[131684]: 2025-12-06T10:28:14Z|00465|binding|INFO|Setting lport cc4d5c33-1e0f-4787-a0b9-ec2e8d5af721 down in Southbound Dec 6 05:28:14 localhost nova_compute[237281]: 2025-12-06 10:28:14.754 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:14.777 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: 
PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-873d6d35-acd8-44ae-bfa2-18285967f797', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-873d6d35-acd8-44ae-bfa2-18285967f797', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d62107f03194685a0d4a3a8f59ce292', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab98f9d3-c4c1-47f7-b206-862e9571ef2d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cc4d5c33-1e0f-4787-a0b9-ec2e8d5af721) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:28:14 localhost nova_compute[237281]: 2025-12-06 10:28:14.778 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:14.781 137259 INFO neutron.agent.ovn.metadata.agent [-] Port cc4d5c33-1e0f-4787-a0b9-ec2e8d5af721 in datapath 873d6d35-acd8-44ae-bfa2-18285967f797 unbound from our chassis#033[00m Dec 6 05:28:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:14.784 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 
873d6d35-acd8-44ae-bfa2-18285967f797, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:28:14 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:14.785 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[38a57cb7-86f6-4122-9729-9cf9f1719fa2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:28:15 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:15.023 2 INFO neutron.agent.securitygroups_rpc [None req-95b2c84c-4eee-4200-b91b-63634f20972a 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['706fd791-8d2c-46d5-9dcd-3f2daf88d26e']#033[00m Dec 6 05:28:15 localhost nova_compute[237281]: 2025-12-06 10:28:15.068 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:15 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:15.071 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:28:15 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:15.072 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:28:15 localhost neutron_sriov_agent[212548]: 2025-12-06 
10:28:15.967 2 INFO neutron.agent.securitygroups_rpc [None req-af15e778-322c-4846-b958-37f6adba63fa 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['706fd791-8d2c-46d5-9dcd-3f2daf88d26e']#033[00m Dec 6 05:28:16 localhost openstack_network_exporter[199751]: ERROR 10:28:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:28:16 localhost openstack_network_exporter[199751]: ERROR 10:28:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:28:16 localhost openstack_network_exporter[199751]: ERROR 10:28:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:28:16 localhost openstack_network_exporter[199751]: ERROR 10:28:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:28:16 localhost openstack_network_exporter[199751]: Dec 6 05:28:16 localhost openstack_network_exporter[199751]: ERROR 10:28:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:28:16 localhost openstack_network_exporter[199751]: Dec 6 05:28:16 localhost nova_compute[237281]: 2025-12-06 10:28:16.846 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:28:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:28:17 localhost podman[264907]: 2025-12-06 10:28:17.555607088 +0000 UTC m=+0.083906305 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Dec 6 05:28:17 localhost podman[264907]: 2025-12-06 10:28:17.568246277 +0000 UTC m=+0.096545504 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 05:28:17 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:28:17 localhost dnsmasq[264658]: exiting on receipt of SIGTERM Dec 6 05:28:17 localhost systemd[1]: libpod-d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f.scope: Deactivated successfully. Dec 6 05:28:17 localhost podman[264953]: 2025-12-06 10:28:17.727033958 +0000 UTC m=+0.065439806 container kill d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-873d6d35-acd8-44ae-bfa2-18285967f797, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 05:28:17 localhost podman[264908]: 2025-12-06 10:28:17.718056721 +0000 UTC m=+0.243812190 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 
'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible) Dec 6 05:28:17 localhost podman[264972]: 2025-12-06 10:28:17.798912402 +0000 UTC m=+0.060096363 container died d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-873d6d35-acd8-44ae-bfa2-18285967f797, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:28:17 localhost podman[264972]: 2025-12-06 10:28:17.841774602 +0000 UTC m=+0.102958513 container cleanup d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-873d6d35-acd8-44ae-bfa2-18285967f797, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Dec 6 05:28:17 localhost systemd[1]: libpod-conmon-d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f.scope: Deactivated successfully. 
Dec 6 05:28:17 localhost podman[264908]: 2025-12-06 10:28:17.852800461 +0000 UTC m=+0.378555970 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 6 05:28:17 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:28:17 localhost podman[264979]: 2025-12-06 10:28:17.895912729 +0000 UTC m=+0.138032812 container remove d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-873d6d35-acd8-44ae-bfa2-18285967f797, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true) Dec 6 05:28:18 localhost ovn_controller[131684]: 2025-12-06T10:28:18Z|00466|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:28:18 localhost nova_compute[237281]: 2025-12-06 10:28:18.334 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:18 localhost systemd[1]: tmp-crun.q31Z9N.mount: Deactivated successfully. Dec 6 05:28:18 localhost systemd[1]: var-lib-containers-storage-overlay-a8b860880b2ccff72e53885ed639178c53299f6f77e9737429d9e1e87dbdbbbc-merged.mount: Deactivated successfully. Dec 6 05:28:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d76e593f354397f82c407bce4767b97d23d8f656720a0fc7827eb15cd1db218f-userdata-shm.mount: Deactivated successfully. Dec 6 05:28:18 localhost systemd[1]: run-netns-qdhcp\x2d873d6d35\x2dacd8\x2d44ae\x2dbfa2\x2d18285967f797.mount: Deactivated successfully. 
Dec 6 05:28:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:18.805 219384 INFO neutron.agent.dhcp.agent [None req-eed1bffb-ff6c-4c9b-9add-5ea8647bc07a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:28:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:18.806 219384 INFO neutron.agent.dhcp.agent [None req-eed1bffb-ff6c-4c9b-9add-5ea8647bc07a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:28:19 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:19.224 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:28:19 localhost nova_compute[237281]: 2025-12-06 10:28:19.341 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:20 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:20.073 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:28:21 localhost nova_compute[237281]: 2025-12-06 10:28:21.848 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:21 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:21.889 2 INFO neutron.agent.securitygroups_rpc [None req-805aa17d-073e-4cce-bd8b-0d97ba7177f1 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['c1413bcd-d38e-460c-9cb8-b261de636fc0']#033[00m Dec 6 05:28:21 localhost dnsmasq[264317]: read /var/lib/neutron/dhcp/554f8bfc-4948-4da6-b83f-b76a7fcdb636/addn_hosts - 0 addresses Dec 
6 05:28:21 localhost dnsmasq-dhcp[264317]: read /var/lib/neutron/dhcp/554f8bfc-4948-4da6-b83f-b76a7fcdb636/host Dec 6 05:28:21 localhost podman[265018]: 2025-12-06 10:28:21.914886871 +0000 UTC m=+0.044228633 container kill 456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-554f8bfc-4948-4da6-b83f-b76a7fcdb636, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 6 05:28:21 localhost dnsmasq-dhcp[264317]: read /var/lib/neutron/dhcp/554f8bfc-4948-4da6-b83f-b76a7fcdb636/opts Dec 6 05:28:22 localhost ovn_controller[131684]: 2025-12-06T10:28:22Z|00467|binding|INFO|Releasing lport 1b48f4e5-985c-4b80-a240-4e134742d5e5 from this chassis (sb_readonly=0) Dec 6 05:28:22 localhost kernel: device tap1b48f4e5-98 left promiscuous mode Dec 6 05:28:22 localhost ovn_controller[131684]: 2025-12-06T10:28:22Z|00468|binding|INFO|Setting lport 1b48f4e5-985c-4b80-a240-4e134742d5e5 down in Southbound Dec 6 05:28:22 localhost nova_compute[237281]: 2025-12-06 10:28:22.875 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:22 localhost nova_compute[237281]: 2025-12-06 10:28:22.907 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:22.908 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to 
row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-554f8bfc-4948-4da6-b83f-b76a7fcdb636', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-554f8bfc-4948-4da6-b83f-b76a7fcdb636', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '077494e5b56a49e2a7c273a073b20032', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2df9c0e2-a153-40e2-8e68-ef3878ced8bb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1b48f4e5-985c-4b80-a240-4e134742d5e5) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:28:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:22.911 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 1b48f4e5-985c-4b80-a240-4e134742d5e5 in datapath 554f8bfc-4948-4da6-b83f-b76a7fcdb636 unbound from our chassis#033[00m Dec 6 05:28:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:22.913 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 554f8bfc-4948-4da6-b83f-b76a7fcdb636 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:28:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:22.913 137360 DEBUG oslo.privsep.daemon [-] privsep: 
reply[f2986da6-a187-40a8-850a-507d66908a3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:28:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:22.997 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:28:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:22.998 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.045 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.047 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '525b9c90-0a37-4ce5-bba5-22da2957b56b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:28:22.998673', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4aca9b88-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.208821913, 'message_signature': '54257067637a1fa4137f7fc8ae9626bf4d97a9dd17fb4a013159ad8be943b20d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:28:22.998673', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4acab398-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.208821913, 'message_signature': '86dac9deccd38391fdc243a137e3b904ae46f8956c38bdac9f948a6524ac058f'}]}, 'timestamp': '2025-12-06 10:28:23.047494', '_unique_id': '328ba1c0cf86410198f088e3cbbda9e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.049 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.051 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.073 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 21340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d19b1bc-d475-4353-a5ae-c4b867fce0db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21340000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:28:23.051328', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '4aceca96-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.283765401, 'message_signature': '5bfe7bc5154cb02cb00a62c0b9c3c3e60b8a7c5d94afe8b76f511d1adc088085'}]}, 'timestamp': '2025-12-06 10:28:23.074303', '_unique_id': 'abdef0daa98b4c75b11eba8e6193c8fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.075 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.076 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.079 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef72c172-9e06-41a7-9d75-43094af06ef3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:28:23.076688', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '4acfb6b8-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.286803955, 'message_signature': '88c5b266164d3735fd1c0b00bc76adccd3de9f03cca70f5faa45141099764c20'}]}, 'timestamp': '2025-12-06 10:28:23.080352', '_unique_id': '0506cfdee9dd40338baa48a5c72e60e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.081 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.083 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82da8bf2-12ee-40db-b0f1-287be0133a8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:28:23.083017', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '4ad031a6-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.286803955, 'message_signature': '66ab6e3fe0dffefad3733113169e78a5ff3769369926208f43527b01d324796c'}]}, 'timestamp': '2025-12-06 10:28:23.083497', '_unique_id': '80d33299c12b4b8fa3280c9c6257f67f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.084 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.085 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.085 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.086 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'ad8ed0a9-46db-48f7-9a38-3683bdfa76ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:28:23.085774', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4ad09f9c-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.208821913, 'message_signature': '84d56587b7271449bcff40f34bd27fdcfa0f3291376f25392435376a1df2a6bf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:28:23.085774', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4ad0b02c-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.208821913, 'message_signature': '0407a16803c47cb5afcdb92add02f8e924728c7e2ea4ea62fdb3b412ae434003'}]}, 'timestamp': '2025-12-06 10:28:23.086697', '_unique_id': 'aa0df8c8370242908a518a0ce40e5e37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.087 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.088 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.089 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5c54f688-45f4-4080-86e2-c7a5470cea35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:28:23.089024', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '4ad11c6a-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.286803955, 'message_signature': '2e72bad1fae084ca2a01a45510e23a186e72c1afa6377398c95a09864ca8cd94'}]}, 'timestamp': '2025-12-06 10:28:23.089526', '_unique_id': '593caad35074458b96e17d0e75f3bf06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.090 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:28:23.091 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.108 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.109 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34f6d492-5025-4423-8cb1-fa29e3018b2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:28:23.091794', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4ad42018-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.301935461, 'message_signature': '78580b0c611e3eaa195acb0f5529d779264237da00e1472552197361cc72be1c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:28:23.091794', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4ad438c8-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.301935461, 'message_signature': 'dd76986d897424a354b4b077ec33c48da984c58a3ebe68f3ec42ecb9cb62ae33'}]}, 'timestamp': '2025-12-06 10:28:23.109916', '_unique_id': 'e3f41f8a164148398512dea306a6d61e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): 
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.111 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:28:23.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.112 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ccc1c40-e26e-4749-ae81-d870e1df9cec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:28:23.112684', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': 
{'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '4ad4b9ce-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.286803955, 'message_signature': 'de00800bc7e3de2ef623879f5228a0c405eb7900f372481ae9493747f7dd4c55'}]}, 'timestamp': '2025-12-06 10:28:23.113199', '_unique_id': '01c6dbf3131e4a7c83d2d159ae62873d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:28:23.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.114 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.115 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.116 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'cb2fa36f-26b9-4657-bfcb-2ec01af8545d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:28:23.115531', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4ad528dc-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.301935461, 'message_signature': '875b3c2902a164e1f44ac5eaf2a2c9b5a368ba1056d34f3808cddba21bdc259b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:28:23.115531', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4ad53c28-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.301935461, 'message_signature': '8b8a90bff1d7dfdb9bbdded793adb4b8892e7eade128330f9130943712255c67'}]}, 'timestamp': '2025-12-06 10:28:23.116505', '_unique_id': 'f3ade91fe3164700b06d26bc6f830423'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.117 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.119 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.119 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging [-] Could 
not send notification to notifications. Payload={'message_id': 'd6f86d36-2390-4cfd-bb0f-588cb07e2731', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:28:23.119001', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4ad5af6e-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.208821913, 'message_signature': '37e3f101b20b457f17d5bb5935125a3828ce27d7c2c5902089869b466bdbef3f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:28:23.119001', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 
'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4ad5c080-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.208821913, 'message_signature': 'e06887761249b6382a854e913d14faecbe7e0b418a32d7f8f74072ea638f2d0c'}]}, 'timestamp': '2025-12-06 10:28:23.119923', '_unique_id': '8393879358f94f2ca3556e0a01184444'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:28:23.120 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.120 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.122 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.122 12 DEBUG 
ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47b962a3-8d6f-4ca9-abfa-499b95c5f86b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:28:23.122422', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4ad63560-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.208821913, 'message_signature': 'f5d9b4ba664d3fd32c80221e461f7b2a222512987a6a14fe02ffd384b3d337e7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 
'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:28:23.122422', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4ad64ac8-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.208821913, 'message_signature': '3965db57fd5a4d6dda8c4c4082811b95955a167d8b7717200e72fb90ded28a6f'}]}, 'timestamp': '2025-12-06 10:28:23.123441', '_unique_id': 'e4a94f4adf4948fc9bfe6bbc2df0bdcf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 
12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging 
self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.124 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.125 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f955cba-e0c2-4c43-8b6a-395994eb6503', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:28:23.125884', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '4ad6bc9c-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.286803955, 'message_signature': '917fbb730e1577285509f6c35567e8fb765351a9ac2f1800d6dd38b16f16a29e'}]}, 'timestamp': '2025-12-06 10:28:23.126376', '_unique_id': '1a5a2e068a7345af896a79a406621d79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:28:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:28:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.127 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.128 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26db4327-eb58-4b72-a473-aa88b669d922', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:28:23.128568', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '4ad72524-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.286803955, 'message_signature': 'b220f57f93097fd921ab7508f4f6499f69d24da2d762f749d0df19cccbc2c5d5'}]}, 'timestamp': '2025-12-06 10:28:23.129080', '_unique_id': '0728b0449e1a44d9b2cd4c836ddecfad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.131 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '66c03d07-6721-412b-9e5d-22277d589ce9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:28:23.131426', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '4ad79464-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.286803955, 'message_signature': 'cddc3d84bf8cca8cf7f9e0b00c664b331fd10f06dcea295da234da9f3469d45c'}]}, 'timestamp': '2025-12-06 10:28:23.131924', '_unique_id': 'f7151a2865f04b52804e2affa6dd330d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.132 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.134 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2848c39-38f1-41ae-8a6c-bd763b62ae63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:28:23.134167', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '4ad7ff80-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.286803955, 'message_signature': '238131cf9a6c1ab37d552d37fc8f5aace5989d5ee502d0d549f17668dff353cc'}]}, 'timestamp': '2025-12-06 10:28:23.134633', '_unique_id': '44019fbcd89543d6985cc82fbeb47902'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.135 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.136 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.136 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.137 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21f4ab30-01a7-4792-979c-7b5fa239a077', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:28:23.136951', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4ad86c40-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.208821913, 'message_signature': '2e9da5578693ed7d586203cd1c74d00b714ebe05e772125f7e888614732e0d81'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:28:23.136951', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4ad87df2-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.208821913, 'message_signature': 'def7331f75cb8a642e28397604fd3b85c99e034ec91efe94eaabbd80ca24f41d'}]}, 'timestamp': '2025-12-06 10:28:23.137874', '_unique_id': '55311fb97c714bf18f8620488920e0e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.138 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.139 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.140 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95e61f63-ebfb-484a-90d5-7923b066d695', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:28:23.140055', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '4ad8e562-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.283765401, 'message_signature': '7af818c0fccc0e2dcf715e29b26603776813860ba24ab882abb731ce4a8686a5'}]}, 'timestamp': '2025-12-06 10:28:23.140501', '_unique_id': '70991570471d4daa82213ab930948f7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:28:23 localhost
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.141 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.142 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:28:23.143 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.143 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f5d1c24-a2ad-4ed7-9f22-913c182e9b66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:28:23.143109', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 
'vda'}, 'message_id': '4ad95d08-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.301935461, 'message_signature': '87d21042c614166f6ee2491e6ad06acc839918928a3e04fd1e3bb859d72791ca'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:28:23.143109', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4ad96dfc-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.301935461, 'message_signature': '7e39ea0dcf6a04fe733a21317fe1f2b0599ade5c34e89c0b1e30cf4c7955771a'}]}, 'timestamp': '2025-12-06 10:28:23.144024', '_unique_id': '3ff546499e3249059b3e34556b5b3e6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.145 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:28:23.146 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60bb1fbe-5acd-4043-941f-fa713a101675', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:28:23.146676', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '4ad9e55c-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.286803955, 'message_signature': 
'b327631ed2f5fa363c1fb8b4efb8853b923e98c7fef08d9ad74051f1db82b880'}]}, 'timestamp': '2025-12-06 10:28:23.147013', '_unique_id': '745db1dd48ec4768b0802a74c66f18c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR 
oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.147 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.148 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.148 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.148 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '13929367-c98b-4333-b508-b086191966b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:28:23.148384', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4ada276a-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.208821913, 'message_signature': '0bade20b1b4668793ea62175e438b86e2d9d25965e97159e975d92a61a91ffcd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:28:23.148384', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4ada325a-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.208821913, 'message_signature': 'a0d82fa31014d9a6867cbdd0b9b7f480693175e37e16bbff5aa9f5a26672e53b'}]}, 'timestamp': '2025-12-06 10:28:23.148966', '_unique_id': '44fbea607bc54d609acd4e42dd3f69ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.149 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.150 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0660d24f-6001-4c4e-b341-898d782a9c48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:28:23.150387', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '4ada75d0-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13290.286803955, 'message_signature': 'dbe822c21673506616be0f241ae83bfd7368505f1645f7d913f0a4446c1c266c'}]}, 'timestamp': '2025-12-06 10:28:23.150686', '_unique_id': 'ed327430dbde44298302d38696a83be6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:28:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.151 12 ERROR oslo_messaging.notify.messaging Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:28:23.152 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:28:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:28:23.152 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:28:23 localhost podman[197801]: time="2025-12-06T10:28:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:28:23 localhost podman[197801]: @ - - [06/Dec/2025:10:28:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147785 "" "Go-http-client/1.1" Dec 6 05:28:23 localhost podman[197801]: @ - - [06/Dec/2025:10:28:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16930 "" "Go-http-client/1.1" Dec 6 05:28:23 localhost nova_compute[237281]: 2025-12-06 10:28:23.482 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:28:23 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:23.612 2 INFO neutron.agent.securitygroups_rpc [None req-1938a2f7-b0c8-4cce-9cf1-9618d43bc77a 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['c1413bcd-d38e-460c-9cb8-b261de636fc0']#033[00m Dec 6 05:28:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4069 DF PROTO=TCP SPT=46158 DPT=9102 SEQ=399057658 ACK=0 WINDOW=32640 RES=0x00 SYN 
URGP=0 OPT (020405500402080A1DE2E1490000000001030307) Dec 6 05:28:24 localhost nova_compute[237281]: 2025-12-06 10:28:24.342 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4070 DF PROTO=TCP SPT=46158 DPT=9102 SEQ=399057658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE2E5470000000001030307) Dec 6 05:28:25 localhost dnsmasq[264317]: exiting on receipt of SIGTERM Dec 6 05:28:25 localhost podman[265057]: 2025-12-06 10:28:25.454079994 +0000 UTC m=+0.058180283 container kill 456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-554f8bfc-4948-4da6-b83f-b76a7fcdb636, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:28:25 localhost systemd[1]: libpod-456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63.scope: Deactivated successfully. 
Dec 6 05:28:25 localhost podman[265070]: 2025-12-06 10:28:25.528825686 +0000 UTC m=+0.059550255 container died 456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-554f8bfc-4948-4da6-b83f-b76a7fcdb636, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:28:25 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63-userdata-shm.mount: Deactivated successfully. Dec 6 05:28:25 localhost podman[265070]: 2025-12-06 10:28:25.562618437 +0000 UTC m=+0.093342976 container cleanup 456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-554f8bfc-4948-4da6-b83f-b76a7fcdb636, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 6 05:28:25 localhost systemd[1]: libpod-conmon-456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63.scope: Deactivated successfully. 
Dec 6 05:28:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44664 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=761886991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE2E7880000000001030307) Dec 6 05:28:25 localhost podman[265074]: 2025-12-06 10:28:25.616355452 +0000 UTC m=+0.135185615 container remove 456b59d25e9a8538e0a300a3c439843e8fa5ad84b90480757de56695f7c1ef63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-554f8bfc-4948-4da6-b83f-b76a7fcdb636, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:28:26 localhost systemd[1]: var-lib-containers-storage-overlay-c23eee3e52cdd787824232e8d45875c6a97192fab7ad5256fc432da3a710dd2a-merged.mount: Deactivated successfully. Dec 6 05:28:26 localhost systemd[1]: run-netns-qdhcp\x2d554f8bfc\x2d4948\x2d4da6\x2db83f\x2db76a7fcdb636.mount: Deactivated successfully. 
Dec 6 05:28:26 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:26.528 219384 INFO neutron.agent.dhcp.agent [None req-ebea9611-b025-43d4-b66d-68aace357415 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:28:26 localhost nova_compute[237281]: 2025-12-06 10:28:26.852 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:27.044 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:28:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4071 DF PROTO=TCP SPT=46158 DPT=9102 SEQ=399057658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE2ED470000000001030307) Dec 6 05:28:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:28:27 localhost podman[265095]: 2025-12-06 10:28:27.54949209 +0000 UTC m=+0.085457553 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible) Dec 6 05:28:27 localhost podman[265095]: 2025-12-06 10:28:27.55988503 +0000 UTC m=+0.095850493 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git) Dec 6 05:28:27 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:28:27 localhost ovn_controller[131684]: 2025-12-06T10:28:27Z|00469|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:28:27 localhost nova_compute[237281]: 2025-12-06 10:28:27.686 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3976 DF PROTO=TCP SPT=59696 DPT=9102 SEQ=1362246866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE2F1870000000001030307) Dec 6 05:28:28 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:28.619 2 INFO neutron.agent.securitygroups_rpc [None req-8857e2af-6e9d-484b-a27c-6abf34e2ee24 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['3aa3a52f-9e27-4962-9f7e-bcb492ad91f6']#033[00m Dec 6 05:28:29 localhost nova_compute[237281]: 2025-12-06 10:28:29.345 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:30 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:30.138 2 INFO neutron.agent.securitygroups_rpc [None req-7607ef6d-be33-45a9-9c73-1bbc1afac25a 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated 
['3aa3a52f-9e27-4962-9f7e-bcb492ad91f6']#033[00m Dec 6 05:28:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4072 DF PROTO=TCP SPT=46158 DPT=9102 SEQ=399057658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE2FD070000000001030307) Dec 6 05:28:31 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:31.608 2 INFO neutron.agent.securitygroups_rpc [None req-ec301c41-8a1b-40bb-8ba2-2407089f0bb6 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['3aa3a52f-9e27-4962-9f7e-bcb492ad91f6']#033[00m Dec 6 05:28:31 localhost nova_compute[237281]: 2025-12-06 10:28:31.855 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:28:32 localhost podman[265115]: 2025-12-06 10:28:32.573965971 +0000 UTC m=+0.102606242 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:28:32 localhost podman[265115]: 2025-12-06 10:28:32.580214233 +0000 UTC m=+0.108854504 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:28:32 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:28:32 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:32.655 2 INFO neutron.agent.securitygroups_rpc [None req-934d7292-2118-485c-b980-5fd76131d63b 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['495a9aea-2aa1-4b21-94e0-6d1f8f1d63b6']#033[00m Dec 6 05:28:33 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:33.468 2 INFO neutron.agent.securitygroups_rpc [None req-0295cb1d-dde0-467b-90e1-ead31cab67f0 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['3aa3a52f-9e27-4962-9f7e-bcb492ad91f6']#033[00m Dec 6 05:28:33 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:33.570 2 INFO neutron.agent.securitygroups_rpc [None req-2ad0c6e5-95ad-4d19-996b-9c3f65e11043 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['495a9aea-2aa1-4b21-94e0-6d1f8f1d63b6']#033[00m Dec 6 05:28:33 localhost ovn_controller[131684]: 2025-12-06T10:28:33Z|00470|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:28:33 localhost nova_compute[237281]: 2025-12-06 10:28:33.639 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:34 localhost nova_compute[237281]: 2025-12-06 10:28:34.347 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:34 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:34.901 2 INFO neutron.agent.securitygroups_rpc [None req-dd73349b-65f7-4a9c-abe4-0301ed36d388 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['3aa3a52f-9e27-4962-9f7e-bcb492ad91f6']#033[00m Dec 6 05:28:35 localhost podman[265156]: 2025-12-06 10:28:35.912102771 +0000 UTC 
m=+0.065340573 container kill 4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-21e92d95-c8a1-4ccf-8e3f-294606fc7261, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:28:35 localhost dnsmasq[263657]: read /var/lib/neutron/dhcp/21e92d95-c8a1-4ccf-8e3f-294606fc7261/addn_hosts - 0 addresses Dec 6 05:28:35 localhost dnsmasq-dhcp[263657]: read /var/lib/neutron/dhcp/21e92d95-c8a1-4ccf-8e3f-294606fc7261/host Dec 6 05:28:35 localhost dnsmasq-dhcp[263657]: read /var/lib/neutron/dhcp/21e92d95-c8a1-4ccf-8e3f-294606fc7261/opts Dec 6 05:28:35 localhost systemd[1]: tmp-crun.rJpTuQ.mount: Deactivated successfully. 
Dec 6 05:28:36 localhost ovn_controller[131684]: 2025-12-06T10:28:36Z|00471|binding|INFO|Releasing lport 0bf76b92-a0bd-422d-af31-91d15aa2cd54 from this chassis (sb_readonly=0) Dec 6 05:28:36 localhost kernel: device tap0bf76b92-a0 left promiscuous mode Dec 6 05:28:36 localhost ovn_controller[131684]: 2025-12-06T10:28:36Z|00472|binding|INFO|Setting lport 0bf76b92-a0bd-422d-af31-91d15aa2cd54 down in Southbound Dec 6 05:28:36 localhost nova_compute[237281]: 2025-12-06 10:28:36.182 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:36.197 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-21e92d95-c8a1-4ccf-8e3f-294606fc7261', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-21e92d95-c8a1-4ccf-8e3f-294606fc7261', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '077494e5b56a49e2a7c273a073b20032', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c8a331d-c7fd-4791-939c-e2225299c794, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0bf76b92-a0bd-422d-af31-91d15aa2cd54) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:28:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:36.199 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 0bf76b92-a0bd-422d-af31-91d15aa2cd54 in datapath 21e92d95-c8a1-4ccf-8e3f-294606fc7261 unbound from our chassis#033[00m Dec 6 05:28:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:36.200 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 21e92d95-c8a1-4ccf-8e3f-294606fc7261 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:28:36 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:36.202 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[06149a0b-64ff-4144-8a37-ee2cf41b6107]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:28:36 localhost nova_compute[237281]: 2025-12-06 10:28:36.209 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:36 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:36.276 2 INFO neutron.agent.securitygroups_rpc [None req-5187579b-678e-4964-850d-a894f5f62b62 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['3aa3a52f-9e27-4962-9f7e-bcb492ad91f6']#033[00m Dec 6 05:28:36 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:36.444 2 INFO neutron.agent.securitygroups_rpc [None req-ce68f174-c21c-4096-9bed-2366e93ed78e 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['b285af7a-9ea0-4847-8b8e-ea3627e21686']#033[00m Dec 6 05:28:36 localhost nova_compute[237281]: 2025-12-06 10:28:36.857 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:38 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:38.007 2 INFO neutron.agent.securitygroups_rpc [None req-d1fa58ec-eb54-4194-adea-54a0cd68fbda 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['b285af7a-9ea0-4847-8b8e-ea3627e21686']#033[00m Dec 6 05:28:38 localhost dnsmasq[263657]: exiting on receipt of SIGTERM Dec 6 05:28:38 localhost podman[265194]: 2025-12-06 10:28:38.039148083 +0000 UTC m=+0.063852298 container kill 4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-21e92d95-c8a1-4ccf-8e3f-294606fc7261, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 6 05:28:38 localhost systemd[1]: libpod-4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e.scope: Deactivated successfully. 
Dec 6 05:28:38 localhost podman[265206]: 2025-12-06 10:28:38.128177594 +0000 UTC m=+0.074097553 container died 4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-21e92d95-c8a1-4ccf-8e3f-294606fc7261, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 6 05:28:38 localhost podman[265206]: 2025-12-06 10:28:38.163075749 +0000 UTC m=+0.108995678 container cleanup 4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-21e92d95-c8a1-4ccf-8e3f-294606fc7261, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3) Dec 6 05:28:38 localhost systemd[1]: libpod-conmon-4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e.scope: Deactivated successfully. 
Dec 6 05:28:38 localhost podman[265208]: 2025-12-06 10:28:38.20271975 +0000 UTC m=+0.137212077 container remove 4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-21e92d95-c8a1-4ccf-8e3f-294606fc7261, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:28:38 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:38.233 219384 INFO neutron.agent.dhcp.agent [None req-efd5d62a-d437-4a1c-919d-d616d8b99a8c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:28:39 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:39.033 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:28:39 localhost systemd[1]: var-lib-containers-storage-overlay-95067c65d9505423cf514e60a06ac59857bd1f2f1e79c08bcde3525cf96a6569-merged.mount: Deactivated successfully. Dec 6 05:28:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4eed49f2d0001fa5e0dfcee847bcf2d8a17bc3937a336c2c4ba7d074203b4a1e-userdata-shm.mount: Deactivated successfully. Dec 6 05:28:39 localhost systemd[1]: run-netns-qdhcp\x2d21e92d95\x2dc8a1\x2d4ccf\x2d8e3f\x2d294606fc7261.mount: Deactivated successfully. 
Dec 6 05:28:39 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:39.056 2 INFO neutron.agent.securitygroups_rpc [None req-105c177e-f6d4-4b6a-b705-2ac657553685 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['b285af7a-9ea0-4847-8b8e-ea3627e21686']#033[00m Dec 6 05:28:39 localhost ovn_controller[131684]: 2025-12-06T10:28:39Z|00473|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:28:39 localhost nova_compute[237281]: 2025-12-06 10:28:39.313 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:39 localhost nova_compute[237281]: 2025-12-06 10:28:39.349 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4073 DF PROTO=TCP SPT=46158 DPT=9102 SEQ=399057658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE31D880000000001030307) Dec 6 05:28:39 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:39.465 2 INFO neutron.agent.securitygroups_rpc [None req-5cd2c807-57de-4644-9306-8f96e8d68cdd 6c7454fdafb44011b8e9feef9756c691 8cc2d6d72f2746858fe2176e852e3ae5 - - default default] Security group rule updated ['6efce53b-25c7-43b7-be81-1a3b73c5b469']#033[00m Dec 6 05:28:39 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:39.522 2 INFO neutron.agent.securitygroups_rpc [None req-b590736e-55ff-473a-ba90-a3543847676a 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['b285af7a-9ea0-4847-8b8e-ea3627e21686']#033[00m Dec 6 05:28:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:28:40 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:40.490 2 INFO neutron.agent.securitygroups_rpc [None req-2b46d2a4-5310-4e81-bc3f-f954fdc272be 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['b285af7a-9ea0-4847-8b8e-ea3627e21686']#033[00m Dec 6 05:28:40 localhost podman[265236]: 2025-12-06 10:28:40.548934822 +0000 UTC m=+0.083176442 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3) Dec 6 05:28:40 localhost podman[265236]: 2025-12-06 10:28:40.635295652 +0000 UTC m=+0.169537282 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 05:28:40 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:28:40 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:40.859 2 INFO neutron.agent.securitygroups_rpc [None req-cf028e09-71ea-4558-aec8-a8ba88c4fb3a 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['b285af7a-9ea0-4847-8b8e-ea3627e21686']#033[00m Dec 6 05:28:41 localhost nova_compute[237281]: 2025-12-06 10:28:41.860 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:42 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:42.007 2 INFO neutron.agent.securitygroups_rpc [None req-a7fb98df-bd98-4b10-a458-3c94131ced2e 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['b285af7a-9ea0-4847-8b8e-ea3627e21686']#033[00m Dec 6 05:28:42 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:42.898 2 INFO neutron.agent.securitygroups_rpc [None req-ccc1dba4-056c-4b92-969e-fa8a572ef433 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['b285af7a-9ea0-4847-8b8e-ea3627e21686']#033[00m Dec 6 05:28:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:28:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:28:43 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:43.481 2 INFO neutron.agent.securitygroups_rpc [None req-2b20ea12-db4f-43f7-ac7e-05997f3b95d5 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['b285af7a-9ea0-4847-8b8e-ea3627e21686']#033[00m Dec 6 05:28:43 localhost podman[265261]: 2025-12-06 10:28:43.546750675 +0000 UTC m=+0.079470029 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:28:43 localhost podman[265261]: 2025-12-06 10:28:43.558083774 +0000 UTC m=+0.090803128 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, 
config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:28:43 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:28:43 localhost systemd[1]: tmp-crun.QjikIp.mount: Deactivated successfully. 
Dec 6 05:28:43 localhost podman[265262]: 2025-12-06 10:28:43.613023595 +0000 UTC m=+0.141937452 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:28:43 localhost podman[265262]: 2025-12-06 10:28:43.622816988 +0000 UTC m=+0.151730855 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:28:43 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. 
Dec 6 05:28:44 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:44.130 2 INFO neutron.agent.securitygroups_rpc [None req-5e34ead0-42a8-4358-ba50-40a226c82134 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['b285af7a-9ea0-4847-8b8e-ea3627e21686']#033[00m Dec 6 05:28:44 localhost nova_compute[237281]: 2025-12-06 10:28:44.354 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:46 localhost openstack_network_exporter[199751]: ERROR 10:28:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:28:46 localhost openstack_network_exporter[199751]: ERROR 10:28:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:28:46 localhost openstack_network_exporter[199751]: ERROR 10:28:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:28:46 localhost openstack_network_exporter[199751]: ERROR 10:28:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:28:46 localhost openstack_network_exporter[199751]: Dec 6 05:28:46 localhost openstack_network_exporter[199751]: ERROR 10:28:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:28:46 localhost openstack_network_exporter[199751]: Dec 6 05:28:46 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:46.585 2 INFO neutron.agent.securitygroups_rpc [None req-e3361c9c-6502-4ec6-b487-85bce75baa18 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['5c6f4886-8216-483d-9aad-fed6a791016c']#033[00m Dec 6 05:28:46 localhost nova_compute[237281]: 2025-12-06 10:28:46.861 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:28:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:28:48 localhost podman[265305]: 2025-12-06 10:28:48.556043648 +0000 UTC m=+0.087137365 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm) Dec 6 05:28:48 localhost podman[265305]: 2025-12-06 10:28:48.596400931 +0000 UTC m=+0.127494668 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible) Dec 6 05:28:48 localhost podman[265304]: 2025-12-06 10:28:48.608664669 +0000 UTC m=+0.143909274 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Dec 6 05:28:48 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:28:48 localhost podman[265304]: 2025-12-06 10:28:48.643371598 +0000 UTC m=+0.178616213 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:28:48 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:28:49 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:49.074 2 INFO neutron.agent.securitygroups_rpc [None req-357d23b1-10c0-4c7c-9f9b-f4529993b79b 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['084570b5-9023-4703-94cb-e9809802e4da']#033[00m Dec 6 05:28:49 localhost nova_compute[237281]: 2025-12-06 10:28:49.358 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:51 localhost nova_compute[237281]: 2025-12-06 10:28:51.865 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:52 localhost nova_compute[237281]: 2025-12-06 10:28:52.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:28:52 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:52.907 2 INFO neutron.agent.securitygroups_rpc [None req-740ca41a-88cc-42f8-a9e1-a18b3b294470 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['084570b5-9023-4703-94cb-e9809802e4da']#033[00m Dec 6 05:28:53 localhost podman[197801]: time="2025-12-06T10:28:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:28:53 localhost podman[197801]: @ - - [06/Dec/2025:10:28:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:28:53 localhost podman[197801]: @ - - [06/Dec/2025:10:28:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15972 "" "Go-http-client/1.1" Dec 6 05:28:53 localhost 
kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45991 DF PROTO=TCP SPT=51974 DPT=9102 SEQ=3472360902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE3567A0000000001030307) Dec 6 05:28:54 localhost nova_compute[237281]: 2025-12-06 10:28:54.371 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45992 DF PROTO=TCP SPT=51974 DPT=9102 SEQ=3472360902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE35A870000000001030307) Dec 6 05:28:55 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:55.548 219384 INFO neutron.agent.linux.ip_lib [None req-c21b23d6-603d-4e56-8b2a-e62f1b285450 - - - - - -] Device tapdf76bb28-3f cannot be used as it has no MAC address#033[00m Dec 6 05:28:55 localhost nova_compute[237281]: 2025-12-06 10:28:55.614 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:55 localhost kernel: device tapdf76bb28-3f entered promiscuous mode Dec 6 05:28:55 localhost ovn_controller[131684]: 2025-12-06T10:28:55Z|00474|binding|INFO|Claiming lport df76bb28-3f42-4c01-a3f9-c8ec6a089f9c for this chassis. 
Dec 6 05:28:55 localhost ovn_controller[131684]: 2025-12-06T10:28:55Z|00475|binding|INFO|df76bb28-3f42-4c01-a3f9-c8ec6a089f9c: Claiming unknown Dec 6 05:28:55 localhost NetworkManager[5965]: [1765016935.6271] manager: (tapdf76bb28-3f): new Generic device (/org/freedesktop/NetworkManager/Devices/77) Dec 6 05:28:55 localhost nova_compute[237281]: 2025-12-06 10:28:55.624 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:55 localhost systemd-udevd[265349]: Network interface NamePolicy= disabled on kernel command line. Dec 6 05:28:55 localhost journal[186952]: ethtool ioctl error on tapdf76bb28-3f: No such device Dec 6 05:28:55 localhost journal[186952]: ethtool ioctl error on tapdf76bb28-3f: No such device Dec 6 05:28:55 localhost nova_compute[237281]: 2025-12-06 10:28:55.665 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:55 localhost ovn_controller[131684]: 2025-12-06T10:28:55Z|00476|binding|INFO|Setting lport df76bb28-3f42-4c01-a3f9-c8ec6a089f9c ovn-installed in OVS Dec 6 05:28:55 localhost journal[186952]: ethtool ioctl error on tapdf76bb28-3f: No such device Dec 6 05:28:55 localhost ovn_controller[131684]: 2025-12-06T10:28:55Z|00477|binding|INFO|Setting lport df76bb28-3f42-4c01-a3f9-c8ec6a089f9c up in Southbound Dec 6 05:28:55 localhost journal[186952]: ethtool ioctl error on tapdf76bb28-3f: No such device Dec 6 05:28:55 localhost journal[186952]: ethtool ioctl error on tapdf76bb28-3f: No such device Dec 6 05:28:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:55.679 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], 
options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-3898807e-38ab-4604-999c-d6177473b70f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3898807e-38ab-4604-999c-d6177473b70f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d62107f03194685a0d4a3a8f59ce292', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=813fe017-af6c-4692-aae9-aebb5bb816e6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=df76bb28-3f42-4c01-a3f9-c8ec6a089f9c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:28:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:55.681 137259 INFO neutron.agent.ovn.metadata.agent [-] Port df76bb28-3f42-4c01-a3f9-c8ec6a089f9c in datapath 3898807e-38ab-4604-999c-d6177473b70f bound to our chassis#033[00m Dec 6 05:28:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:55.684 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port 0a5657b5-4117-4f21-99ac-d9be58fd5c0c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:28:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:55.684 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3898807e-38ab-4604-999c-d6177473b70f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 
05:28:55 localhost journal[186952]: ethtool ioctl error on tapdf76bb28-3f: No such device Dec 6 05:28:55 localhost ovn_metadata_agent[137254]: 2025-12-06 10:28:55.685 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[1dbdc488-f9fd-4219-be28-7d36472b0f67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:28:55 localhost journal[186952]: ethtool ioctl error on tapdf76bb28-3f: No such device Dec 6 05:28:55 localhost journal[186952]: ethtool ioctl error on tapdf76bb28-3f: No such device Dec 6 05:28:55 localhost nova_compute[237281]: 2025-12-06 10:28:55.714 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:55 localhost nova_compute[237281]: 2025-12-06 10:28:55.745 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:28:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4074 DF PROTO=TCP SPT=46158 DPT=9102 SEQ=399057658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE35D870000000001030307) Dec 6 05:28:55 localhost nova_compute[237281]: 2025-12-06 10:28:55.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:28:56 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:56.222 2 INFO neutron.agent.securitygroups_rpc [None req-a531063b-87cb-4dbe-92b4-4b43d452db2a 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['ccdc17dd-567e-432c-a22f-6290734f6c72']#033[00m Dec 6 05:28:56 localhost podman[265420]: Dec 6 
05:28:56 localhost podman[265420]: 2025-12-06 10:28:56.732682993 +0000 UTC m=+0.086223637 container create c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3898807e-38ab-4604-999c-d6177473b70f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:28:56 localhost podman[265420]: 2025-12-06 10:28:56.686461659 +0000 UTC m=+0.040002313 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:28:56 localhost systemd[1]: Started libpod-conmon-c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17.scope. Dec 6 05:28:56 localhost systemd[1]: Started libcrun container. 
Dec 6 05:28:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/332ab12730bb03ff051e2f43f31c45cb478d4f648adc67ab0144cea3e538e0fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:28:56 localhost podman[265420]: 2025-12-06 10:28:56.835590201 +0000 UTC m=+0.189130895 container init c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3898807e-38ab-4604-999c-d6177473b70f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:28:56 localhost podman[265420]: 2025-12-06 10:28:56.842965849 +0000 UTC m=+0.196506493 container start c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3898807e-38ab-4604-999c-d6177473b70f, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:28:56 localhost dnsmasq[265439]: started, version 2.85 cachesize 150 Dec 6 05:28:56 localhost dnsmasq[265439]: DNS service limited to local subnets Dec 6 05:28:56 localhost dnsmasq[265439]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:28:56 localhost dnsmasq[265439]: warning: no upstream servers configured Dec 
6 05:28:56 localhost dnsmasq-dhcp[265439]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 6 05:28:56 localhost dnsmasq[265439]: read /var/lib/neutron/dhcp/3898807e-38ab-4604-999c-d6177473b70f/addn_hosts - 0 addresses
Dec 6 05:28:56 localhost dnsmasq-dhcp[265439]: read /var/lib/neutron/dhcp/3898807e-38ab-4604-999c-d6177473b70f/host
Dec 6 05:28:56 localhost dnsmasq-dhcp[265439]: read /var/lib/neutron/dhcp/3898807e-38ab-4604-999c-d6177473b70f/opts
Dec 6 05:28:56 localhost nova_compute[237281]: 2025-12-06 10:28:56.866 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:28:56 localhost nova_compute[237281]: 2025-12-06 10:28:56.884 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 05:28:56 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:56.927 219384 INFO neutron.agent.dhcp.agent [None req-5dd58fc7-fb61-4c5a-ba58-a8b52c97748d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:28:55Z, description=, device_id=479e12eb-4b2d-43e0-917d-286d183c2043, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6c901016-e256-4029-ac30-0bdc117bd3f3, ip_allocation=immediate, mac_address=fa:16:3e:cf:c3:e3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:28:49Z, description=, dns_domain=, id=3898807e-38ab-4604-999c-d6177473b70f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1743839374, port_security_enabled=True, project_id=1d62107f03194685a0d4a3a8f59ce292, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49374, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2557, status=ACTIVE, subnets=['44a2393f-fad4-4db1-88aa-7bc66fdc2ce7'], tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:28:53Z, vlan_transparent=None, network_id=3898807e-38ab-4604-999c-d6177473b70f, port_security_enabled=False, project_id=1d62107f03194685a0d4a3a8f59ce292, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2580, status=DOWN, tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:28:55Z on network 3898807e-38ab-4604-999c-d6177473b70f#033[00m
Dec 6 05:28:57 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:57.029 219384 INFO neutron.agent.dhcp.agent [None req-9f22a485-67be-42cd-8baf-b81a966779f9 - - - - - -] DHCP configuration for ports {'f4c5863b-c942-42a9-9744-4dbcfc29802c'} is completed#033[00m
Dec 6 05:28:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45993 DF PROTO=TCP SPT=51974 DPT=9102 SEQ=3472360902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE362870000000001030307)
Dec 6 05:28:57 localhost podman[265457]: 2025-12-06 10:28:57.146893489 +0000 UTC m=+0.047162093 container kill c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3898807e-38ab-4604-999c-d6177473b70f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:28:57 localhost dnsmasq[265439]: read /var/lib/neutron/dhcp/3898807e-38ab-4604-999c-d6177473b70f/addn_hosts - 1 addresses
Dec 6 05:28:57 localhost dnsmasq-dhcp[265439]: read /var/lib/neutron/dhcp/3898807e-38ab-4604-999c-d6177473b70f/host
Dec 6 05:28:57 localhost dnsmasq-dhcp[265439]: read /var/lib/neutron/dhcp/3898807e-38ab-4604-999c-d6177473b70f/opts
Dec 6 05:28:57 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:57.265 2 INFO neutron.agent.securitygroups_rpc [None req-5c5309d9-8e6f-4e2a-95de-2df7b4bcf11c 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['ccdc17dd-567e-432c-a22f-6290734f6c72']#033[00m
Dec 6 05:28:57 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:57.470 219384 INFO neutron.agent.dhcp.agent [None req-4e0ff206-ce5d-48f6-87b0-e5ba8b0ac913 - - - - - -] DHCP configuration for ports {'6c901016-e256-4029-ac30-0bdc117bd3f3'} is completed#033[00m
Dec 6 05:28:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.
Dec 6 05:28:57 localhost podman[265477]: 2025-12-06 10:28:57.811434487 +0000 UTC m=+0.086069912 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter)
Dec 6 05:28:57 localhost podman[265477]: 2025-12-06 10:28:57.829244165 +0000 UTC m=+0.103879570 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal)
Dec 6 05:28:57 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully.
Dec 6 05:28:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44665 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=761886991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE365880000000001030307)
Dec 6 05:28:58 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:58.229 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:28:55Z, description=, device_id=479e12eb-4b2d-43e0-917d-286d183c2043, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6c901016-e256-4029-ac30-0bdc117bd3f3, ip_allocation=immediate, mac_address=fa:16:3e:cf:c3:e3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:28:49Z, description=, dns_domain=, id=3898807e-38ab-4604-999c-d6177473b70f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1743839374, port_security_enabled=True, project_id=1d62107f03194685a0d4a3a8f59ce292, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49374, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2557, status=ACTIVE, subnets=['44a2393f-fad4-4db1-88aa-7bc66fdc2ce7'], tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:28:53Z, vlan_transparent=None, network_id=3898807e-38ab-4604-999c-d6177473b70f, port_security_enabled=False, project_id=1d62107f03194685a0d4a3a8f59ce292, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2580, status=DOWN, tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:28:55Z on network 3898807e-38ab-4604-999c-d6177473b70f#033[00m
Dec 6 05:28:58 localhost dnsmasq[265439]: read /var/lib/neutron/dhcp/3898807e-38ab-4604-999c-d6177473b70f/addn_hosts - 1 addresses
Dec 6 05:28:58 localhost dnsmasq-dhcp[265439]: read /var/lib/neutron/dhcp/3898807e-38ab-4604-999c-d6177473b70f/host
Dec 6 05:28:58 localhost podman[265513]: 2025-12-06 10:28:58.505210054 +0000 UTC m=+0.064418455 container kill c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3898807e-38ab-4604-999c-d6177473b70f, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 6 05:28:58 localhost dnsmasq-dhcp[265439]: read /var/lib/neutron/dhcp/3898807e-38ab-4604-999c-d6177473b70f/opts
Dec 6 05:28:58 localhost nova_compute[237281]: 2025-12-06 10:28:58.609 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:28:58 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:28:58.839 219384 INFO neutron.agent.dhcp.agent [None req-d67ada21-84fa-4192-ba1e-45337b12a29d - - - - - -] DHCP configuration for ports {'6c901016-e256-4029-ac30-0bdc117bd3f3'} is completed#033[00m
Dec 6 05:28:58 localhost nova_compute[237281]: 2025-12-06 10:28:58.881 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 05:28:59 localhost nova_compute[237281]: 2025-12-06 10:28:59.374 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:28:59 localhost neutron_sriov_agent[212548]: 2025-12-06 10:28:59.648 2 INFO neutron.agent.securitygroups_rpc [None req-90395d7e-94ef-432f-9fd2-c077a4cd025e 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['1bb68f63-644a-4814-a6e4-fa4af7bea8f9']#033[00m
Dec 6 05:28:59 localhost nova_compute[237281]: 2025-12-06 10:28:59.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 05:28:59 localhost nova_compute[237281]: 2025-12-06 10:28:59.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 6 05:29:00 localhost neutron_sriov_agent[212548]: 2025-12-06 10:29:00.541 2 INFO neutron.agent.securitygroups_rpc [None req-52e0550a-39ad-4cc9-b145-3cb2e3fb7a61 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['1bb68f63-644a-4814-a6e4-fa4af7bea8f9']#033[00m
Dec 6 05:29:01 localhost neutron_sriov_agent[212548]: 2025-12-06 10:29:01.055 2 INFO neutron.agent.securitygroups_rpc [None req-91202a33-e8b5-4af8-b2fa-de3d678972ed 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['1bb68f63-644a-4814-a6e4-fa4af7bea8f9']#033[00m
Dec 6 05:29:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45994 DF PROTO=TCP SPT=51974 DPT=9102 SEQ=3472360902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE372480000000001030307)
Dec 6 05:29:01 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:01.468 219384 INFO neutron.agent.linux.ip_lib [None req-4f3ef52d-9c75-403d-b449-aa9575f72921 - - - - - -] Device tapd04fddf3-50 cannot be used as it has no MAC address#033[00m
Dec 6 05:29:01 localhost nova_compute[237281]: 2025-12-06 10:29:01.497 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:29:01 localhost kernel: device tapd04fddf3-50 entered promiscuous mode
Dec 6 05:29:01 localhost NetworkManager[5965]: [1765016941.5058] manager: (tapd04fddf3-50): new Generic device (/org/freedesktop/NetworkManager/Devices/78)
Dec 6 05:29:01 localhost ovn_controller[131684]: 2025-12-06T10:29:01Z|00478|binding|INFO|Claiming lport d04fddf3-500a-4566-a6b3-9ea8a88c0fd2 for this chassis.
Dec 6 05:29:01 localhost ovn_controller[131684]: 2025-12-06T10:29:01Z|00479|binding|INFO|d04fddf3-500a-4566-a6b3-9ea8a88c0fd2: Claiming unknown
Dec 6 05:29:01 localhost nova_compute[237281]: 2025-12-06 10:29:01.508 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:29:01 localhost systemd-udevd[265544]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 05:29:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:01.523 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-bc5b5961-46c8-445d-b480-1d7b7148ce85', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc5b5961-46c8-445d-b480-1d7b7148ce85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9333e497c33f48709838789e8bdc913e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5141b29f-e554-4e6b-b87e-1b0924fe6749, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d04fddf3-500a-4566-a6b3-9ea8a88c0fd2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 6 05:29:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:01.525 137259 INFO neutron.agent.ovn.metadata.agent [-] Port d04fddf3-500a-4566-a6b3-9ea8a88c0fd2 in datapath bc5b5961-46c8-445d-b480-1d7b7148ce85 bound to our chassis#033[00m
Dec 6 05:29:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:01.527 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network bc5b5961-46c8-445d-b480-1d7b7148ce85 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Dec 6 05:29:01 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:01.529 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[d8c565e2-fe46-4983-b448-7dd8072c2c5b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 6 05:29:01 localhost ovn_controller[131684]: 2025-12-06T10:29:01Z|00480|binding|INFO|Setting lport d04fddf3-500a-4566-a6b3-9ea8a88c0fd2 ovn-installed in OVS
Dec 6 05:29:01 localhost ovn_controller[131684]: 2025-12-06T10:29:01Z|00481|binding|INFO|Setting lport d04fddf3-500a-4566-a6b3-9ea8a88c0fd2 up in Southbound
Dec 6 05:29:01 localhost nova_compute[237281]: 2025-12-06 10:29:01.550 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:29:01 localhost nova_compute[237281]: 2025-12-06 10:29:01.556 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:29:01 localhost nova_compute[237281]: 2025-12-06 10:29:01.594 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:29:01 localhost neutron_sriov_agent[212548]: 2025-12-06 10:29:01.605 2 INFO neutron.agent.securitygroups_rpc [None req-4e35aa05-7509-492e-8d4e-f5ff63137170 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['1bb68f63-644a-4814-a6e4-fa4af7bea8f9']#033[00m
Dec 6 05:29:01 localhost nova_compute[237281]: 2025-12-06 10:29:01.626 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:29:01 localhost nova_compute[237281]: 2025-12-06 10:29:01.870 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:29:01 localhost nova_compute[237281]: 2025-12-06 10:29:01.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 05:29:01 localhost nova_compute[237281]: 2025-12-06 10:29:01.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 05:29:02 localhost podman[265599]:
Dec 6 05:29:02 localhost podman[265599]: 2025-12-06 10:29:02.607883354 +0000 UTC m=+0.097290958 container create 041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc5b5961-46c8-445d-b480-1d7b7148ce85, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:29:02 localhost neutron_sriov_agent[212548]: 2025-12-06 10:29:02.632 2 INFO neutron.agent.securitygroups_rpc [None req-a99156da-a831-49f2-84b4-c619893ca416 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['1bb68f63-644a-4814-a6e4-fa4af7bea8f9']#033[00m
Dec 6 05:29:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.
Dec 6 05:29:02 localhost systemd[1]: Started libpod-conmon-041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2.scope.
Dec 6 05:29:02 localhost systemd[1]: Started libcrun container.
Dec 6 05:29:02 localhost podman[265599]: 2025-12-06 10:29:02.561077252 +0000 UTC m=+0.050484886 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 6 05:29:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/595aba37042eb23debf6f00dda1034fcdc563475c7442753de251341c1b9c487/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 6 05:29:02 localhost podman[265599]: 2025-12-06 10:29:02.675227238 +0000 UTC m=+0.164634842 container init 041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc5b5961-46c8-445d-b480-1d7b7148ce85, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 6 05:29:02 localhost podman[265599]: 2025-12-06 10:29:02.684838893 +0000 UTC m=+0.174246507 container start 041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc5b5961-46c8-445d-b480-1d7b7148ce85, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 6 05:29:02 localhost dnsmasq[265629]: started, version 2.85 cachesize 150
Dec 6 05:29:02 localhost dnsmasq[265629]: DNS service limited to local subnets
Dec 6 05:29:02 localhost dnsmasq[265629]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 6 05:29:02 localhost dnsmasq[265629]: warning: no upstream servers configured
Dec 6 05:29:02 localhost dnsmasq-dhcp[265629]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 6 05:29:02 localhost dnsmasq[265629]: read /var/lib/neutron/dhcp/bc5b5961-46c8-445d-b480-1d7b7148ce85/addn_hosts - 0 addresses
Dec 6 05:29:02 localhost dnsmasq-dhcp[265629]: read /var/lib/neutron/dhcp/bc5b5961-46c8-445d-b480-1d7b7148ce85/host
Dec 6 05:29:02 localhost dnsmasq-dhcp[265629]: read /var/lib/neutron/dhcp/bc5b5961-46c8-445d-b480-1d7b7148ce85/opts
Dec 6 05:29:02 localhost podman[265614]: 2025-12-06 10:29:02.746833682 +0000 UTC m=+0.101394463 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors )
Dec 6 05:29:02 localhost podman[265614]: 2025-12-06 10:29:02.754418986 +0000 UTC m=+0.108979767 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 6 05:29:02 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:02.767 219384 INFO neutron.agent.dhcp.agent [None req-4f3ef52d-9c75-403d-b449-aa9575f72921 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:29:01Z, description=, device_id=fd356935-a909-436d-af64-c7d0101c1d37, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=81e31da0-64d4-4c52-8eac-f2d271d9054a, ip_allocation=immediate, mac_address=fa:16:3e:c7:59:ed, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:28:57Z, description=, dns_domain=, id=bc5b5961-46c8-445d-b480-1d7b7148ce85, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1972217053, port_security_enabled=True, project_id=9333e497c33f48709838789e8bdc913e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40241, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2589, status=ACTIVE, subnets=['ac2c0418-22b6-48f2-a241-b089a04b3754'], tags=[], tenant_id=9333e497c33f48709838789e8bdc913e, updated_at=2025-12-06T10:29:00Z, vlan_transparent=None, network_id=bc5b5961-46c8-445d-b480-1d7b7148ce85, port_security_enabled=False, project_id=9333e497c33f48709838789e8bdc913e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2609, status=DOWN, tags=[], tenant_id=9333e497c33f48709838789e8bdc913e, updated_at=2025-12-06T10:29:01Z on network bc5b5961-46c8-445d-b480-1d7b7148ce85#033[00m
Dec 6 05:29:02 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully.
Dec 6 05:29:02 localhost nova_compute[237281]: 2025-12-06 10:29:02.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 6 05:29:02 localhost nova_compute[237281]: 2025-12-06 10:29:02.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 6 05:29:02 localhost nova_compute[237281]: 2025-12-06 10:29:02.888 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 6 05:29:02 localhost dnsmasq[265629]: read /var/lib/neutron/dhcp/bc5b5961-46c8-445d-b480-1d7b7148ce85/addn_hosts - 1 addresses
Dec 6 05:29:02 localhost dnsmasq-dhcp[265629]: read /var/lib/neutron/dhcp/bc5b5961-46c8-445d-b480-1d7b7148ce85/host
Dec 6 05:29:02 localhost dnsmasq-dhcp[265629]: read /var/lib/neutron/dhcp/bc5b5961-46c8-445d-b480-1d7b7148ce85/opts
Dec 6 05:29:02 localhost podman[265659]: 2025-12-06 10:29:02.994327906 +0000 UTC m=+0.071907676 container kill 041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc5b5961-46c8-445d-b480-1d7b7148ce85, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:29:03 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:03.066 219384 INFO neutron.agent.dhcp.agent [None req-829cab8d-78c8-446f-bffc-cc37ec03e078 - - - - - -] DHCP configuration for ports {'260b855c-c9e0-4e4f-bfb5-c24c54d5fab4'} is completed#033[00m
Dec 6 05:29:03 localhost nova_compute[237281]: 2025-12-06 10:29:03.335 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 6 05:29:03 localhost nova_compute[237281]: 2025-12-06 10:29:03.336 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 6 05:29:03 localhost nova_compute[237281]: 2025-12-06 10:29:03.337 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 6 05:29:03 localhost nova_compute[237281]: 2025-12-06 10:29:03.337 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 6 05:29:03 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:03.343 219384 INFO neutron.agent.dhcp.agent [None req-667f65de-6e6b-496d-b753-0b6bc5553944 - - - - - -] DHCP configuration for ports {'81e31da0-64d4-4c52-8eac-f2d271d9054a'} is completed#033[00m
Dec 6 05:29:03 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:03.436 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:29:01Z, description=, device_id=fd356935-a909-436d-af64-c7d0101c1d37, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=81e31da0-64d4-4c52-8eac-f2d271d9054a, ip_allocation=immediate, mac_address=fa:16:3e:c7:59:ed, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:28:57Z, description=, dns_domain=, id=bc5b5961-46c8-445d-b480-1d7b7148ce85, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1972217053, port_security_enabled=True, project_id=9333e497c33f48709838789e8bdc913e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40241, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2589, status=ACTIVE, subnets=['ac2c0418-22b6-48f2-a241-b089a04b3754'], tags=[], tenant_id=9333e497c33f48709838789e8bdc913e, updated_at=2025-12-06T10:29:00Z, vlan_transparent=None, network_id=bc5b5961-46c8-445d-b480-1d7b7148ce85, port_security_enabled=False, project_id=9333e497c33f48709838789e8bdc913e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2609, status=DOWN, tags=[], tenant_id=9333e497c33f48709838789e8bdc913e, updated_at=2025-12-06T10:29:01Z on network bc5b5961-46c8-445d-b480-1d7b7148ce85#033[00m
Dec 6 05:29:03 localhost dnsmasq[265629]: read /var/lib/neutron/dhcp/bc5b5961-46c8-445d-b480-1d7b7148ce85/addn_hosts - 1 addresses
Dec 6 05:29:03 localhost dnsmasq-dhcp[265629]: read /var/lib/neutron/dhcp/bc5b5961-46c8-445d-b480-1d7b7148ce85/host
Dec 6 05:29:03 localhost dnsmasq-dhcp[265629]: read /var/lib/neutron/dhcp/bc5b5961-46c8-445d-b480-1d7b7148ce85/opts
Dec 6 05:29:03 localhost podman[265699]: 2025-12-06 10:29:03.650976779 +0000 UTC m=+0.065760725 container kill 041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc5b5961-46c8-445d-b480-1d7b7148ce85, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:29:03 localhost neutron_sriov_agent[212548]: 2025-12-06 10:29:03.702 2 INFO neutron.agent.securitygroups_rpc [None req-7ee9b6a2-b3c2-4056-9151-5c0b3406563c 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated ['1bb68f63-644a-4814-a6e4-fa4af7bea8f9']#033[00m
Dec 6 05:29:03 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:03.974 219384 INFO neutron.agent.dhcp.agent [None req-5b7f94f4-5742-41ab-917c-136cbff42b5d - - - - - -] DHCP configuration for ports {'81e31da0-64d4-4c52-8eac-f2d271d9054a'} is completed#033[00m
Dec 6 05:29:04 localhost nova_compute[237281]: 2025-12-06 10:29:04.417 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:29:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:06.713 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 6 05:29:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:06.713 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 6 05:29:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:06.714 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 6 05:29:06 localhost nova_compute[237281]: 2025-12-06 10:29:06.902 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:29:06 localhost nova_compute[237281]: 2025-12-06 10:29:06.932 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 6 05:29:07 localhost neutron_sriov_agent[212548]: 2025-12-06 10:29:07.406 2 INFO neutron.agent.securitygroups_rpc [None req-d69388ed-7f40-47f4-9a05-3065dd1bd087 934be94cb47043bdb7a9860f5b19d59e 1908a08bbbde496faaf3e03c38a8f437 - - default default] Security group rule updated
['576506ea-4210-415b-9a2d-513bfe8cf0c4']#033[00m Dec 6 05:29:07 localhost nova_compute[237281]: 2025-12-06 10:29:07.568 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:29:07 localhost dnsmasq[265629]: read /var/lib/neutron/dhcp/bc5b5961-46c8-445d-b480-1d7b7148ce85/addn_hosts - 0 addresses Dec 6 05:29:07 localhost dnsmasq-dhcp[265629]: read /var/lib/neutron/dhcp/bc5b5961-46c8-445d-b480-1d7b7148ce85/host Dec 6 05:29:07 localhost podman[265738]: 2025-12-06 10:29:07.57091817 +0000 UTC m=+0.064168138 container kill 041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc5b5961-46c8-445d-b480-1d7b7148ce85, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:29:07 localhost dnsmasq-dhcp[265629]: read /var/lib/neutron/dhcp/bc5b5961-46c8-445d-b480-1d7b7148ce85/opts Dec 6 05:29:07 localhost nova_compute[237281]: 2025-12-06 10:29:07.590 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:29:07 localhost nova_compute[237281]: 2025-12-06 10:29:07.591 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:29:07 localhost kernel: device tapd04fddf3-50 left promiscuous mode Dec 6 05:29:07 localhost nova_compute[237281]: 2025-12-06 10:29:07.794 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:07 localhost ovn_controller[131684]: 2025-12-06T10:29:07Z|00482|binding|INFO|Releasing lport d04fddf3-500a-4566-a6b3-9ea8a88c0fd2 from this chassis (sb_readonly=0) Dec 6 05:29:07 localhost ovn_controller[131684]: 2025-12-06T10:29:07Z|00483|binding|INFO|Setting lport d04fddf3-500a-4566-a6b3-9ea8a88c0fd2 down in Southbound Dec 6 05:29:07 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:07.808 137259 DEBUG 
ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-bc5b5961-46c8-445d-b480-1d7b7148ce85', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc5b5961-46c8-445d-b480-1d7b7148ce85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9333e497c33f48709838789e8bdc913e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5141b29f-e554-4e6b-b87e-1b0924fe6749, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d04fddf3-500a-4566-a6b3-9ea8a88c0fd2) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:29:07 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:07.812 137259 INFO neutron.agent.ovn.metadata.agent [-] Port d04fddf3-500a-4566-a6b3-9ea8a88c0fd2 in datapath bc5b5961-46c8-445d-b480-1d7b7148ce85 unbound from our chassis#033[00m Dec 6 05:29:07 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:07.814 137259 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network bc5b5961-46c8-445d-b480-1d7b7148ce85 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 6 05:29:07 localhost nova_compute[237281]: 2025-12-06 10:29:07.814 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:07 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:07.816 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[19a341ec-1859-42ea-83c9-a9e7158d6074]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:29:07 localhost nova_compute[237281]: 2025-12-06 10:29:07.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:29:07 localhost nova_compute[237281]: 2025-12-06 10:29:07.913 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:29:07 localhost nova_compute[237281]: 2025-12-06 10:29:07.914 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:29:07 localhost nova_compute[237281]: 2025-12-06 10:29:07.914 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m 
Dec 6 05:29:07 localhost nova_compute[237281]: 2025-12-06 10:29:07.915 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:29:07 localhost nova_compute[237281]: 2025-12-06 10:29:07.981 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.058 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.060 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.141 237285 DEBUG oslo_concurrency.processutils [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.142 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.217 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.218 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.295 237285 DEBUG oslo_concurrency.processutils [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.520 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.522 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12262MB free_disk=387.2635688781738GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.523 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.523 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.604 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and 
has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.605 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.606 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.671 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.697 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider 
/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.699 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:29:08 localhost nova_compute[237281]: 2025-12-06 10:29:08.700 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:29:09 localhost nova_compute[237281]: 2025-12-06 10:29:09.419 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45995 DF PROTO=TCP SPT=51974 DPT=9102 SEQ=3472360902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE393870000000001030307) Dec 6 05:29:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:29:11 localhost systemd[1]: tmp-crun.WoW1sO.mount: Deactivated successfully. 
Dec 6 05:29:11 localhost podman[265774]: 2025-12-06 10:29:11.572833645 +0000 UTC m=+0.100315201 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:29:11 localhost podman[265774]: 2025-12-06 10:29:11.620442791 +0000 UTC m=+0.147924337 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller) Dec 6 05:29:11 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:29:11 localhost nova_compute[237281]: 2025-12-06 10:29:11.938 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:12 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:12.409 219384 INFO neutron.agent.linux.ip_lib [None req-60d03f32-93bf-4968-b805-a4f8772dc107 - - - - - -] Device tap8d9614af-4a cannot be used as it has no MAC address#033[00m Dec 6 05:29:12 localhost nova_compute[237281]: 2025-12-06 10:29:12.442 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:12 localhost kernel: device tap8d9614af-4a entered promiscuous mode Dec 6 05:29:12 localhost NetworkManager[5965]: [1765016952.4505] manager: (tap8d9614af-4a): new Generic device (/org/freedesktop/NetworkManager/Devices/79) Dec 6 05:29:12 localhost ovn_controller[131684]: 2025-12-06T10:29:12Z|00484|binding|INFO|Claiming lport 8d9614af-4abc-4727-8f7b-29d90ea6ec4f for this chassis. Dec 6 05:29:12 localhost ovn_controller[131684]: 2025-12-06T10:29:12Z|00485|binding|INFO|8d9614af-4abc-4727-8f7b-29d90ea6ec4f: Claiming unknown Dec 6 05:29:12 localhost nova_compute[237281]: 2025-12-06 10:29:12.454 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:12 localhost systemd-udevd[265809]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:29:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:12.470 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.102.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-76a07b73-5c92-4b0e-a73b-8e4b21a3db23', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76a07b73-5c92-4b0e-a73b-8e4b21a3db23', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d62107f03194685a0d4a3a8f59ce292', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f658b915-3e95-46b8-9df8-ba54f7f43399, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8d9614af-4abc-4727-8f7b-29d90ea6ec4f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 6 05:29:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:12.473 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 8d9614af-4abc-4727-8f7b-29d90ea6ec4f in datapath 76a07b73-5c92-4b0e-a73b-8e4b21a3db23 bound to our chassis
Dec 6 05:29:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:12.476 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port ce4ef553-9652-4d1f-b3fe-162dab1021d9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 6 05:29:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:12.477 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76a07b73-5c92-4b0e-a73b-8e4b21a3db23, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 6 05:29:12 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:12.479 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[966767ec-0b60-4649-ae97-7ebc43267ef9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 6 05:29:12 localhost journal[186952]: ethtool ioctl error on tap8d9614af-4a: No such device
Dec 6 05:29:12 localhost journal[186952]: ethtool ioctl error on tap8d9614af-4a: No such device
Dec 6 05:29:12 localhost ovn_controller[131684]: 2025-12-06T10:29:12Z|00486|binding|INFO|Setting lport 8d9614af-4abc-4727-8f7b-29d90ea6ec4f ovn-installed in OVS
Dec 6 05:29:12 localhost ovn_controller[131684]: 2025-12-06T10:29:12Z|00487|binding|INFO|Setting lport 8d9614af-4abc-4727-8f7b-29d90ea6ec4f up in Southbound
Dec 6 05:29:12 localhost nova_compute[237281]: 2025-12-06 10:29:12.498 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:29:12 localhost journal[186952]: ethtool ioctl error on tap8d9614af-4a: No such device
Dec 6 05:29:12 localhost journal[186952]: ethtool ioctl error on tap8d9614af-4a: No such device
Dec 6 05:29:12 localhost journal[186952]: ethtool ioctl error on tap8d9614af-4a: No such device
Dec 6 05:29:12 localhost journal[186952]: ethtool ioctl error on tap8d9614af-4a: No such device
Dec 6 05:29:12 localhost journal[186952]: ethtool ioctl error on tap8d9614af-4a: No such device
Dec 6 05:29:12 localhost journal[186952]: ethtool ioctl error on tap8d9614af-4a: No such device
Dec 6 05:29:12 localhost nova_compute[237281]: 2025-12-06 10:29:12.551 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:29:12 localhost nova_compute[237281]: 2025-12-06 10:29:12.586 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:29:13 localhost podman[265880]:
Dec 6 05:29:13 localhost podman[265880]: 2025-12-06 10:29:13.502997782 +0000 UTC m=+0.085501935 container create d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a07b73-5c92-4b0e-a73b-8e4b21a3db23, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:29:13 localhost systemd[1]: Started libpod-conmon-d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424.scope.
Dec 6 05:29:13 localhost podman[265880]: 2025-12-06 10:29:13.455145178 +0000 UTC m=+0.037649421 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 6 05:29:13 localhost systemd[1]: Started libcrun container.
Dec 6 05:29:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75e1d325767c2f2561bac321c9deb40a674c430e509a686c751a75fc6d0d356e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 6 05:29:13 localhost podman[265880]: 2025-12-06 10:29:13.579709374 +0000 UTC m=+0.162213527 container init d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a07b73-5c92-4b0e-a73b-8e4b21a3db23, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 6 05:29:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.
Dec 6 05:29:13 localhost podman[265880]: 2025-12-06 10:29:13.589651681 +0000 UTC m=+0.172155834 container start d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a07b73-5c92-4b0e-a73b-8e4b21a3db23, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 6 05:29:13 localhost dnsmasq[265899]: started, version 2.85 cachesize 150
Dec 6 05:29:13 localhost dnsmasq[265899]: DNS service limited to local subnets
Dec 6 05:29:13 localhost dnsmasq[265899]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 6 05:29:13 localhost dnsmasq[265899]: warning: no upstream servers configured
Dec 6 05:29:13 localhost dnsmasq-dhcp[265899]: DHCP, static leases only on 10.102.0.0, lease time 1d
Dec 6 05:29:13 localhost dnsmasq[265899]: read /var/lib/neutron/dhcp/76a07b73-5c92-4b0e-a73b-8e4b21a3db23/addn_hosts - 0 addresses
Dec 6 05:29:13 localhost dnsmasq-dhcp[265899]: read /var/lib/neutron/dhcp/76a07b73-5c92-4b0e-a73b-8e4b21a3db23/host
Dec 6 05:29:13 localhost dnsmasq-dhcp[265899]: read /var/lib/neutron/dhcp/76a07b73-5c92-4b0e-a73b-8e4b21a3db23/opts
Dec 6 05:29:13 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:13.649 219384 INFO neutron.agent.dhcp.agent [None req-4e83f891-2c43-4907-ac93-7ab00a98b5a7 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:29:12Z, description=, device_id=479e12eb-4b2d-43e0-917d-286d183c2043, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d6b9fa33-c430-4a21-b650-47dac39f0bdd, ip_allocation=immediate, mac_address=fa:16:3e:e2:76:d0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:29:07Z, description=, dns_domain=, id=76a07b73-5c92-4b0e-a73b-8e4b21a3db23, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-820848088, port_security_enabled=True, project_id=1d62107f03194685a0d4a3a8f59ce292, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61957, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2621, status=ACTIVE, subnets=['28beed75-9d1a-49aa-b53e-3392dbd11d51'], tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:29:10Z, vlan_transparent=None, network_id=76a07b73-5c92-4b0e-a73b-8e4b21a3db23, port_security_enabled=False, project_id=1d62107f03194685a0d4a3a8f59ce292, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2638, status=DOWN, tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:29:12Z on network 76a07b73-5c92-4b0e-a73b-8e4b21a3db23
Dec 6 05:29:13 localhost podman[265900]: 2025-12-06 10:29:13.682126209 +0000 UTC m=+0.080732788 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Dec 6 05:29:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.
Dec 6 05:29:13 localhost podman[265900]: 2025-12-06 10:29:13.72046298 +0000 UTC m=+0.119069479 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 6 05:29:13 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully.
Dec 6 05:29:13 localhost podman[265924]: 2025-12-06 10:29:13.78153148 +0000 UTC m=+0.069579193 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 6 05:29:13 localhost podman[265924]: 2025-12-06 10:29:13.793611873 +0000 UTC m=+0.081659616 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 6 05:29:13 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully.
Dec 6 05:29:13 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:13.819 219384 INFO neutron.agent.dhcp.agent [None req-5cf49cc4-3ab7-4c79-9474-c1528ae14ac1 - - - - - -] DHCP configuration for ports {'63fdd71a-e5d8-4930-876c-3dc9bd9b48d8'} is completed
Dec 6 05:29:13 localhost podman[265958]: 2025-12-06 10:29:13.88151813 +0000 UTC m=+0.039143277 container kill d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a07b73-5c92-4b0e-a73b-8e4b21a3db23, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 6 05:29:13 localhost dnsmasq[265899]: read /var/lib/neutron/dhcp/76a07b73-5c92-4b0e-a73b-8e4b21a3db23/addn_hosts - 1 addresses
Dec 6 05:29:13 localhost dnsmasq-dhcp[265899]: read /var/lib/neutron/dhcp/76a07b73-5c92-4b0e-a73b-8e4b21a3db23/host
Dec 6 05:29:13 localhost dnsmasq-dhcp[265899]: read /var/lib/neutron/dhcp/76a07b73-5c92-4b0e-a73b-8e4b21a3db23/opts
Dec 6 05:29:14 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:14.101 219384 INFO neutron.agent.dhcp.agent [None req-9eada0e1-2a30-4958-8593-ef1498e34a25 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:29:12Z, description=, device_id=479e12eb-4b2d-43e0-917d-286d183c2043, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d6b9fa33-c430-4a21-b650-47dac39f0bdd, ip_allocation=immediate, mac_address=fa:16:3e:e2:76:d0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:29:07Z, description=, dns_domain=, id=76a07b73-5c92-4b0e-a73b-8e4b21a3db23, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-820848088, port_security_enabled=True, project_id=1d62107f03194685a0d4a3a8f59ce292, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61957, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2621, status=ACTIVE, subnets=['28beed75-9d1a-49aa-b53e-3392dbd11d51'], tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:29:10Z, vlan_transparent=None, network_id=76a07b73-5c92-4b0e-a73b-8e4b21a3db23, port_security_enabled=False, project_id=1d62107f03194685a0d4a3a8f59ce292, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2638, status=DOWN, tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:29:12Z on network 76a07b73-5c92-4b0e-a73b-8e4b21a3db23
Dec 6 05:29:14 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:14.202 219384 INFO neutron.agent.dhcp.agent [None req-1ede9ad7-ef84-4f69-b410-e662638b34d6 - - - - - -] DHCP configuration for ports {'d6b9fa33-c430-4a21-b650-47dac39f0bdd'} is completed
Dec 6 05:29:14 localhost dnsmasq[265899]: read /var/lib/neutron/dhcp/76a07b73-5c92-4b0e-a73b-8e4b21a3db23/addn_hosts - 1 addresses
Dec 6 05:29:14 localhost dnsmasq-dhcp[265899]: read /var/lib/neutron/dhcp/76a07b73-5c92-4b0e-a73b-8e4b21a3db23/host
Dec 6 05:29:14 localhost podman[265995]: 2025-12-06 10:29:14.350898617 +0000 UTC m=+0.058602646 container kill d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a07b73-5c92-4b0e-a73b-8e4b21a3db23, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 6 05:29:14 localhost dnsmasq-dhcp[265899]: read /var/lib/neutron/dhcp/76a07b73-5c92-4b0e-a73b-8e4b21a3db23/opts
Dec 6 05:29:14 localhost nova_compute[237281]: 2025-12-06 10:29:14.469 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:29:14 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:14.823 219384 INFO neutron.agent.dhcp.agent [None req-308915fe-0b49-4378-a524-1fea48796e11 - - - - - -] DHCP configuration for ports {'d6b9fa33-c430-4a21-b650-47dac39f0bdd'} is completed
Dec 6 05:29:16 localhost openstack_network_exporter[199751]: ERROR 10:29:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 6 05:29:16 localhost openstack_network_exporter[199751]: ERROR 10:29:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 6 05:29:16 localhost openstack_network_exporter[199751]: ERROR 10:29:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 6 05:29:16 localhost openstack_network_exporter[199751]: ERROR 10:29:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 6 05:29:16 localhost openstack_network_exporter[199751]:
Dec 6 05:29:16 localhost openstack_network_exporter[199751]: ERROR 10:29:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 6 05:29:16 localhost openstack_network_exporter[199751]:
Dec 6 05:29:16 localhost nova_compute[237281]: 2025-12-06 10:29:16.390 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:29:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:16.392 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 6 05:29:16 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:16.393 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 6 05:29:16 localhost nova_compute[237281]: 2025-12-06 10:29:16.941 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.
Dec 6 05:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.
Dec 6 05:29:18 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:18.920 219384 INFO neutron.agent.linux.ip_lib [None req-59d8f73f-2060-4f5d-9ba3-a9950b79c37d - - - - - -] Device tap71912c42-78 cannot be used as it has no MAC address
Dec 6 05:29:18 localhost podman[266019]: 2025-12-06 10:29:18.936142558 +0000 UTC m=+0.090143317 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 6 05:29:18 localhost podman[266019]: 2025-12-06 10:29:18.988700897 +0000 UTC m=+0.142701566 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 6 05:29:18 localhost nova_compute[237281]: 2025-12-06 10:29:18.987 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:29:18 localhost kernel: device tap71912c42-78 entered promiscuous mode
Dec 6 05:29:18 localhost NetworkManager[5965]: [1765016958.9972] manager: (tap71912c42-78): new Generic device (/org/freedesktop/NetworkManager/Devices/80)
Dec 6 05:29:18 localhost ovn_controller[131684]: 2025-12-06T10:29:18Z|00488|binding|INFO|Claiming lport 71912c42-78fd-4710-afd3-fa35b6dd6afd for this chassis.
Dec 6 05:29:18 localhost ovn_controller[131684]: 2025-12-06T10:29:18Z|00489|binding|INFO|71912c42-78fd-4710-afd3-fa35b6dd6afd: Claiming unknown
Dec 6 05:29:18 localhost nova_compute[237281]: 2025-12-06 10:29:18.996 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:29:19 localhost systemd-udevd[266054]: Network interface NamePolicy= disabled on kernel command line.
Dec 6 05:29:19 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully.
Dec 6 05:29:19 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:19.012 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-b82a743d-18ca-41d0-8df3-ff162e1c936c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b82a743d-18ca-41d0-8df3-ff162e1c936c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d62107f03194685a0d4a3a8f59ce292', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff24fe1a-a11e-4610-9bae-fa72f51f9466, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=71912c42-78fd-4710-afd3-fa35b6dd6afd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 6 05:29:19 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:19.014 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 71912c42-78fd-4710-afd3-fa35b6dd6afd in datapath b82a743d-18ca-41d0-8df3-ff162e1c936c bound to our chassis
Dec 6 05:29:19 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:19.017 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port d5fffa07-4314-4073-ad65-0755c4043490 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 6 05:29:19 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:19.017 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b82a743d-18ca-41d0-8df3-ff162e1c936c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 6 05:29:19 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:19.019 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[375f32f1-5646-45ed-a2fd-07d0544e84c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 6 05:29:19 localhost journal[186952]: ethtool ioctl error on tap71912c42-78: No such device
Dec 6 05:29:19 localhost journal[186952]: ethtool ioctl error on tap71912c42-78: No such device
Dec 6 05:29:19 localhost journal[186952]: ethtool ioctl error on tap71912c42-78: No such device
Dec 6 05:29:19 localhost ovn_controller[131684]: 2025-12-06T10:29:19Z|00490|binding|INFO|Setting lport 71912c42-78fd-4710-afd3-fa35b6dd6afd ovn-installed in OVS
Dec 6 05:29:19 localhost ovn_controller[131684]: 2025-12-06T10:29:19Z|00491|binding|INFO|Setting lport 71912c42-78fd-4710-afd3-fa35b6dd6afd up in Southbound
Dec 6 05:29:19 localhost journal[186952]: ethtool ioctl error on tap71912c42-78: No such device
Dec 6 05:29:19 localhost nova_compute[237281]: 2025-12-06 10:29:19.038 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:29:19 localhost journal[186952]: ethtool ioctl error on tap71912c42-78: No such device
Dec 6 05:29:19 localhost journal[186952]: ethtool ioctl error on tap71912c42-78: No such device
Dec 6 05:29:19 localhost journal[186952]: ethtool ioctl error on tap71912c42-78: No such device
Dec 6 05:29:19 localhost journal[186952]: ethtool ioctl error on tap71912c42-78: No such device
Dec 6 05:29:19 localhost systemd[1]: tmp-crun.QxQXiP.mount: Deactivated successfully.
Dec 6 05:29:19 localhost nova_compute[237281]: 2025-12-06 10:29:19.085 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:29:19 localhost podman[266018]: 2025-12-06 10:29:19.09043192 +0000 UTC m=+0.247815143 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 6 05:29:19 localhost podman[266018]: 2025-12-06 10:29:19.122130867 +0000 UTC m=+0.279514060 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Dec 6 05:29:19 localhost nova_compute[237281]: 2025-12-06 10:29:19.122 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:29:19 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully.
Dec 6 05:29:19 localhost nova_compute[237281]: 2025-12-06 10:29:19.470 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:20 localhost podman[266134]: Dec 6 05:29:20 localhost podman[266134]: 2025-12-06 10:29:20.089125189 +0000 UTC m=+0.096335828 container create 26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b82a743d-18ca-41d0-8df3-ff162e1c936c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 05:29:20 localhost systemd[1]: Started libpod-conmon-26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76.scope. Dec 6 05:29:20 localhost podman[266134]: 2025-12-06 10:29:20.045405462 +0000 UTC m=+0.052616121 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:29:20 localhost systemd[1]: Started libcrun container. 
Dec 6 05:29:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a4f98e3f7ec1dfa28b407e91b3ab4bc8b47d4e3033e03af77da0523baf5ae30/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:29:20 localhost podman[266134]: 2025-12-06 10:29:20.161731075 +0000 UTC m=+0.168941714 container init 26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b82a743d-18ca-41d0-8df3-ff162e1c936c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 6 05:29:20 localhost podman[266134]: 2025-12-06 10:29:20.171995382 +0000 UTC m=+0.179206011 container start 26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b82a743d-18ca-41d0-8df3-ff162e1c936c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true) Dec 6 05:29:20 localhost dnsmasq[266154]: started, version 2.85 cachesize 150 Dec 6 05:29:20 localhost dnsmasq[266154]: DNS service limited to local subnets Dec 6 05:29:20 localhost dnsmasq[266154]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:29:20 localhost dnsmasq[266154]: warning: no upstream servers configured Dec 
6 05:29:20 localhost dnsmasq-dhcp[266154]: DHCP, static leases only on 10.103.0.0, lease time 1d Dec 6 05:29:20 localhost dnsmasq[266154]: read /var/lib/neutron/dhcp/b82a743d-18ca-41d0-8df3-ff162e1c936c/addn_hosts - 0 addresses Dec 6 05:29:20 localhost dnsmasq-dhcp[266154]: read /var/lib/neutron/dhcp/b82a743d-18ca-41d0-8df3-ff162e1c936c/host Dec 6 05:29:20 localhost dnsmasq-dhcp[266154]: read /var/lib/neutron/dhcp/b82a743d-18ca-41d0-8df3-ff162e1c936c/opts Dec 6 05:29:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:20.226 219384 INFO neutron.agent.dhcp.agent [None req-5933e07a-aaec-445c-830f-7f2950af3f25 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:29:19Z, description=, device_id=479e12eb-4b2d-43e0-917d-286d183c2043, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=3e700aa9-959b-40a0-86f2-08c964e7c915, ip_allocation=immediate, mac_address=fa:16:3e:24:54:f5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:29:14Z, description=, dns_domain=, id=b82a743d-18ca-41d0-8df3-ff162e1c936c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-2084155987, port_security_enabled=True, project_id=1d62107f03194685a0d4a3a8f59ce292, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62740, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2640, status=ACTIVE, subnets=['60884db6-e4d3-4e05-a629-3be7918ffc73'], tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:29:16Z, vlan_transparent=None, network_id=b82a743d-18ca-41d0-8df3-ff162e1c936c, port_security_enabled=False, 
project_id=1d62107f03194685a0d4a3a8f59ce292, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2646, status=DOWN, tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:29:19Z on network b82a743d-18ca-41d0-8df3-ff162e1c936c#033[00m Dec 6 05:29:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:20.302 219384 INFO neutron.agent.dhcp.agent [None req-ccde71f6-34eb-4ee7-9084-791098c9caad - - - - - -] DHCP configuration for ports {'a888093a-124e-40cf-8244-69f9fc61f87f'} is completed#033[00m Dec 6 05:29:20 localhost dnsmasq[266154]: read /var/lib/neutron/dhcp/b82a743d-18ca-41d0-8df3-ff162e1c936c/addn_hosts - 1 addresses Dec 6 05:29:20 localhost dnsmasq-dhcp[266154]: read /var/lib/neutron/dhcp/b82a743d-18ca-41d0-8df3-ff162e1c936c/host Dec 6 05:29:20 localhost dnsmasq-dhcp[266154]: read /var/lib/neutron/dhcp/b82a743d-18ca-41d0-8df3-ff162e1c936c/opts Dec 6 05:29:20 localhost podman[266172]: 2025-12-06 10:29:20.452959765 +0000 UTC m=+0.061896537 container kill 26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b82a743d-18ca-41d0-8df3-ff162e1c936c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:29:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:20.770 219384 INFO neutron.agent.dhcp.agent [None req-bce1d064-2bdd-4573-87e5-1bb4dd6dab12 - - - - - -] DHCP configuration for ports {'3e700aa9-959b-40a0-86f2-08c964e7c915'} is completed#033[00m Dec 6 05:29:20 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:20.904 219384 INFO neutron.agent.dhcp.agent 
[-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:29:19Z, description=, device_id=479e12eb-4b2d-43e0-917d-286d183c2043, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=3e700aa9-959b-40a0-86f2-08c964e7c915, ip_allocation=immediate, mac_address=fa:16:3e:24:54:f5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:29:14Z, description=, dns_domain=, id=b82a743d-18ca-41d0-8df3-ff162e1c936c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-2084155987, port_security_enabled=True, project_id=1d62107f03194685a0d4a3a8f59ce292, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62740, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2640, status=ACTIVE, subnets=['60884db6-e4d3-4e05-a629-3be7918ffc73'], tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:29:16Z, vlan_transparent=None, network_id=b82a743d-18ca-41d0-8df3-ff162e1c936c, port_security_enabled=False, project_id=1d62107f03194685a0d4a3a8f59ce292, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2646, status=DOWN, tags=[], tenant_id=1d62107f03194685a0d4a3a8f59ce292, updated_at=2025-12-06T10:29:19Z on network b82a743d-18ca-41d0-8df3-ff162e1c936c#033[00m Dec 6 05:29:21 localhost dnsmasq[266154]: read /var/lib/neutron/dhcp/b82a743d-18ca-41d0-8df3-ff162e1c936c/addn_hosts - 1 addresses Dec 6 05:29:21 localhost dnsmasq-dhcp[266154]: read /var/lib/neutron/dhcp/b82a743d-18ca-41d0-8df3-ff162e1c936c/host Dec 6 05:29:21 localhost dnsmasq-dhcp[266154]: read 
/var/lib/neutron/dhcp/b82a743d-18ca-41d0-8df3-ff162e1c936c/opts Dec 6 05:29:21 localhost podman[266211]: 2025-12-06 10:29:21.132008119 +0000 UTC m=+0.066461698 container kill 26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b82a743d-18ca-41d0-8df3-ff162e1c936c, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true) Dec 6 05:29:21 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:21.483 219384 INFO neutron.agent.dhcp.agent [None req-ddbd3272-482d-44d1-9704-590bbfebe0f9 - - - - - -] DHCP configuration for ports {'3e700aa9-959b-40a0-86f2-08c964e7c915'} is completed#033[00m Dec 6 05:29:21 localhost dnsmasq[265629]: exiting on receipt of SIGTERM Dec 6 05:29:21 localhost podman[266248]: 2025-12-06 10:29:21.549020863 +0000 UTC m=+0.067982605 container kill 041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc5b5961-46c8-445d-b480-1d7b7148ce85, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:29:21 localhost systemd[1]: libpod-041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2.scope: Deactivated successfully. 
Dec 6 05:29:21 localhost podman[266263]: 2025-12-06 10:29:21.639020545 +0000 UTC m=+0.064495928 container died 041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc5b5961-46c8-445d-b480-1d7b7148ce85, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:29:21 localhost systemd[1]: tmp-crun.zeqSve.mount: Deactivated successfully. Dec 6 05:29:21 localhost podman[266263]: 2025-12-06 10:29:21.699443476 +0000 UTC m=+0.124918849 container remove 041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc5b5961-46c8-445d-b480-1d7b7148ce85, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Dec 6 05:29:21 localhost systemd[1]: libpod-conmon-041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2.scope: Deactivated successfully. 
Dec 6 05:29:21 localhost nova_compute[237281]: 2025-12-06 10:29:21.974 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:22 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:22.024 219384 INFO neutron.agent.dhcp.agent [None req-48d7faac-26c7-4e03-9fc8-78432b03af25 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:29:22 localhost systemd[1]: var-lib-containers-storage-overlay-595aba37042eb23debf6f00dda1034fcdc563475c7442753de251341c1b9c487-merged.mount: Deactivated successfully. Dec 6 05:29:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-041bfbb598d5350e4bdadcc2c34eede92e8d27c64e32935bf1db5f94c73c37e2-userdata-shm.mount: Deactivated successfully. Dec 6 05:29:22 localhost systemd[1]: run-netns-qdhcp\x2dbc5b5961\x2d46c8\x2d445d\x2db480\x2d1d7b7148ce85.mount: Deactivated successfully. Dec 6 05:29:22 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:22.395 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:29:23 localhost podman[197801]: time="2025-12-06T10:29:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:29:23 localhost podman[197801]: @ - - [06/Dec/2025:10:29:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149621 "" "Go-http-client/1.1" Dec 6 05:29:23 localhost podman[197801]: @ - - [06/Dec/2025:10:29:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17393 "" "Go-http-client/1.1" Dec 6 05:29:23 localhost 
neutron_dhcp_agent[219380]: 2025-12-06 10:29:23.437 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:29:23 localhost ovn_controller[131684]: 2025-12-06T10:29:23Z|00492|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:29:23 localhost nova_compute[237281]: 2025-12-06 10:29:23.942 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39586 DF PROTO=TCP SPT=43284 DPT=9102 SEQ=3546122672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE3CBAA0000000001030307) Dec 6 05:29:24 localhost nova_compute[237281]: 2025-12-06 10:29:24.513 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39587 DF PROTO=TCP SPT=43284 DPT=9102 SEQ=3546122672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE3CFC70000000001030307) Dec 6 05:29:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45996 DF PROTO=TCP SPT=51974 DPT=9102 SEQ=3472360902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE3D3880000000001030307) Dec 6 05:29:26 localhost dnsmasq[266154]: read /var/lib/neutron/dhcp/b82a743d-18ca-41d0-8df3-ff162e1c936c/addn_hosts - 0 addresses Dec 6 05:29:26 localhost dnsmasq-dhcp[266154]: read 
/var/lib/neutron/dhcp/b82a743d-18ca-41d0-8df3-ff162e1c936c/host Dec 6 05:29:26 localhost podman[266307]: 2025-12-06 10:29:26.165037002 +0000 UTC m=+0.061198376 container kill 26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b82a743d-18ca-41d0-8df3-ff162e1c936c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:29:26 localhost dnsmasq-dhcp[266154]: read /var/lib/neutron/dhcp/b82a743d-18ca-41d0-8df3-ff162e1c936c/opts Dec 6 05:29:26 localhost nova_compute[237281]: 2025-12-06 10:29:26.357 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:26 localhost ovn_controller[131684]: 2025-12-06T10:29:26Z|00493|binding|INFO|Releasing lport 71912c42-78fd-4710-afd3-fa35b6dd6afd from this chassis (sb_readonly=0) Dec 6 05:29:26 localhost kernel: device tap71912c42-78 left promiscuous mode Dec 6 05:29:26 localhost ovn_controller[131684]: 2025-12-06T10:29:26Z|00494|binding|INFO|Setting lport 71912c42-78fd-4710-afd3-fa35b6dd6afd down in Southbound Dec 6 05:29:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:26.375 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': 
'10.103.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-b82a743d-18ca-41d0-8df3-ff162e1c936c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b82a743d-18ca-41d0-8df3-ff162e1c936c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d62107f03194685a0d4a3a8f59ce292', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff24fe1a-a11e-4610-9bae-fa72f51f9466, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=71912c42-78fd-4710-afd3-fa35b6dd6afd) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:29:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:26.377 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 71912c42-78fd-4710-afd3-fa35b6dd6afd in datapath b82a743d-18ca-41d0-8df3-ff162e1c936c unbound from our chassis#033[00m Dec 6 05:29:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:26.381 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b82a743d-18ca-41d0-8df3-ff162e1c936c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:29:26 localhost nova_compute[237281]: 2025-12-06 10:29:26.382 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:26 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:26.382 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[ad1c2820-390c-4692-a51f-7df19d86e3d8]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:29:27 localhost nova_compute[237281]: 2025-12-06 10:29:27.018 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39588 DF PROTO=TCP SPT=43284 DPT=9102 SEQ=3546122672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE3D7C70000000001030307) Dec 6 05:29:27 localhost dnsmasq[266154]: exiting on receipt of SIGTERM Dec 6 05:29:27 localhost podman[266346]: 2025-12-06 10:29:27.525514763 +0000 UTC m=+0.060640838 container kill 26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b82a743d-18ca-41d0-8df3-ff162e1c936c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Dec 6 05:29:27 localhost systemd[1]: libpod-26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76.scope: Deactivated successfully. 
Dec 6 05:29:27 localhost podman[266359]: 2025-12-06 10:29:27.584716447 +0000 UTC m=+0.049451185 container died 26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b82a743d-18ca-41d0-8df3-ff162e1c936c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:29:27 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76-userdata-shm.mount: Deactivated successfully. Dec 6 05:29:27 localhost podman[266359]: 2025-12-06 10:29:27.613696269 +0000 UTC m=+0.078430947 container cleanup 26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b82a743d-18ca-41d0-8df3-ff162e1c936c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 6 05:29:27 localhost systemd[1]: libpod-conmon-26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76.scope: Deactivated successfully. 
Dec 6 05:29:27 localhost podman[266365]: 2025-12-06 10:29:27.658466868 +0000 UTC m=+0.110353359 container remove 26e842e5b172dc137335d16a0530550094dc6679c17e786e3f048c333fab8d76 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b82a743d-18ca-41d0-8df3-ff162e1c936c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 05:29:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:27.692 219384 INFO neutron.agent.dhcp.agent [None req-621d35a5-1672-446a-8609-236c5fbcb4e7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:29:27 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:27.970 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:29:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4075 DF PROTO=TCP SPT=46158 DPT=9102 SEQ=399057658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE3DB870000000001030307) Dec 6 05:29:28 localhost ovn_controller[131684]: 2025-12-06T10:29:28Z|00495|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:29:28 localhost nova_compute[237281]: 2025-12-06 10:29:28.193 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:29:28 localhost systemd[1]: var-lib-containers-storage-overlay-7a4f98e3f7ec1dfa28b407e91b3ab4bc8b47d4e3033e03af77da0523baf5ae30-merged.mount: Deactivated successfully. Dec 6 05:29:28 localhost systemd[1]: run-netns-qdhcp\x2db82a743d\x2d18ca\x2d41d0\x2d8df3\x2dff162e1c936c.mount: Deactivated successfully. Dec 6 05:29:28 localhost podman[266389]: 2025-12-06 10:29:28.551031328 +0000 UTC m=+0.082739419 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, config_id=edpm, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, 
Inc., distribution-scope=public, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container) Dec 6 05:29:28 localhost podman[266389]: 2025-12-06 10:29:28.570315022 +0000 UTC m=+0.102023133 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': 
['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.) Dec 6 05:29:28 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:29:29 localhost dnsmasq[265899]: read /var/lib/neutron/dhcp/76a07b73-5c92-4b0e-a73b-8e4b21a3db23/addn_hosts - 0 addresses Dec 6 05:29:29 localhost dnsmasq-dhcp[265899]: read /var/lib/neutron/dhcp/76a07b73-5c92-4b0e-a73b-8e4b21a3db23/host Dec 6 05:29:29 localhost podman[266425]: 2025-12-06 10:29:29.416940477 +0000 UTC m=+0.050891939 container kill d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a07b73-5c92-4b0e-a73b-8e4b21a3db23, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS) Dec 6 05:29:29 localhost dnsmasq-dhcp[265899]: read /var/lib/neutron/dhcp/76a07b73-5c92-4b0e-a73b-8e4b21a3db23/opts Dec 6 05:29:29 localhost nova_compute[237281]: 2025-12-06 10:29:29.515 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:29 localhost ovn_controller[131684]: 2025-12-06T10:29:29Z|00496|binding|INFO|Releasing lport 8d9614af-4abc-4727-8f7b-29d90ea6ec4f from this chassis (sb_readonly=0) Dec 6 05:29:29 localhost nova_compute[237281]: 2025-12-06 10:29:29.586 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:29 localhost ovn_controller[131684]: 2025-12-06T10:29:29Z|00497|binding|INFO|Setting lport 8d9614af-4abc-4727-8f7b-29d90ea6ec4f down in Southbound Dec 6 05:29:29 localhost kernel: device tap8d9614af-4a left promiscuous mode Dec 6 05:29:29 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:29.600 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: 
PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.102.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-76a07b73-5c92-4b0e-a73b-8e4b21a3db23', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76a07b73-5c92-4b0e-a73b-8e4b21a3db23', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d62107f03194685a0d4a3a8f59ce292', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f658b915-3e95-46b8-9df8-ba54f7f43399, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8d9614af-4abc-4727-8f7b-29d90ea6ec4f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:29:29 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:29.603 137259 INFO neutron.agent.ovn.metadata.agent [-] Port 8d9614af-4abc-4727-8f7b-29d90ea6ec4f in datapath 76a07b73-5c92-4b0e-a73b-8e4b21a3db23 unbound from our chassis#033[00m Dec 6 05:29:29 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:29.606 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76a07b73-5c92-4b0e-a73b-8e4b21a3db23, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:29:29 localhost ovn_metadata_agent[137254]: 
2025-12-06 10:29:29.607 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[820ddd5b-ea73-4220-b975-07a9e7f10ed4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:29:29 localhost nova_compute[237281]: 2025-12-06 10:29:29.610 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:30 localhost dnsmasq[265899]: exiting on receipt of SIGTERM Dec 6 05:29:30 localhost podman[266465]: 2025-12-06 10:29:30.40896871 +0000 UTC m=+0.059707991 container kill d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a07b73-5c92-4b0e-a73b-8e4b21a3db23, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Dec 6 05:29:30 localhost systemd[1]: libpod-d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424.scope: Deactivated successfully. 
Dec 6 05:29:30 localhost podman[266480]: 2025-12-06 10:29:30.488248172 +0000 UTC m=+0.058312868 container died d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a07b73-5c92-4b0e-a73b-8e4b21a3db23, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 6 05:29:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424-userdata-shm.mount: Deactivated successfully. Dec 6 05:29:30 localhost systemd[1]: var-lib-containers-storage-overlay-75e1d325767c2f2561bac321c9deb40a674c430e509a686c751a75fc6d0d356e-merged.mount: Deactivated successfully. Dec 6 05:29:30 localhost podman[266480]: 2025-12-06 10:29:30.548079544 +0000 UTC m=+0.118144220 container remove d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a07b73-5c92-4b0e-a73b-8e4b21a3db23, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Dec 6 05:29:30 localhost systemd[1]: libpod-conmon-d857ada2b19805f1b013e6dd81c4e34326e560d4ad11ba9b802c6bd9a6c39424.scope: Deactivated successfully. 
Dec 6 05:29:30 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:30.595 219384 INFO neutron.agent.dhcp.agent [None req-8d50a983-641c-4775-bdfb-6bd389394276 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:29:30 localhost systemd[1]: run-netns-qdhcp\x2d76a07b73\x2d5c92\x2d4b0e\x2da73b\x2d8e4b21a3db23.mount: Deactivated successfully. Dec 6 05:29:30 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:30.738 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:29:30 localhost ovn_controller[131684]: 2025-12-06T10:29:30Z|00498|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:29:30 localhost nova_compute[237281]: 2025-12-06 10:29:30.940 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39589 DF PROTO=TCP SPT=43284 DPT=9102 SEQ=3546122672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE3E7870000000001030307) Dec 6 05:29:32 localhost nova_compute[237281]: 2025-12-06 10:29:32.022 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:32 localhost sshd[266507]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:29:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:29:33 localhost podman[266509]: 2025-12-06 10:29:33.246653097 +0000 UTC m=+0.080930023 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Dec 6 05:29:33 localhost podman[266509]: 2025-12-06 10:29:33.28343319 +0000 UTC m=+0.117710186 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 05:29:33 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:29:34 localhost ovn_controller[131684]: 2025-12-06T10:29:34Z|00499|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:29:34 localhost nova_compute[237281]: 2025-12-06 10:29:34.216 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:34 localhost nova_compute[237281]: 2025-12-06 10:29:34.516 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:35 localhost dnsmasq[265439]: read /var/lib/neutron/dhcp/3898807e-38ab-4604-999c-d6177473b70f/addn_hosts - 0 addresses Dec 6 05:29:35 localhost podman[266550]: 2025-12-06 10:29:35.307888982 +0000 UTC m=+0.073205446 container kill c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3898807e-38ab-4604-999c-d6177473b70f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 6 05:29:35 localhost dnsmasq-dhcp[265439]: read /var/lib/neutron/dhcp/3898807e-38ab-4604-999c-d6177473b70f/host Dec 6 05:29:35 localhost dnsmasq-dhcp[265439]: read /var/lib/neutron/dhcp/3898807e-38ab-4604-999c-d6177473b70f/opts Dec 6 05:29:35 localhost systemd[1]: tmp-crun.RlbxC6.mount: Deactivated successfully. 
Dec 6 05:29:35 localhost ovn_controller[131684]: 2025-12-06T10:29:35Z|00500|binding|INFO|Releasing lport df76bb28-3f42-4c01-a3f9-c8ec6a089f9c from this chassis (sb_readonly=0) Dec 6 05:29:35 localhost kernel: device tapdf76bb28-3f left promiscuous mode Dec 6 05:29:35 localhost ovn_controller[131684]: 2025-12-06T10:29:35Z|00501|binding|INFO|Setting lport df76bb28-3f42-4c01-a3f9-c8ec6a089f9c down in Southbound Dec 6 05:29:35 localhost nova_compute[237281]: 2025-12-06 10:29:35.530 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:35.545 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-3898807e-38ab-4604-999c-d6177473b70f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3898807e-38ab-4604-999c-d6177473b70f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d62107f03194685a0d4a3a8f59ce292', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=813fe017-af6c-4692-aae9-aebb5bb816e6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=df76bb28-3f42-4c01-a3f9-c8ec6a089f9c) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:29:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:35.547 137259 INFO neutron.agent.ovn.metadata.agent [-] Port df76bb28-3f42-4c01-a3f9-c8ec6a089f9c in datapath 3898807e-38ab-4604-999c-d6177473b70f unbound from our chassis#033[00m Dec 6 05:29:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:35.551 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3898807e-38ab-4604-999c-d6177473b70f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:29:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:29:35.552 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[5f20c61e-8ddf-4f96-b3ec-172a48b383fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:29:35 localhost nova_compute[237281]: 2025-12-06 10:29:35.559 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:36 localhost dnsmasq[265439]: exiting on receipt of SIGTERM Dec 6 05:29:36 localhost podman[266587]: 2025-12-06 10:29:36.184434848 +0000 UTC m=+0.060869045 container kill c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3898807e-38ab-4604-999c-d6177473b70f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:29:36 localhost systemd[1]: 
libpod-c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17.scope: Deactivated successfully. Dec 6 05:29:36 localhost podman[266603]: 2025-12-06 10:29:36.257327893 +0000 UTC m=+0.053591162 container died c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3898807e-38ab-4604-999c-d6177473b70f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:29:36 localhost podman[266603]: 2025-12-06 10:29:36.288687249 +0000 UTC m=+0.084950478 container cleanup c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3898807e-38ab-4604-999c-d6177473b70f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:29:36 localhost systemd[1]: libpod-conmon-c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17.scope: Deactivated successfully. Dec 6 05:29:36 localhost systemd[1]: var-lib-containers-storage-overlay-332ab12730bb03ff051e2f43f31c45cb478d4f648adc67ab0144cea3e538e0fc-merged.mount: Deactivated successfully. Dec 6 05:29:36 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:29:36 localhost podman[266604]: 2025-12-06 10:29:36.33611723 +0000 UTC m=+0.129125479 container remove c72d2d1e4bbf7ba40429f7b0f4a4297b904d06fd1d9d6d7c1e16545c5ddf6e17 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3898807e-38ab-4604-999c-d6177473b70f, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:29:36 localhost systemd[1]: run-netns-qdhcp\x2d3898807e\x2d38ab\x2d4604\x2d999c\x2dd6177473b70f.mount: Deactivated successfully. Dec 6 05:29:36 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:36.570 219384 INFO neutron.agent.dhcp.agent [None req-1615ecab-c5ac-475f-aacf-438c4ec14f0e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:29:36 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:36.571 219384 INFO neutron.agent.dhcp.agent [None req-1615ecab-c5ac-475f-aacf-438c4ec14f0e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:29:36 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:29:36.654 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:29:36 localhost ovn_controller[131684]: 2025-12-06T10:29:36Z|00502|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:29:36 localhost nova_compute[237281]: 2025-12-06 10:29:36.885 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:37 localhost nova_compute[237281]: 2025-12-06 10:29:37.026 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39590 DF PROTO=TCP SPT=43284 DPT=9102 SEQ=3546122672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE407880000000001030307) Dec 6 05:29:39 localhost nova_compute[237281]: 2025-12-06 10:29:39.520 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:42 localhost nova_compute[237281]: 2025-12-06 10:29:42.063 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:29:42 localhost podman[266633]: 2025-12-06 10:29:42.540980732 +0000 UTC m=+0.072777482 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Dec 6 05:29:42 localhost podman[266633]: 2025-12-06 10:29:42.581013626 +0000 UTC m=+0.112810336 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:29:42 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:29:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:29:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:29:44 localhost nova_compute[237281]: 2025-12-06 10:29:44.550 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:44 localhost systemd[1]: tmp-crun.quU9il.mount: Deactivated successfully. 
Dec 6 05:29:44 localhost podman[266660]: 2025-12-06 10:29:44.563751953 +0000 UTC m=+0.093827051 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:29:44 localhost podman[266660]: 2025-12-06 10:29:44.57115013 +0000 UTC m=+0.101225228 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:29:44 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:29:44 localhost podman[266661]: 2025-12-06 10:29:44.653014841 +0000 UTC m=+0.179975443 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:29:44 localhost podman[266661]: 2025-12-06 10:29:44.667158817 +0000 UTC m=+0.194119399 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd) Dec 6 05:29:44 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:29:46 localhost openstack_network_exporter[199751]: ERROR 10:29:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:29:46 localhost openstack_network_exporter[199751]: ERROR 10:29:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:29:46 localhost openstack_network_exporter[199751]: ERROR 10:29:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:29:46 localhost openstack_network_exporter[199751]: ERROR 10:29:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:29:46 localhost openstack_network_exporter[199751]: Dec 6 05:29:46 localhost openstack_network_exporter[199751]: ERROR 10:29:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:29:46 localhost openstack_network_exporter[199751]: Dec 6 05:29:47 localhost nova_compute[237281]: 2025-12-06 10:29:47.071 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. 
Dec 6 05:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:29:49 localhost podman[266703]: 2025-12-06 10:29:49.551403907 +0000 UTC m=+0.086593308 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent) Dec 6 05:29:49 localhost nova_compute[237281]: 2025-12-06 10:29:49.553 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:49 localhost podman[266703]: 2025-12-06 10:29:49.587437977 +0000 UTC m=+0.122627358 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent) Dec 6 05:29:49 localhost podman[266704]: 2025-12-06 10:29:49.600556291 +0000 UTC m=+0.131496441 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 
'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 6 05:29:49 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:29:49 localhost podman[266704]: 2025-12-06 10:29:49.640012487 +0000 UTC m=+0.170952597 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute) Dec 6 05:29:49 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:29:52 localhost nova_compute[237281]: 2025-12-06 10:29:52.115 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:53 localhost podman[197801]: time="2025-12-06T10:29:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:29:53 localhost podman[197801]: @ - - [06/Dec/2025:10:29:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:29:53 localhost podman[197801]: @ - - [06/Dec/2025:10:29:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15972 "" "Go-http-client/1.1" Dec 6 05:29:53 localhost nova_compute[237281]: 2025-12-06 10:29:53.701 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:29:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52344 DF PROTO=TCP SPT=38628 DPT=9102 SEQ=31544402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE440DA0000000001030307) Dec 6 05:29:54 localhost nova_compute[237281]: 2025-12-06 10:29:54.556 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52345 DF PROTO=TCP SPT=38628 DPT=9102 SEQ=31544402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE444C70000000001030307) Dec 6 05:29:55 
localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39591 DF PROTO=TCP SPT=43284 DPT=9102 SEQ=3546122672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE447870000000001030307) Dec 6 05:29:55 localhost nova_compute[237281]: 2025-12-06 10:29:55.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:29:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52346 DF PROTO=TCP SPT=38628 DPT=9102 SEQ=31544402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE44CC80000000001030307) Dec 6 05:29:57 localhost nova_compute[237281]: 2025-12-06 10:29:57.159 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:57 localhost nova_compute[237281]: 2025-12-06 10:29:57.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:29:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45997 DF PROTO=TCP SPT=51974 DPT=9102 SEQ=3472360902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE451870000000001030307) Dec 6 05:29:58 localhost nova_compute[237281]: 2025-12-06 10:29:58.881 237285 DEBUG oslo_service.periodic_task [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:29:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:29:59 localhost nova_compute[237281]: 2025-12-06 10:29:59.559 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:29:59 localhost podman[266741]: 2025-12-06 10:29:59.56603519 +0000 UTC m=+0.099871557 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=edpm, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible) Dec 6 05:29:59 localhost podman[266741]: 2025-12-06 10:29:59.602639938 +0000 UTC m=+0.136476295 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, 
maintainer=Red Hat, Inc., config_id=edpm, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Dec 6 05:29:59 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:30:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52347 DF PROTO=TCP SPT=38628 DPT=9102 SEQ=31544402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE45C880000000001030307) Dec 6 05:30:01 localhost nova_compute[237281]: 2025-12-06 10:30:01.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:30:01 localhost nova_compute[237281]: 2025-12-06 10:30:01.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:30:01 localhost nova_compute[237281]: 2025-12-06 10:30:01.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:30:02 localhost nova_compute[237281]: 2025-12-06 10:30:02.208 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:02 localhost nova_compute[237281]: 2025-12-06 10:30:02.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:30:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:30:03 localhost podman[266762]: 2025-12-06 10:30:03.553856782 +0000 UTC m=+0.081638415 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 
'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 05:30:03 localhost podman[266762]: 2025-12-06 10:30:03.590580153 +0000 UTC m=+0.118361806 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 6 05:30:03 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:30:03 localhost nova_compute[237281]: 2025-12-06 10:30:03.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:30:03 localhost nova_compute[237281]: 2025-12-06 10:30:03.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:30:03 localhost nova_compute[237281]: 2025-12-06 10:30:03.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:30:04 localhost nova_compute[237281]: 2025-12-06 10:30:04.595 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:04 localhost nova_compute[237281]: 2025-12-06 10:30:04.901 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:30:04 localhost nova_compute[237281]: 2025-12-06 10:30:04.902 237285 DEBUG 
oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:30:04 localhost nova_compute[237281]: 2025-12-06 10:30:04.902 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:30:04 localhost nova_compute[237281]: 2025-12-06 10:30:04.903 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:30:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:06.714 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:30:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:06.714 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:30:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:06.716 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:30:07 localhost nova_compute[237281]: 2025-12-06 10:30:07.244 
237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.603 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.624 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.625 237285 
DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.626 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.647 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.648 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.649 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.649 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) 
update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.718 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.794 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.795 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.873 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.078s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.875 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.935 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.936 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:30:08 localhost nova_compute[237281]: 2025-12-06 10:30:08.980 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.044s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:30:09 localhost nova_compute[237281]: 2025-12-06 10:30:09.212 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:30:09 localhost nova_compute[237281]: 2025-12-06 10:30:09.214 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12252MB free_disk=387.26367950439453GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:30:09 localhost nova_compute[237281]: 2025-12-06 10:30:09.215 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:30:09 localhost nova_compute[237281]: 2025-12-06 10:30:09.216 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:30:09 localhost nova_compute[237281]: 2025-12-06 10:30:09.290 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:30:09 localhost nova_compute[237281]: 2025-12-06 10:30:09.291 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:30:09 localhost nova_compute[237281]: 2025-12-06 10:30:09.291 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:30:09 localhost nova_compute[237281]: 2025-12-06 10:30:09.358 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:30:09 localhost nova_compute[237281]: 2025-12-06 10:30:09.373 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:30:09 localhost 
nova_compute[237281]: 2025-12-06 10:30:09.376 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:30:09 localhost nova_compute[237281]: 2025-12-06 10:30:09.376 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:30:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52348 DF PROTO=TCP SPT=38628 DPT=9102 SEQ=31544402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE47D870000000001030307) Dec 6 05:30:09 localhost nova_compute[237281]: 2025-12-06 10:30:09.599 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:11 localhost ovn_controller[131684]: 2025-12-06T10:30:11Z|00503|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory Dec 6 05:30:12 localhost nova_compute[237281]: 2025-12-06 10:30:12.248 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:30:13 localhost podman[266798]: 2025-12-06 10:30:13.558933289 +0000 UTC m=+0.090146247 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 6 05:30:13 localhost podman[266798]: 2025-12-06 10:30:13.62552394 +0000 UTC m=+0.156736948 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Dec 6 05:30:13 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:30:14 localhost nova_compute[237281]: 2025-12-06 10:30:14.640 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. 
Dec 6 05:30:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:30:15 localhost podman[266823]: 2025-12-06 10:30:15.554021145 +0000 UTC m=+0.090222710 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:30:15 localhost podman[266823]: 2025-12-06 10:30:15.560529505 +0000 UTC m=+0.096731060 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 
'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:30:15 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:30:15 localhost podman[266824]: 2025-12-06 10:30:15.611566768 +0000 UTC m=+0.146400811 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 6 05:30:15 localhost podman[266824]: 2025-12-06 10:30:15.625282319 +0000 UTC m=+0.160116322 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd) Dec 6 05:30:15 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:30:16 localhost openstack_network_exporter[199751]: ERROR 10:30:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:30:16 localhost openstack_network_exporter[199751]: ERROR 10:30:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:30:16 localhost openstack_network_exporter[199751]: ERROR 10:30:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:30:16 localhost openstack_network_exporter[199751]: ERROR 10:30:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:30:16 localhost openstack_network_exporter[199751]: Dec 6 05:30:16 localhost openstack_network_exporter[199751]: ERROR 10:30:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:30:16 localhost openstack_network_exporter[199751]: Dec 6 05:30:17 localhost nova_compute[237281]: 2025-12-06 10:30:17.293 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:19 localhost nova_compute[237281]: 2025-12-06 
10:30:19.644 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:30:20 localhost podman[266864]: 2025-12-06 10:30:20.552911316 +0000 UTC m=+0.082265584 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:30:20 localhost podman[266864]: 2025-12-06 10:30:20.560415968 +0000 UTC m=+0.089770216 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 6 05:30:20 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:30:20 localhost podman[266863]: 2025-12-06 10:30:20.603773514 +0000 UTC m=+0.135652600 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:30:20 localhost podman[266863]: 2025-12-06 10:30:20.639474563 +0000 UTC m=+0.171353649 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 6 05:30:20 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:30:22 localhost nova_compute[237281]: 2025-12-06 10:30:22.339 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:22.997 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'name': 'test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548798.ooo.test', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '47835b89168945138751a4b216280589', 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'hostId': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 6 05:30:22 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:22.998 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.023 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5a4322e8-9910-4ab2-9db4-bd7beb3ec2ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:30:22.998723', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '924db076-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.233480355, 'message_signature': '52cd38559a1d765241629002b478d26ced5805d53763328b475decdc1f365f4e'}]}, 'timestamp': '2025-12-06 10:30:23.024252', '_unique_id': '9f07ded965554e63ab4aaaa60f18e469'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors 
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging 
conn.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.026 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:30:23.031 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e330d659-bd9d-43dd-8ac3-fc5781062959', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:30:23.028175', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '924ee72a-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.238301394, 'message_signature': 
'8e389531b2daf22c9045918287f19fb9db122d7d91a493e89ff21249c7b1b850'}]}, 'timestamp': '2025-12-06 10:30:23.032169', '_unique_id': '212aeb951e374e6ab87ab336866d85d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR 
oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) 
from exc Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.033 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.034 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.079 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.080 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9a247962-12cc-468e-bcbe-12cc757a1114', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:30:23.034647', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '925649ca-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.244759653, 'message_signature': '0e65db7f05f890a8e37bf412c636a4b10f459a1b8f94f4775801c7e36c44ce06'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:30:23.034647', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 
'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '925662de-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.244759653, 'message_signature': 'c885f5e308680ae3df74bea784e07ff8c7daba6a1fc7925f397ac66cd3e56ad7'}]}, 'timestamp': '2025-12-06 10:30:23.081177', '_unique_id': '07beb2a79d934efc879dada6bf8fcbb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 
ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging 
return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.083 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.084 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 281376365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.085 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.latency volume: 20108489 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging [-] 
Could not send notification to notifications. Payload={'message_id': 'b861cfc5-7453-45f4-b24c-3733986b82ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 281376365, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:30:23.084639', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9256feba-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.244759653, 'message_signature': 'd1ffbece21751d163362197d82ea6785cf993c0c96e3d7c9689dce53608057f4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20108489, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:30:23.084639', 'resource_metadata': {'display_name': 'test', 'name': 
'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '92571102-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.244759653, 'message_signature': '9ca513994a08681ebf557ec7d66c7284cbf7049bae1f55199928fedf599506e5'}]}, 'timestamp': '2025-12-06 10:30:23.085607', '_unique_id': '4b11f894592d466aa4858750de43d96a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:30:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:30:23.086 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.086 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.088 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 46716107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.089 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.latency volume: 187866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fea1b964-c6a2-4e0f-a9c9-20ac69d9d31a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46716107, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:30:23.088426', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '925796cc-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.244759653, 'message_signature': '63c0655d4e9344da1b4c46a20d853203d614897de93b5d7a92a9bae5a5539d31'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 187866, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:30:23.088426', 'resource_metadata': {'display_name': 
'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9257b1e8-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.244759653, 'message_signature': '0ca1a3a4cfa71a3b024fd5608ff602798cc6168ed3bd22561c1e62babc2e45a6'}]}, 'timestamp': '2025-12-06 10:30:23.089891', '_unique_id': 'ce054a0b13264e85a7dc699a0c3ad360'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in 
connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get 
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.091 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.093 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.093 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.093 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.094 
12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1182f194-0225-4d69-80c5-9dc6a8deda6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:30:23.093579', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9258612e-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.244759653, 'message_signature': '9b58a604b8d5d6450b0b880d87589ead6bd01fde76b731d265ff50adf74bdbed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 
'request', 'counter_volume': 1, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:30:23.093579', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '92587aec-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.244759653, 'message_signature': '446c0ba2e88d38a6a8627d914b62d24ff08980deeb7d5516600982c0985c42ac'}]}, 'timestamp': '2025-12-06 10:30:23.094911', '_unique_id': '84763c013c40437b968b645d4c338b53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 
12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging 
self._ensure_connection(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.095 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.097 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'daaf08be-6be0-4645-b668-39fa28b929df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:30:23.097460', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '9258f332-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.238301394, 'message_signature': '67e0b6605c3a7f1b9f58e8ef5de1516e209e86006f89b2cfa712c867df837ea2'}]}, 'timestamp': '2025-12-06 10:30:23.098014', '_unique_id': '335b9f350df74511b278f7965e3e511e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:30:23.099 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.099 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.100 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2f15234-3b53-4728-9f55-644fdee6f9dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:30:23.100301', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '925961fa-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.238301394, 'message_signature': '458133c2fef044aa606a2d98849979c5c0ae88bf89fd8d7dbac7b5f360791ad5'}]}, 'timestamp': '2025-12-06 10:30:23.100825', '_unique_id': '51e37a11598d463698096609ff5e6d67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.101 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.121 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 31662080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.122 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.allocation volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ff44f4b-a75f-4168-a1b2-f283be93630f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31662080, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:30:23.103200', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '925cad4c-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.313366126, 'message_signature': '8feb9ce99d4868fce7d8abfc5e0f81c3cbf5c57e5e94779ad17b4a9c3a5c9932'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:30:23.103200', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '925cc0fc-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.313366126, 'message_signature': '78321c64a06e107979b4a8d113f905bd71916cfcb6fe7aea9ecb9f632d5fb50d'}]}, 'timestamp': '2025-12-06 10:30:23.122915', '_unique_id': '5a6ddbecf33b4de4a54878d75f951bef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.124 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.125 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '853bbea0-fe4f-42e5-a786-906d38d18d1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:30:23.125731', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '925d441e-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.238301394, 'message_signature': '41b9f552a81a46866bc8a4667257ae625d418b943ababf7b029ae0ff5022dbc1'}]}, 'timestamp': '2025-12-06 10:30:23.126259', '_unique_id': '051506f956dc43ac9b1f8c29d4013a70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.127 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.128 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6f7b5dc-77e6-4b95-9787-289875a83c17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:30:23.128559', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None,
'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '925db098-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.238301394, 'message_signature': 'a61c35bcea64a33d7bd14527ac5a47dd0ef68865f54361e657825d8c402c81f6'}]}, 'timestamp': '2025-12-06 10:30:23.129064', '_unique_id': 'c6738e9cc9b84a0bb6585e8e9893859a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging self._connection 
= self._establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.130 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.131 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f800488c-2822-463e-84be-0a2430ef6608', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:30:23.131458', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '925e2226-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.238301394, 'message_signature': '4d81cedcd122decc074655177474a2f49086faded92b14cb2caf8b0c587b7464'}]}, 'timestamp': '2025-12-06 10:30:23.131970', '_unique_id': '240fdd346e4c45249214a132cf22fb38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 
05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.132 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 
2025-12-06 10:30:23.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.134 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 31260672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.134 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.usage volume: 393216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99d95f05-40c2-4829-98a8-bea8fe90c51c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31260672, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:30:23.134379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 
'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '925e93a0-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.313366126, 'message_signature': '65773ac9afff15dcc28de0a806fdf9286d5ca8a48820c3caf2aac78d8048e002'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 393216, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:30:23.134379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '925ea6d8-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.313366126, 'message_signature': '62bd404b25ea1e44baafa0fb005d08dd2cfaf66734b68b757bb7c5dfc043deb9'}]}, 'timestamp': '2025-12-06 10:30:23.135335', '_unique_id': 'cda056c3aaf442399205d82c9c0860a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): 
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.136 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.137 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.incoming.bytes volume: 6815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd44cbab-9a68-467f-b88a-ca9de2a61290', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6815, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:30:23.137740', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '925f18b6-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.238301394, 'message_signature': '0e825694b9aeb09de7cd3787c23f3b545ff45c214e0723b70353ab1683e17822'}]}, 'timestamp': '2025-12-06 10:30:23.138252', '_unique_id': '3b548667d69340e2b0ea81d927d90aa7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.139 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.140 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.140 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/cpu volume: 21980000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39d653db-d4f3-40da-84b9-c958c75a3c96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21980000000, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'timestamp': '2025-12-06T10:30:23.140492', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '925f82b0-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.233480355, 'message_signature': '8eafbd3c1f85feb7644f7a5c022ecdd181b2dc7a165fb1f42af3b5e86056e65a'}]}, 'timestamp': '2025-12-06 10:30:23.140983', '_unique_id': 'be30b02e62f541b4b8c97a7524b4c98c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.141 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.143 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.143 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd575e488-f977-442e-8b91-bdcd55aba34a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:30:23.143320', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '925ff128-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.313366126, 'message_signature': 'cea19d3662fe6c17e1e5c71bc1073e9cb36a705cad6e66126764190eab8071d9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:30:23.143320', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '92600a82-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.313366126, 'message_signature': '1f2b8deb7a9673526e087641a60214ef0c8c35c9cb56d4801780b5192fa3277c'}]}, 'timestamp': '2025-12-06 10:30:23.144441', '_unique_id': '9c367c95f026451d871a1747e6edb4b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line
446, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.145 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 
10:30:23.147 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.147 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9b9aeaa-ec0e-40eb-a7ec-1800c4e593e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:30:23.147001', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 
'disk_name': 'vda'}, 'message_id': '9260816a-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.244759653, 'message_signature': '304619e4d14ec3af4d4142b563947149bccdf38496f324967cb2cd2a225e8523'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:30:23.147001', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '926091e6-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.244759653, 'message_signature': '478a1e6da3d5eb74883d7f09021a2072bb3752ab39f0e9e6814716a002c687dc'}]}, 'timestamp': '2025-12-06 10:30:23.147905', '_unique_id': '60e52ce94e3642d48c0116ca906c7f62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging yield Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 
localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging return 
rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( 
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.148 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 6 
05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.150 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd79bea66-3ae2-414e-bc32-93dcee5a6166', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:30:23.150168', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '9260fcda-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.238301394, 
'message_signature': 'a03a5c8acddf4177523b57129aa42b634900e15196d282d83a0f89cab58035ac'}]}, 'timestamp': '2025-12-06 10:30:23.150642', '_unique_id': '7a6062ee69784c5ebb5995c7b70f36b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.151 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.153 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cb39ce9-39c0-4c8f-be09-e2ed26c42616', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:30:23.153775', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '92619082-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.238301394, 'message_signature': '209094d766b00fddcfabb12444c9f00b7f8f4f2f9374cb4642a55793c4903bd6'}]}, 'timestamp': '2025-12-06 10:30:23.154538', '_unique_id': 'b61aacffe15947029434b4d1d1c46a06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.155 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.156 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.157 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a4b70fe-5c10-44d1-97ba-9ae978da2913', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vda', 'timestamp': '2025-12-06T10:30:23.156871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '926201d4-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.244759653, 'message_signature': 'b289dbfb58583766ed199f7e828203aa4ab6106dd9d65390ef6c4709c988cae5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93-vdb', 'timestamp': '2025-12-06T10:30:23.156871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '92621142-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.244759653, 'message_signature': '12912cb9090f378bc46a9f41e851f3fb76e8eb6886a5402f0edc78e9baf0ac12'}]}, 'timestamp': '2025-12-06 10:30:23.157677', '_unique_id': '2f3b1294c0724af58788bb862f37f4c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.158 12 ERROR oslo_messaging.notify.messaging
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.159 12 DEBUG ceilometer.compute.pollsters [-] a5070ada-6b60-4992-a1bf-9e83aaccac93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7da45de6-49f7-4652-95fa-62f28172f5a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5220ceda9e4145d395f52fc9fd0365c0', 'user_name': None, 'project_id': '47835b89168945138751a4b216280589', 'project_name': None, 'resource_id': 'instance-00000002-a5070ada-6b60-4992-a1bf-9e83aaccac93-tap227fe5b2-a5', 'timestamp': '2025-12-06T10:30:23.159414', 'resource_metadata': {'display_name': 'test', 'name': 'tap227fe5b2-a5', 'instance_id': 'a5070ada-6b60-4992-a1bf-9e83aaccac93', 'instance_type': 'm1.small', 'host': '9037fb5f6bf6ffc42a9b2aad2ac2d1d35086853bf68867fd15928ff9', 'instance_host': 'np0005548798.ooo.test', 'flavor': {'id': '7a18e612-6562-4812-b07b-d906254f72f4', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'c6562616-bf77-48e6-bb05-431e64af083a'}, 'image_ref': 'c6562616-bf77-48e6-bb05-431e64af083a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:91:02:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap227fe5b2-a5'}, 'message_id': '92626340-d28e-11f0-8fed-fa163edf398d', 'monotonic_time': 13410.238301394, 'message_signature': 'f6396838cef7418c350fb8241c1b26b26198ed54291b39206be770b86bd537de'}]}, 'timestamp': '2025-12-06 10:30:23.159782', '_unique_id': 'ad05da33163c4ccba7d9422ff81ce5a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging yield
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging self._connection
= self._establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 6 05:30:23 localhost 
ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.160 12 ERROR oslo_messaging.notify.messaging Dec 6 05:30:23 localhost ceilometer_agent_compute[195206]: 2025-12-06 10:30:23.161 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 6 05:30:23 localhost podman[197801]: time="2025-12-06T10:30:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:30:23 localhost podman[197801]: @ - - [06/Dec/2025:10:30:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:30:23 localhost podman[197801]: @ - - [06/Dec/2025:10:30:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15970 "" "Go-http-client/1.1" Dec 6 05:30:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25262 DF PROTO=TCP SPT=49590 DPT=9102 SEQ=3827791613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE4B60A0000000001030307) Dec 6 05:30:24 localhost nova_compute[237281]: 2025-12-06 10:30:24.371 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:30:24 localhost 
nova_compute[237281]: 2025-12-06 10:30:24.684 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25263 DF PROTO=TCP SPT=49590 DPT=9102 SEQ=3827791613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE4BA070000000001030307) Dec 6 05:30:25 localhost nova_compute[237281]: 2025-12-06 10:30:25.176 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:25.179 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '9e:6b:24', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:43:31:a8:52:41'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:30:25 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:25.180 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 6 05:30:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52349 DF PROTO=TCP SPT=38628 DPT=9102 SEQ=31544402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1DE4BD880000000001030307) Dec 6 05:30:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25264 DF PROTO=TCP SPT=49590 DPT=9102 SEQ=3827791613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE4C2070000000001030307) Dec 6 05:30:27 localhost nova_compute[237281]: 2025-12-06 10:30:27.381 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39592 DF PROTO=TCP SPT=43284 DPT=9102 SEQ=3546122672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE4C5870000000001030307) Dec 6 05:30:29 localhost nova_compute[237281]: 2025-12-06 10:30:29.688 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:30 localhost nova_compute[237281]: 2025-12-06 10:30:30.017 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:30:30 localhost podman[266901]: 2025-12-06 10:30:30.554368173 +0000 UTC m=+0.084811514 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 6 05:30:30 localhost podman[266901]: 2025-12-06 10:30:30.566665552 +0000 UTC m=+0.097108903 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.buildah.version=1.33.7) Dec 6 05:30:30 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:30:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25265 DF PROTO=TCP SPT=49590 DPT=9102 SEQ=3827791613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE4D1C70000000001030307) Dec 6 05:30:32 localhost nova_compute[237281]: 2025-12-06 10:30:32.420 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:33 localhost nova_compute[237281]: 2025-12-06 10:30:33.199 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. 
Dec 6 05:30:34 localhost podman[266922]: 2025-12-06 10:30:34.546975093 +0000 UTC m=+0.081387308 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:30:34 localhost podman[266922]: 2025-12-06 10:30:34.580590148 +0000 UTC m=+0.115002373 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:30:34 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. 
Dec 6 05:30:34 localhost nova_compute[237281]: 2025-12-06 10:30:34.718 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:35 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:35.182 137259 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=a1cf5a35-de45-4f36-ac91-02296203a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 6 05:30:37 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:30:37.405 219384 INFO neutron.agent.linux.ip_lib [None req-d28cb6b4-d22f-4ea4-9193-fa9206f531a1 - - - - - -] Device tapd98ca0d9-f5 cannot be used as it has no MAC address#033[00m Dec 6 05:30:37 localhost nova_compute[237281]: 2025-12-06 10:30:37.450 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:37 localhost nova_compute[237281]: 2025-12-06 10:30:37.462 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:37 localhost kernel: device tapd98ca0d9-f5 entered promiscuous mode Dec 6 05:30:37 localhost NetworkManager[5965]: [1765017037.4723] manager: (tapd98ca0d9-f5): new Generic device (/org/freedesktop/NetworkManager/Devices/81) Dec 6 05:30:37 localhost nova_compute[237281]: 2025-12-06 10:30:37.475 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:37 localhost systemd-udevd[266956]: Network interface NamePolicy= disabled on kernel command line. 
Dec 6 05:30:37 localhost ovn_controller[131684]: 2025-12-06T10:30:37Z|00504|binding|INFO|Claiming lport d98ca0d9-f5d5-48ac-916f-eb2989df0a77 for this chassis. Dec 6 05:30:37 localhost ovn_controller[131684]: 2025-12-06T10:30:37Z|00505|binding|INFO|d98ca0d9-f5d5-48ac-916f-eb2989df0a77: Claiming unknown Dec 6 05:30:37 localhost journal[186952]: ethtool ioctl error on tapd98ca0d9-f5: No such device Dec 6 05:30:37 localhost ovn_controller[131684]: 2025-12-06T10:30:37Z|00506|binding|INFO|Setting lport d98ca0d9-f5d5-48ac-916f-eb2989df0a77 ovn-installed in OVS Dec 6 05:30:37 localhost nova_compute[237281]: 2025-12-06 10:30:37.511 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:37 localhost journal[186952]: ethtool ioctl error on tapd98ca0d9-f5: No such device Dec 6 05:30:37 localhost journal[186952]: ethtool ioctl error on tapd98ca0d9-f5: No such device Dec 6 05:30:37 localhost journal[186952]: ethtool ioctl error on tapd98ca0d9-f5: No such device Dec 6 05:30:37 localhost journal[186952]: ethtool ioctl error on tapd98ca0d9-f5: No such device Dec 6 05:30:37 localhost journal[186952]: ethtool ioctl error on tapd98ca0d9-f5: No such device Dec 6 05:30:37 localhost journal[186952]: ethtool ioctl error on tapd98ca0d9-f5: No such device Dec 6 05:30:37 localhost journal[186952]: ethtool ioctl error on tapd98ca0d9-f5: No such device Dec 6 05:30:37 localhost nova_compute[237281]: 2025-12-06 10:30:37.553 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:37 localhost ovn_controller[131684]: 2025-12-06T10:30:37Z|00507|binding|INFO|Setting lport d98ca0d9-f5d5-48ac-916f-eb2989df0a77 up in Southbound Dec 6 05:30:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:37.571 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), 
table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-b8d3029b-a27f-4d99-b200-c12539392564', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8d3029b-a27f-4d99-b200-c12539392564', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33d3120e38944a1d8063ccfb79017453', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bd97a2e-35d5-439a-b8d1-f41b0ed87912, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d98ca0d9-f5d5-48ac-916f-eb2989df0a77) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:30:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:37.573 137259 INFO neutron.agent.ovn.metadata.agent [-] Port d98ca0d9-f5d5-48ac-916f-eb2989df0a77 in datapath b8d3029b-a27f-4d99-b200-c12539392564 bound to our chassis#033[00m Dec 6 05:30:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:37.575 137259 DEBUG neutron.agent.ovn.metadata.agent [-] Port f20a8312-0146-43eb-8404-9d6485ded936 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 6 05:30:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:37.575 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 
b8d3029b-a27f-4d99-b200-c12539392564, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:30:37 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:37.576 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[e24314ba-355a-47c9-98ac-30352c97e1b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:30:37 localhost nova_compute[237281]: 2025-12-06 10:30:37.584 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:38 localhost podman[267027]: Dec 6 05:30:38 localhost podman[267027]: 2025-12-06 10:30:38.492578803 +0000 UTC m=+0.093110659 container create 0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b8d3029b-a27f-4d99-b200-c12539392564, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 6 05:30:38 localhost podman[267027]: 2025-12-06 10:30:38.44603857 +0000 UTC m=+0.046570456 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 6 05:30:38 localhost systemd[1]: Started libpod-conmon-0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa.scope. Dec 6 05:30:38 localhost systemd[1]: Started libcrun container. 
Dec 6 05:30:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67f764991b6731cbb1b0b2da615416c42c1cc1f2d31abb3f04d03dfa5fb7112d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 6 05:30:38 localhost podman[267027]: 2025-12-06 10:30:38.583775292 +0000 UTC m=+0.184307148 container init 0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b8d3029b-a27f-4d99-b200-c12539392564, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 6 05:30:38 localhost podman[267027]: 2025-12-06 10:30:38.593396938 +0000 UTC m=+0.193928794 container start 0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b8d3029b-a27f-4d99-b200-c12539392564, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 6 05:30:38 localhost dnsmasq[267046]: started, version 2.85 cachesize 150 Dec 6 05:30:38 localhost dnsmasq[267046]: DNS service limited to local subnets Dec 6 05:30:38 localhost dnsmasq[267046]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 6 05:30:38 localhost dnsmasq[267046]: warning: no upstream servers configured Dec 
6 05:30:38 localhost dnsmasq-dhcp[267046]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 6 05:30:38 localhost dnsmasq[267046]: read /var/lib/neutron/dhcp/b8d3029b-a27f-4d99-b200-c12539392564/addn_hosts - 0 addresses Dec 6 05:30:38 localhost dnsmasq-dhcp[267046]: read /var/lib/neutron/dhcp/b8d3029b-a27f-4d99-b200-c12539392564/host Dec 6 05:30:38 localhost dnsmasq-dhcp[267046]: read /var/lib/neutron/dhcp/b8d3029b-a27f-4d99-b200-c12539392564/opts Dec 6 05:30:38 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:30:38.861 219384 INFO neutron.agent.dhcp.agent [None req-7967fb5f-1e2a-4505-be19-e5fb34fd2c7c - - - - - -] DHCP configuration for ports {'dc738aba-f261-43ac-aa7c-b105aa8f47d0'} is completed#033[00m Dec 6 05:30:39 localhost nova_compute[237281]: 2025-12-06 10:30:39.154 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25266 DF PROTO=TCP SPT=49590 DPT=9102 SEQ=3827791613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE4F1880000000001030307) Dec 6 05:30:39 localhost nova_compute[237281]: 2025-12-06 10:30:39.746 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:30:40.183 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:30:39Z, description=, device_id=7bb9b256-726c-4e8b-9369-27e08c3be16e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], 
fixed_ips=[], id=8a3ac5c2-109b-474b-b6a6-6b35cb11b585, ip_allocation=immediate, mac_address=fa:16:3e:1b:34:87, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:30:34Z, description=, dns_domain=, id=b8d3029b-a27f-4d99-b200-c12539392564, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1835111994-network, port_security_enabled=True, project_id=33d3120e38944a1d8063ccfb79017453, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=22662, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2742, status=ACTIVE, subnets=['229274e4-326a-4bd1-9c5e-847d68c7726e'], tags=[], tenant_id=33d3120e38944a1d8063ccfb79017453, updated_at=2025-12-06T10:30:36Z, vlan_transparent=None, network_id=b8d3029b-a27f-4d99-b200-c12539392564, port_security_enabled=False, project_id=33d3120e38944a1d8063ccfb79017453, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2749, status=DOWN, tags=[], tenant_id=33d3120e38944a1d8063ccfb79017453, updated_at=2025-12-06T10:30:40Z on network b8d3029b-a27f-4d99-b200-c12539392564#033[00m Dec 6 05:30:40 localhost podman[267065]: 2025-12-06 10:30:40.399812414 +0000 UTC m=+0.060686540 container kill 0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b8d3029b-a27f-4d99-b200-c12539392564, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:30:40 localhost dnsmasq[267046]: read 
/var/lib/neutron/dhcp/b8d3029b-a27f-4d99-b200-c12539392564/addn_hosts - 1 addresses Dec 6 05:30:40 localhost dnsmasq-dhcp[267046]: read /var/lib/neutron/dhcp/b8d3029b-a27f-4d99-b200-c12539392564/host Dec 6 05:30:40 localhost dnsmasq-dhcp[267046]: read /var/lib/neutron/dhcp/b8d3029b-a27f-4d99-b200-c12539392564/opts Dec 6 05:30:40 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:30:40.875 219384 INFO neutron.agent.dhcp.agent [None req-08c1e5c9-3859-4fef-a79c-e115583adffa - - - - - -] DHCP configuration for ports {'8a3ac5c2-109b-474b-b6a6-6b35cb11b585'} is completed#033[00m Dec 6 05:30:41 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:30:41.852 219384 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:30:39Z, description=, device_id=7bb9b256-726c-4e8b-9369-27e08c3be16e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8a3ac5c2-109b-474b-b6a6-6b35cb11b585, ip_allocation=immediate, mac_address=fa:16:3e:1b:34:87, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:30:34Z, description=, dns_domain=, id=b8d3029b-a27f-4d99-b200-c12539392564, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1835111994-network, port_security_enabled=True, project_id=33d3120e38944a1d8063ccfb79017453, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=22662, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2742, status=ACTIVE, subnets=['229274e4-326a-4bd1-9c5e-847d68c7726e'], tags=[], tenant_id=33d3120e38944a1d8063ccfb79017453, updated_at=2025-12-06T10:30:36Z, vlan_transparent=None, 
network_id=b8d3029b-a27f-4d99-b200-c12539392564, port_security_enabled=False, project_id=33d3120e38944a1d8063ccfb79017453, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2749, status=DOWN, tags=[], tenant_id=33d3120e38944a1d8063ccfb79017453, updated_at=2025-12-06T10:30:40Z on network b8d3029b-a27f-4d99-b200-c12539392564#033[00m Dec 6 05:30:42 localhost podman[267104]: 2025-12-06 10:30:42.061917645 +0000 UTC m=+0.050821536 container kill 0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b8d3029b-a27f-4d99-b200-c12539392564, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:30:42 localhost dnsmasq[267046]: read /var/lib/neutron/dhcp/b8d3029b-a27f-4d99-b200-c12539392564/addn_hosts - 1 addresses Dec 6 05:30:42 localhost dnsmasq-dhcp[267046]: read /var/lib/neutron/dhcp/b8d3029b-a27f-4d99-b200-c12539392564/host Dec 6 05:30:42 localhost dnsmasq-dhcp[267046]: read /var/lib/neutron/dhcp/b8d3029b-a27f-4d99-b200-c12539392564/opts Dec 6 05:30:42 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:30:42.347 219384 INFO neutron.agent.dhcp.agent [None req-e56cd394-32da-4dbb-a94f-abde22b02720 - - - - - -] DHCP configuration for ports {'8a3ac5c2-109b-474b-b6a6-6b35cb11b585'} is completed#033[00m Dec 6 05:30:42 localhost nova_compute[237281]: 2025-12-06 10:30:42.490 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:30:44 localhost podman[267126]: 2025-12-06 10:30:44.566272666 +0000 UTC m=+0.097588176 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 6 05:30:44 localhost podman[267126]: 2025-12-06 10:30:44.616667167 +0000 UTC m=+0.147982667 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:30:44 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:30:44 localhost nova_compute[237281]: 2025-12-06 10:30:44.747 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:46 localhost openstack_network_exporter[199751]: ERROR 10:30:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:30:46 localhost openstack_network_exporter[199751]: ERROR 10:30:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:30:46 localhost openstack_network_exporter[199751]: ERROR 10:30:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:30:46 localhost openstack_network_exporter[199751]: ERROR 10:30:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:30:46 localhost openstack_network_exporter[199751]: Dec 6 05:30:46 localhost openstack_network_exporter[199751]: ERROR 10:30:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:30:46 localhost openstack_network_exporter[199751]: Dec 6 05:30:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:30:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. 
Dec 6 05:30:46 localhost podman[267152]: 2025-12-06 10:30:46.554648099 +0000 UTC m=+0.086677987 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:30:46 localhost podman[267152]: 2025-12-06 10:30:46.562160682 +0000 UTC m=+0.094190520 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': 
['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Dec 6 05:30:46 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:30:46 localhost podman[267153]: 2025-12-06 10:30:46.525700592 +0000 UTC m=+0.059744613 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:30:46 localhost podman[267153]: 2025-12-06 10:30:46.605160654 +0000 UTC m=+0.139204705 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:30:46 localhost systemd[1]: b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:30:47 localhost nova_compute[237281]: 2025-12-06 10:30:47.492 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:49 localhost nova_compute[237281]: 2025-12-06 10:30:49.750 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:50 localhost ovn_controller[131684]: 2025-12-06T10:30:50Z|00508|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:30:50 localhost nova_compute[237281]: 2025-12-06 10:30:50.692 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:30:51 localhost podman[267196]: 2025-12-06 10:30:51.545643143 +0000 UTC m=+0.082735985 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 6 05:30:51 localhost podman[267196]: 2025-12-06 10:30:51.579624076 +0000 UTC m=+0.116716918 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 6 05:30:51 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:30:51 localhost podman[267197]: 2025-12-06 10:30:51.599955846 +0000 UTC m=+0.134213530 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Dec 6 05:30:51 localhost podman[267197]: 2025-12-06 10:30:51.638574562 +0000 UTC m=+0.172832246 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:30:51 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:30:52 localhost nova_compute[237281]: 2025-12-06 10:30:52.512 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:52 localhost dnsmasq[267046]: read /var/lib/neutron/dhcp/b8d3029b-a27f-4d99-b200-c12539392564/addn_hosts - 0 addresses Dec 6 05:30:52 localhost dnsmasq-dhcp[267046]: read /var/lib/neutron/dhcp/b8d3029b-a27f-4d99-b200-c12539392564/host Dec 6 05:30:52 localhost podman[267249]: 2025-12-06 10:30:52.702376316 +0000 UTC m=+0.060211467 container kill 0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b8d3029b-a27f-4d99-b200-c12539392564, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 6 05:30:52 
localhost dnsmasq-dhcp[267046]: read /var/lib/neutron/dhcp/b8d3029b-a27f-4d99-b200-c12539392564/opts Dec 6 05:30:52 localhost kernel: device tapd98ca0d9-f5 left promiscuous mode Dec 6 05:30:52 localhost nova_compute[237281]: 2025-12-06 10:30:52.993 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:52 localhost ovn_controller[131684]: 2025-12-06T10:30:52Z|00509|binding|INFO|Releasing lport d98ca0d9-f5d5-48ac-916f-eb2989df0a77 from this chassis (sb_readonly=0) Dec 6 05:30:52 localhost ovn_controller[131684]: 2025-12-06T10:30:52Z|00510|binding|INFO|Setting lport d98ca0d9-f5d5-48ac-916f-eb2989df0a77 down in Southbound Dec 6 05:30:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:53.012 137259 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548798.ooo.test'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpf60eb315-a853-5311-b825-b2165cc18d77-b8d3029b-a27f-4d99-b200-c12539392564', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8d3029b-a27f-4d99-b200-c12539392564', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33d3120e38944a1d8063ccfb79017453', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548798.ooo.test'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3bd97a2e-35d5-439a-b8d1-f41b0ed87912, chassis=[], tunnel_key=2, 
gateway_chassis=[], requested_chassis=[], logical_port=d98ca0d9-f5d5-48ac-916f-eb2989df0a77) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 6 05:30:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:53.014 137259 INFO neutron.agent.ovn.metadata.agent [-] Port d98ca0d9-f5d5-48ac-916f-eb2989df0a77 in datapath b8d3029b-a27f-4d99-b200-c12539392564 unbound from our chassis#033[00m Dec 6 05:30:53 localhost nova_compute[237281]: 2025-12-06 10:30:53.015 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:53.018 137259 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8d3029b-a27f-4d99-b200-c12539392564, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 6 05:30:53 localhost ovn_metadata_agent[137254]: 2025-12-06 10:30:53.020 137360 DEBUG oslo.privsep.daemon [-] privsep: reply[b618da83-01bc-4591-bf33-dd7d7eb568fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 6 05:30:53 localhost podman[197801]: time="2025-12-06T10:30:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:30:53 localhost podman[197801]: @ - - [06/Dec/2025:10:30:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 145974 "" "Go-http-client/1.1" Dec 6 05:30:53 localhost podman[197801]: @ - - [06/Dec/2025:10:30:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16435 "" "Go-http-client/1.1" Dec 6 05:30:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15829 DF PROTO=TCP SPT=43076 DPT=9102 SEQ=704409559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE52B3E0000000001030307) Dec 6 05:30:54 localhost ovn_controller[131684]: 2025-12-06T10:30:54Z|00511|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:30:54 localhost nova_compute[237281]: 2025-12-06 10:30:54.438 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:54 localhost nova_compute[237281]: 2025-12-06 10:30:54.752 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:54 localhost nova_compute[237281]: 2025-12-06 10:30:54.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:30:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15830 DF PROTO=TCP SPT=43076 DPT=9102 SEQ=704409559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE52F480000000001030307) Dec 6 05:30:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25267 DF PROTO=TCP SPT=49590 DPT=9102 SEQ=3827791613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE531880000000001030307) Dec 6 05:30:55 localhost nova_compute[237281]: 2025-12-06 10:30:55.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task 
ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:30:57 localhost podman[267288]: 2025-12-06 10:30:57.038118735 +0000 UTC m=+0.058041919 container kill 0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b8d3029b-a27f-4d99-b200-c12539392564, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:30:57 localhost dnsmasq[267046]: exiting on receipt of SIGTERM Dec 6 05:30:57 localhost systemd[1]: libpod-0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa.scope: Deactivated successfully. 
Dec 6 05:30:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15831 DF PROTO=TCP SPT=43076 DPT=9102 SEQ=704409559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE537470000000001030307) Dec 6 05:30:57 localhost podman[267303]: 2025-12-06 10:30:57.10505365 +0000 UTC m=+0.053099007 container died 0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b8d3029b-a27f-4d99-b200-c12539392564, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 6 05:30:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa-userdata-shm.mount: Deactivated successfully. 
Dec 6 05:30:57 localhost podman[267303]: 2025-12-06 10:30:57.133629645 +0000 UTC m=+0.081674952 container cleanup 0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b8d3029b-a27f-4d99-b200-c12539392564, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:30:57 localhost systemd[1]: libpod-conmon-0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa.scope: Deactivated successfully. Dec 6 05:30:57 localhost podman[267304]: 2025-12-06 10:30:57.167445174 +0000 UTC m=+0.111517767 container remove 0cbceb47a2aabb33b29e48c133b43783442438809fa07cc292f84ec8a2fdadfa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b8d3029b-a27f-4d99-b200-c12539392564, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3) Dec 6 05:30:57 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:30:57.421 219384 INFO neutron.agent.dhcp.agent [None req-6c3b3ed9-2e7c-477c-82d2-2bec14f1b67d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:30:57 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:30:57.432 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:30:57 localhost nova_compute[237281]: 2025-12-06 10:30:57.556 237285 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:57 localhost neutron_dhcp_agent[219380]: 2025-12-06 10:30:57.631 219384 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 6 05:30:58 localhost systemd[1]: var-lib-containers-storage-overlay-67f764991b6731cbb1b0b2da615416c42c1cc1f2d31abb3f04d03dfa5fb7112d-merged.mount: Deactivated successfully. Dec 6 05:30:58 localhost systemd[1]: run-netns-qdhcp\x2db8d3029b\x2da27f\x2d4d99\x2db200\x2dc12539392564.mount: Deactivated successfully. Dec 6 05:30:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52350 DF PROTO=TCP SPT=38628 DPT=9102 SEQ=31544402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE53B880000000001030307) Dec 6 05:30:59 localhost nova_compute[237281]: 2025-12-06 10:30:59.756 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:30:59 localhost nova_compute[237281]: 2025-12-06 10:30:59.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:31:00 localhost nova_compute[237281]: 2025-12-06 10:31:00.882 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:31:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=15832 DF PROTO=TCP SPT=43076 DPT=9102 SEQ=704409559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE547070000000001030307) Dec 6 05:31:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:31:01 localhost podman[267332]: 2025-12-06 10:31:01.559108586 +0000 UTC m=+0.091997132 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.) Dec 6 05:31:01 localhost podman[267332]: 2025-12-06 10:31:01.600301323 +0000 UTC m=+0.133189859 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 6 05:31:01 localhost systemd[1]: 
0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. Dec 6 05:31:01 localhost nova_compute[237281]: 2025-12-06 10:31:01.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:31:02 localhost nova_compute[237281]: 2025-12-06 10:31:02.560 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:03 localhost nova_compute[237281]: 2025-12-06 10:31:03.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:31:03 localhost nova_compute[237281]: 2025-12-06 10:31:03.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:31:03 localhost nova_compute[237281]: 2025-12-06 10:31:03.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:31:04 localhost nova_compute[237281]: 2025-12-06 10:31:04.777 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:04 localhost nova_compute[237281]: 2025-12-06 10:31:04.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:31:04 localhost nova_compute[237281]: 2025-12-06 10:31:04.887 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:31:04 localhost nova_compute[237281]: 2025-12-06 10:31:04.888 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:31:05 localhost nova_compute[237281]: 2025-12-06 10:31:05.209 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:31:05 localhost nova_compute[237281]: 2025-12-06 10:31:05.210 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:31:05 localhost nova_compute[237281]: 2025-12-06 10:31:05.211 237285 DEBUG nova.network.neutron [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:31:05 localhost nova_compute[237281]: 2025-12-06 10:31:05.212 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:31:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:31:05 localhost podman[267350]: 2025-12-06 10:31:05.566493971 +0000 UTC m=+0.093596752 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': 
'/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 6 05:31:05 localhost podman[267350]: 2025-12-06 10:31:05.578375009 +0000 UTC m=+0.105477810 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:31:05 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:31:06.715 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:31:06.716 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:31:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:31:06.717 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:31:07 localhost nova_compute[237281]: 2025-12-06 10:31:07.403 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:07 localhost nova_compute[237281]: 2025-12-06 10:31:07.562 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:07 localhost nova_compute[237281]: 2025-12-06 10:31:07.701 237285 DEBUG nova.network.neutron [None 
req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 6 05:31:07 localhost nova_compute[237281]: 2025-12-06 10:31:07.730 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 6 05:31:07 localhost nova_compute[237281]: 2025-12-06 10:31:07.730 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 6 05:31:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15833 DF PROTO=TCP SPT=43076 DPT=9102 SEQ=704409559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE567870000000001030307) Dec 6 05:31:09 localhost nova_compute[237281]: 2025-12-06 10:31:09.779 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:09 localhost nova_compute[237281]: 2025-12-06 10:31:09.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.329 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.330 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.331 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.331 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.458 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.474 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.531 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.532 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.592 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.594 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.668 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.670 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.741 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk.eph0 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.969 237285 WARNING nova.virt.libvirt.driver [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.971 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Hypervisor/Node resource view: name=np0005548798.ooo.test free_ram=12261MB free_disk=387.26367950439453GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", 
"address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.972 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:31:10 localhost nova_compute[237281]: 2025-12-06 10:31:10.972 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:31:11 localhost nova_compute[237281]: 2025-12-06 10:31:11.069 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 6 05:31:11 localhost nova_compute[237281]: 2025-12-06 10:31:11.070 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 6 05:31:11 localhost nova_compute[237281]: 2025-12-06 10:31:11.071 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Final resource view: name=np0005548798.ooo.test phys_ram=15738MB used_ram=1024MB phys_disk=399GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 6 05:31:11 localhost nova_compute[237281]: 2025-12-06 10:31:11.134 237285 DEBUG nova.compute.provider_tree [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed in ProviderTree for provider: db8b39ad-af52-43e3-99e2-f3c431f03241 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 6 05:31:11 localhost nova_compute[237281]: 2025-12-06 10:31:11.150 237285 DEBUG nova.scheduler.client.report [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Inventory has not changed for provider db8b39ad-af52-43e3-99e2-f3c431f03241 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 
8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 399, 'reserved': 1, 'min_unit': 1, 'max_unit': 399, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 6 05:31:11 localhost nova_compute[237281]: 2025-12-06 10:31:11.152 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Compute_service record updated for np0005548798.ooo.test:np0005548798.ooo.test _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 6 05:31:11 localhost nova_compute[237281]: 2025-12-06 10:31:11.153 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:31:12 localhost nova_compute[237281]: 2025-12-06 10:31:12.601 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:14 localhost nova_compute[237281]: 2025-12-06 10:31:14.816 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. 
Dec 6 05:31:15 localhost podman[267387]: 2025-12-06 10:31:15.543451331 +0000 UTC m=+0.072458326 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller) Dec 6 05:31:15 localhost podman[267387]: 2025-12-06 10:31:15.587210488 +0000 UTC m=+0.116217513 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:31:15 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. 
Dec 6 05:31:16 localhost openstack_network_exporter[199751]: ERROR 10:31:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:31:16 localhost openstack_network_exporter[199751]: ERROR 10:31:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:31:16 localhost openstack_network_exporter[199751]: ERROR 10:31:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:31:16 localhost openstack_network_exporter[199751]: ERROR 10:31:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:31:16 localhost openstack_network_exporter[199751]: Dec 6 05:31:16 localhost openstack_network_exporter[199751]: ERROR 10:31:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:31:16 localhost openstack_network_exporter[199751]: Dec 6 05:31:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:31:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:31:17 localhost systemd[1]: tmp-crun.WlZKHQ.mount: Deactivated successfully. 
Dec 6 05:31:17 localhost podman[267414]: 2025-12-06 10:31:17.551512265 +0000 UTC m=+0.079154214 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 6 05:31:17 localhost podman[267414]: 2025-12-06 10:31:17.59137343 +0000 UTC m=+0.119015349 container exec_died 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Dec 6 05:31:17 localhost nova_compute[237281]: 2025-12-06 10:31:17.604 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:17 localhost systemd[1]: 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:31:17 localhost podman[267413]: 2025-12-06 10:31:17.61525827 +0000 UTC m=+0.142719123 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 6 05:31:17 localhost podman[267413]: 2025-12-06 10:31:17.628300864 +0000 UTC m=+0.155761727 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': 
['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:31:17 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. Dec 6 05:31:19 localhost nova_compute[237281]: 2025-12-06 10:31:19.820 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:19 localhost ovn_controller[131684]: 2025-12-06T10:31:19Z|00512|binding|INFO|Releasing lport dc760542-e03f-4d48-a573-fabb89636a57 from this chassis (sb_readonly=0) Dec 6 05:31:19 localhost nova_compute[237281]: 2025-12-06 10:31:19.977 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. 
Dec 6 05:31:22 localhost podman[267457]: 2025-12-06 10:31:22.551032713 +0000 UTC m=+0.081701764 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:31:22 localhost nova_compute[237281]: 2025-12-06 10:31:22.605 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:22 localhost podman[267458]: 2025-12-06 10:31:22.609604487 +0000 UTC m=+0.136033976 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:31:22 localhost podman[267458]: 2025-12-06 10:31:22.618591996 +0000 UTC m=+0.145021515 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 05:31:22 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. Dec 6 05:31:22 localhost podman[267457]: 2025-12-06 10:31:22.638669258 +0000 UTC m=+0.169338259 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent) Dec 6 05:31:22 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. 
Dec 6 05:31:23 localhost podman[197801]: time="2025-12-06T10:31:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:31:23 localhost podman[197801]: @ - - [06/Dec/2025:10:31:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:31:23 localhost podman[197801]: @ - - [06/Dec/2025:10:31:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15972 "" "Go-http-client/1.1" Dec 6 05:31:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12653 DF PROTO=TCP SPT=38788 DPT=9102 SEQ=932515949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE5A06A0000000001030307) Dec 6 05:31:24 localhost nova_compute[237281]: 2025-12-06 10:31:24.867 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12654 DF PROTO=TCP SPT=38788 DPT=9102 SEQ=932515949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE5A4880000000001030307) Dec 6 05:31:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15834 DF PROTO=TCP SPT=43076 DPT=9102 SEQ=704409559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE5A7870000000001030307) Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.888 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task 
ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.891 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use..do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.892 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use..do_register_storage_use" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.893 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use..do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.893 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users..do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.894 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users..do_get_storage_users" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.894 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users..do_get_storage_users" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.926 237285 DEBUG nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.947 237285 DEBUG nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.947 237285 DEBUG nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Image id c6562616-bf77-48e6-bb05-431e64af083a yields fingerprint 3e070c3db7ba7309de3805d58aaf4369c4bd45c2 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.948 237285 INFO nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] image c6562616-bf77-48e6-bb05-431e64af083a at (/var/lib/nova/instances/_base/3e070c3db7ba7309de3805d58aaf4369c4bd45c2): checking#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.948 237285 DEBUG nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] image c6562616-bf77-48e6-bb05-431e64af083a at 
(/var/lib/nova/instances/_base/3e070c3db7ba7309de3805d58aaf4369c4bd45c2): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.951 237285 DEBUG nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Image id yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.952 237285 DEBUG nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] a5070ada-6b60-4992-a1bf-9e83aaccac93 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.952 237285 DEBUG nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] a5070ada-6b60-4992-a1bf-9e83aaccac93 has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129#033[00m Dec 6 05:31:25 localhost nova_compute[237281]: 2025-12-06 10:31:25.953 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 6 05:31:26 localhost nova_compute[237281]: 2025-12-06 10:31:26.033 237285 DEBUG oslo_concurrency.processutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/var/lib/nova/instances/a5070ada-6b60-4992-a1bf-9e83aaccac93/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 6 05:31:26 localhost nova_compute[237281]: 2025-12-06 10:31:26.034 237285 DEBUG nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Instance a5070ada-6b60-4992-a1bf-9e83aaccac93 is backed by 3e070c3db7ba7309de3805d58aaf4369c4bd45c2 _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141#033[00m Dec 6 05:31:26 localhost nova_compute[237281]: 2025-12-06 10:31:26.035 237285 WARNING nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3#033[00m Dec 6 05:31:26 localhost nova_compute[237281]: 2025-12-06 10:31:26.036 237285 INFO nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Active base files: /var/lib/nova/instances/_base/3e070c3db7ba7309de3805d58aaf4369c4bd45c2#033[00m Dec 6 05:31:26 localhost nova_compute[237281]: 2025-12-06 10:31:26.036 237285 INFO nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Removable base files: /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3#033[00m Dec 6 05:31:26 localhost nova_compute[237281]: 2025-12-06 10:31:26.037 237285 INFO nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/7ed36996b83444bfa83969c1e5caf9794500f5d3#033[00m Dec 6 05:31:26 localhost nova_compute[237281]: 2025-12-06 10:31:26.037 237285 DEBUG nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m Dec 6 
05:31:26 localhost nova_compute[237281]: 2025-12-06 10:31:26.038 237285 DEBUG nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m Dec 6 05:31:26 localhost nova_compute[237281]: 2025-12-06 10:31:26.038 237285 DEBUG nova.virt.libvirt.imagecache [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m Dec 6 05:31:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12655 DF PROTO=TCP SPT=38788 DPT=9102 SEQ=932515949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE5AC870000000001030307) Dec 6 05:31:27 localhost nova_compute[237281]: 2025-12-06 10:31:27.609 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25268 DF PROTO=TCP SPT=49590 DPT=9102 SEQ=3827791613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE5AF880000000001030307) Dec 6 05:31:29 localhost nova_compute[237281]: 2025-12-06 10:31:29.872 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12656 DF PROTO=TCP SPT=38788 DPT=9102 SEQ=932515949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1DE5BC470000000001030307) Dec 6 05:31:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. Dec 6 05:31:32 localhost podman[267496]: 2025-12-06 10:31:32.559474679 +0000 UTC m=+0.085942773 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7) Dec 6 05:31:32 localhost podman[267496]: 2025-12-06 10:31:32.576828047 +0000 UTC m=+0.103296181 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 6 05:31:32 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:31:32 localhost nova_compute[237281]: 2025-12-06 10:31:32.626 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:34 localhost nova_compute[237281]: 2025-12-06 10:31:34.893 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:31:36 localhost podman[267516]: 2025-12-06 10:31:36.547948378 +0000 UTC m=+0.080306880 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:31:36 localhost podman[267516]: 2025-12-06 10:31:36.555389989 +0000 UTC m=+0.087748481 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', 
'/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:31:36 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully. Dec 6 05:31:37 localhost nova_compute[237281]: 2025-12-06 10:31:37.674 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12657 DF PROTO=TCP SPT=38788 DPT=9102 SEQ=932515949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE5DD880000000001030307) Dec 6 05:31:39 localhost nova_compute[237281]: 2025-12-06 10:31:39.894 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:41 localhost sshd[267540]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:31:42 localhost nova_compute[237281]: 2025-12-06 10:31:42.683 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:44 localhost nova_compute[237281]: 2025-12-06 10:31:44.937 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:46 localhost openstack_network_exporter[199751]: ERROR 10:31:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 6 05:31:46 localhost openstack_network_exporter[199751]: ERROR 10:31:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:31:46 localhost openstack_network_exporter[199751]: ERROR 10:31:46 appctl.go:144: Failed to 
get PID for ovn-northd: no control socket files found for ovn-northd Dec 6 05:31:46 localhost openstack_network_exporter[199751]: ERROR 10:31:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 6 05:31:46 localhost openstack_network_exporter[199751]: Dec 6 05:31:46 localhost openstack_network_exporter[199751]: ERROR 10:31:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 6 05:31:46 localhost openstack_network_exporter[199751]: Dec 6 05:31:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436. Dec 6 05:31:46 localhost podman[267542]: 2025-12-06 10:31:46.553767853 +0000 UTC m=+0.086207232 container health_status da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 6 05:31:46 localhost podman[267542]: 2025-12-06 10:31:46.618380395 +0000 UTC m=+0.150819804 container exec_died da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3) Dec 6 05:31:46 localhost systemd[1]: da7f19151de0ae7d69ee481949dd946df4f773a2a35285f45f813b43c5e80436.service: Deactivated successfully. Dec 6 05:31:47 localhost nova_compute[237281]: 2025-12-06 10:31:47.722 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:48 localhost sshd[267567]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:31:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17. Dec 6 05:31:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b. Dec 6 05:31:48 localhost podman[267569]: 2025-12-06 10:31:48.558964907 +0000 UTC m=+0.082224578 container health_status b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Dec 6 05:31:48 localhost podman[267568]: 2025-12-06 10:31:48.60810266 +0000 UTC m=+0.134502519 container health_status 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 6 05:31:48 localhost podman[267569]: 2025-12-06 10:31:48.625945423 +0000 UTC m=+0.149205064 container exec_died b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 6 05:31:48 localhost systemd[1]: 
b4920f973591be6ed2afd243f860c079ec264bf2e94fd63687160526507d7e3b.service: Deactivated successfully. Dec 6 05:31:48 localhost podman[267568]: 2025-12-06 10:31:48.640774283 +0000 UTC m=+0.167174172 container exec_died 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Dec 6 05:31:48 localhost systemd[1]: 4b497348a877d61b6a65740ab45217f893e38cb7c688aabdf82517e9ab348e17.service: Deactivated successfully. 
Dec 6 05:31:49 localhost nova_compute[237281]: 2025-12-06 10:31:49.940 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:51 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=38.102.83.114 DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14056 DF PROTO=TCP SPT=34364 DPT=19885 SEQ=991471865 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080AF3F919B50000000001030307) Dec 6 05:31:52 localhost sshd[267607]: main: sshd: ssh-rsa algorithm is disabled Dec 6 05:31:52 localhost systemd-logind[760]: New session 50 of user zuul. Dec 6 05:31:52 localhost systemd[1]: Started Session 50 of User zuul. Dec 6 05:31:52 localhost python3[267629]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister#012 _uses_shell=True zuul_log_id=fa163efc-24cc-5857-7eb1-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 6 05:31:52 localhost nova_compute[237281]: 2025-12-06 10:31:52.757 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:52 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=38.102.83.114 DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14057 DF PROTO=TCP SPT=34364 DPT=19885 SEQ=991471865 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080AF3F91DAD0000000001030307) Dec 6 05:31:53 localhost podman[197801]: time="2025-12-06T10:31:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 6 05:31:53 localhost podman[197801]: @ - - [06/Dec/2025:10:31:53 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144150 "" "Go-http-client/1.1" Dec 6 05:31:53 localhost podman[197801]: @ - - [06/Dec/2025:10:31:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15981 "" "Go-http-client/1.1" Dec 6 05:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3. Dec 6 05:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509. Dec 6 05:31:53 localhost podman[267633]: 2025-12-06 10:31:53.551994814 +0000 UTC m=+0.082922341 container health_status e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 6 05:31:53 localhost systemd[1]: tmp-crun.fKOwDp.mount: Deactivated successfully. Dec 6 05:31:53 localhost podman[267632]: 2025-12-06 10:31:53.608044751 +0000 UTC m=+0.140728182 container health_status 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd) Dec 6 05:31:53 localhost podman[267633]: 2025-12-06 10:31:53.636659837 +0000 UTC m=+0.167587334 container exec_died e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS) Dec 6 05:31:53 localhost systemd[1]: e93ad586c8609b06b324017c3420f39e0a2acaf84a421e939d2fd2b6d62b2509.service: Deactivated successfully. 
Dec 6 05:31:53 localhost podman[267632]: 2025-12-06 10:31:53.687813132 +0000 UTC m=+0.220496493 container exec_died 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Dec 6 05:31:53 localhost systemd[1]: 34c95f9bfd699b1c32f56a3a74a17d217ff79fc856429b1791cc9aeeb9eea7c3.service: Deactivated successfully. Dec 6 05:31:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58597 DF PROTO=TCP SPT=39004 DPT=9102 SEQ=534600490 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE615990000000001030307) Dec 6 05:31:54 localhost nova_compute[237281]: 2025-12-06 10:31:54.983 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58598 DF PROTO=TCP SPT=39004 DPT=9102 SEQ=534600490 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE619870000000001030307) Dec 6 05:31:55 localhost nova_compute[237281]: 2025-12-06 10:31:55.038 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:31:55 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:df:39:8d MACPROTO=0800 SRC=38.102.83.114 DST=38.129.56.147 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14058 DF PROTO=TCP SPT=34364 DPT=19885 SEQ=991471865 ACK=0 WINDOW=32120 RES=0x00 SYN URGP=0 OPT (020405B40402080AF3F925AD0000000001030307) Dec 6 05:31:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12658 DF PROTO=TCP SPT=38788 DPT=9102 SEQ=932515949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE61D880000000001030307) Dec 6 05:31:56 localhost nova_compute[237281]: 2025-12-06 10:31:56.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:31:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58599 DF PROTO=TCP SPT=39004 DPT=9102 SEQ=534600490 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE621870000000001030307) Dec 6 05:31:57 localhost ovn_controller[131684]: 2025-12-06T10:31:57Z|00513|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory Dec 6 05:31:57 localhost nova_compute[237281]: 2025-12-06 10:31:57.791 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:31:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15835 DF PROTO=TCP SPT=43076 DPT=9102 SEQ=704409559 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE625870000000001030307) Dec 6 05:31:58 localhost systemd[1]: session-50.scope: Deactivated successfully. Dec 6 05:31:58 localhost systemd-logind[760]: Session 50 logged out. Waiting for processes to exit. Dec 6 05:31:58 localhost systemd-logind[760]: Removed session 50. 
Dec 6 05:31:59 localhost nova_compute[237281]: 2025-12-06 10:31:59.886 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:31:59 localhost nova_compute[237281]: 2025-12-06 10:31:59.986 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:32:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58600 DF PROTO=TCP SPT=39004 DPT=9102 SEQ=534600490 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE631470000000001030307) Dec 6 05:32:01 localhost nova_compute[237281]: 2025-12-06 10:32:01.882 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:32:01 localhost nova_compute[237281]: 2025-12-06 10:32:01.884 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:32:02 localhost nova_compute[237281]: 2025-12-06 10:32:02.841 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:32:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4. 
Dec 6 05:32:03 localhost podman[267670]: 2025-12-06 10:32:03.248301719 +0000 UTC m=+0.089289558 container health_status 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9) Dec 6 05:32:03 localhost podman[267670]: 2025-12-06 10:32:03.261804467 +0000 UTC m=+0.102792296 container exec_died 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers) Dec 6 05:32:03 localhost systemd[1]: 0868ce3ff5de2d8034755185f9a7aedf6a630a168a764b0c2cd51b0fc5a0c3d4.service: Deactivated successfully. 
Dec 6 05:32:03 localhost nova_compute[237281]: 2025-12-06 10:32:03.885 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:32:03 localhost nova_compute[237281]: 2025-12-06 10:32:03.886 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 6 05:32:03 localhost nova_compute[237281]: 2025-12-06 10:32:03.887 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:32:05 localhost nova_compute[237281]: 2025-12-06 10:32:05.042 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 6 05:32:05 localhost nova_compute[237281]: 2025-12-06 10:32:05.904 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 6 05:32:05 localhost nova_compute[237281]: 2025-12-06 10:32:05.905 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 6 05:32:05 localhost nova_compute[237281]: 2025-12-06 10:32:05.905 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Rebuilding the list of 
instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 6 05:32:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:32:06.716 137259 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 6 05:32:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:32:06.717 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 6 05:32:06 localhost ovn_metadata_agent[137254]: 2025-12-06 10:32:06.718 137259 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 6 05:32:07 localhost nova_compute[237281]: 2025-12-06 10:32:07.128 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 6 05:32:07 localhost nova_compute[237281]: 2025-12-06 10:32:07.128 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquired lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 6 05:32:07 localhost nova_compute[237281]: 2025-12-06 10:32:07.128 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Forcefully refreshing network info 
cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 6 05:32:07 localhost nova_compute[237281]: 2025-12-06 10:32:07.129 237285 DEBUG nova.objects.instance [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a5070ada-6b60-4992-a1bf-9e83aaccac93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 6 05:32:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1. Dec 6 05:32:07 localhost podman[267692]: 2025-12-06 10:32:07.544621168 +0000 UTC m=+0.075584803 container health_status 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 6 05:32:07 localhost podman[267692]: 2025-12-06 10:32:07.55631434 +0000 UTC m=+0.087277985 container exec_died 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', 
'/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 6 05:32:07 localhost systemd[1]: 979338aa565ec66588777a979371ece84bd0dc24c61c411720944628061d71a1.service: Deactivated successfully.
Dec 6 05:32:07 localhost nova_compute[237281]: 2025-12-06 10:32:07.882 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:32:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:c8:cb:da MACDST=fa:16:3e:33:96:90 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58601 DF PROTO=TCP SPT=39004 DPT=9102 SEQ=534600490 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1DE651870000000001030307)
Dec 6 05:32:10 localhost nova_compute[237281]: 2025-12-06 10:32:10.044 237285 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 6 05:32:11 localhost nova_compute[237281]: 2025-12-06 10:32:11.376 237285 DEBUG nova.network.neutron [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updating instance_info_cache with network_info: [{"id": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "address": "fa:16:3e:91:02:64", "network": {"id": "20509a6a-c438-4c5e-82a7-fe0ea272b309", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.189", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "47835b89168945138751a4b216280589", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap227fe5b2-a5", "ovs_interfaceid": "227fe5b2-a5a7-4043-b641-32b6e7c7a7c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 6 05:32:11 localhost nova_compute[237281]: 2025-12-06 10:32:11.400 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Releasing lock "refresh_cache-a5070ada-6b60-4992-a1bf-9e83aaccac93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 6 05:32:11 localhost nova_compute[237281]: 2025-12-06 10:32:11.401 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] [instance: a5070ada-6b60-4992-a1bf-9e83aaccac93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 6 05:32:11 localhost nova_compute[237281]: 2025-12-06 10:32:11.401 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 6 05:32:11 localhost nova_compute[237281]: 2025-12-06 10:32:11.402 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 6 05:32:11 localhost nova_compute[237281]: 2025-12-06 10:32:11.402 237285 DEBUG nova.compute.manager [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 6 05:32:11 localhost sshd[267716]: main: sshd: ssh-rsa algorithm is disabled
Dec 6 05:32:11 localhost systemd-logind[760]: New session 51 of user zuul.
Dec 6 05:32:11 localhost systemd[1]: Started Session 51 of User zuul.
Dec 6 05:32:11 localhost nova_compute[237281]: 2025-12-06 10:32:11.902 237285 DEBUG oslo_service.periodic_task [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 6 05:32:11 localhost nova_compute[237281]: 2025-12-06 10:32:11.935 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 6 05:32:11 localhost nova_compute[237281]: 2025-12-06 10:32:11.935 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 6 05:32:11 localhost nova_compute[237281]: 2025-12-06 10:32:11.936 237285 DEBUG oslo_concurrency.lockutils [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 6 05:32:11 localhost nova_compute[237281]: 2025-12-06 10:32:11.936 237285 DEBUG nova.compute.resource_tracker [None req-dad6aa67-d7c3-4e02-bf94-ec2c5a752e89 - - - - - -] Auditing locally available compute resources for np0005548798.ooo.test (node: np0005548798.ooo.test) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861